
CloudUnflare - Reconnaissance Real IP Address For Cloudflare Bypass


Reconnaissance of the real IP address for a Cloudflare bypass.

Preparation:

1. CompleteDNS API
  • Create an account at completedns.com and verify it first.
  • Enter your email and password in the CompleteDNS_Login variable in cloudunflare.bash.

2. Dependencies Needed
  • curl
  • dig
  • whois
Debian Based
apt-get install curl dnsutils whois -y

Installation:
  • Clone this repo
git clone https://github.com/greycatz/CloudUnflare.git

Command:
  • Go to the CloudUnflare directory
cd CloudUnflare
  • Run
bash cloudunflare.bash

Thank You.
Regards,
@greycatz



XORpass - Encoder To Bypass WAF Filters Using XOR Operations


XORpass is an encoder to bypass WAF filters using XOR operations.

Installation & Usage
git clone https://github.com/devploit/XORpass
cd XORpass

$ php encode.php STRING
$ php decode.php "XORed STRING"

Example of bypass:
Using the clear PHP function (e.g. system('ls')):

Using the XOR bypass of that function:
$ php encode.php system # returns A
$ php encode.php ls # returns B

payload == A(B)


Why does PHP treat our payload as a string?
The ^ is the exclusive-or operator, which means that we are really working with binary values, so let's break down what happens.
The XOR of two bits returns 1 when exactly one of the bits is 1, and 0 otherwise (0^0 = 0, 0^1 = 1, 1^0 = 1, 1^1 = 0). When you XOR characters, you are using their ASCII values. These ASCII values are integers, so we need to convert them to binary to see what is actually going on.
A = 65 = 1000001
S = 83 = 1010011
B = 66 = 1000010

A 1000001
^
S 1010011
^
B 1000010
----------------
result 1010000 = 80 = P

A^S^B = P
If we do echo "A"^"S"^"B"; in PHP, it returns a P, as expected.
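The same arithmetic can be checked outside PHP. A minimal Python sketch of the character XOR above (an illustration only, not part of XORpass):
# XOR the ASCII values of 'A', 'S' and 'B' and print the resulting character.
result = 0
for c in ('A', 'S', 'B'):
    result ^= ord(c)                        # 65 ^ 83 ^ 66
print(bin(result), result, chr(result))     # 0b1010000 80 P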


Contact
Telegram: @devploit
Twitter: https://www.twitter.com/devploit


LinPwn - Interactive Post Exploitation Tool


LinPwn is an interactive tool created to assist you with post-exploitation enumeration and privilege escalation.

Connection
Set your IP and port you want it to connect to in the Connection class.
Place the LinPwn binary on the target machine.
Run nc -lvp PORT on your machine and then run LinPwn on the target machine to get a connection.

Usage
  1. shell - This command executes /bin/sh.
    Example usage: (LinPwn: Shell) > id
    Type exit to return to LinPwn.
  2. readfile - This command prints the contents of a file.
    Example usage: (LinPwn: ReadFile) > /etc/passwd
    Type exit to return to LinPwn.
  3. enumerate - This command runs LinEnum.sh.
    Example usage: (LinPwn) > enumerate
  4. download - This command downloads a file to the target machine.
    Example usage: (LinPwn: Download) > https://exampleurl.com/file_to_download
  5. wificredz - This command gathers saved wifi passwords.
  6. hashdump - This command gathers system password hashes.

Compiling
An optional build.sh file is included; run bash build.sh to view the build options if you wish to use them.


Pockint - A Portable OSINT Swiss Army Knife For DFIR/OSINT Professionals



POCKINT (a.k.a. Pocket Intelligence) is the OSINT swiss army knife for DFIR/OSINT professionals. Designed to be a lightweight and portable GUI program (to be carried within USBs or investigation VMs), it provides users with essential OSINT capabilities in a compact form factor: POCKINT's input box accepts typical indicators (URL, IP, MD5) and gives users the ability to perform basic OSINT data mining tasks in an iterable manner.
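As a rough illustration of the kind of transform POCKINT chains together (this is not POCKINT's own code), a minimal domain-to-IP step in Python could look like this:
import socket

def resolve(domain):
    # gethostbyname_ex returns (hostname, aliases, ip_list); keep the IPs.
    try:
        return socket.gethostbyname_ex(domain)[2]
    except socket.gaierror:
        return []

print(resolve("example.com"))   # e.g. ['93.184.216.34']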

Installation
You can grab the latest version from the releases page. POCKINT is provided as a single executable that can be stored and run anywhere on computers. POCKINT is available for Windows and Linux platforms.

Features
Why use it? POCKINT is designed to be simple, portable and powerful.
Simple: There's a plethora of awesome OSINT tools out there. Trouble is they either require analysts to be reasonably comfortable with the command line (think pOSINT) or give you way too many features (think Maltego). POCKINT focuses on simplicity: INPUT> RUN TRANSFORM> OUTPUT ... rinse and repeat. It's the ideal tool to get results quickly and easily through a simple interface.
Portable: Most tools either require installation, a license or configuration. POCKINT is ready to go whenever and wherever. Put it in your jump kit USB, investigation VM or laptop and it will just run.
Powerful: POCKINT combines cheap OSINT sources (whois/DNS) with the power of specialised APIs. From the get go you can use a suite of in-built transforms. Add in a couple of API keys and you can unlock even more specialised data mining capabilities.

Credits
Credit goes to the following people for their contributions to the project, either as providers of early feedback/ideas or for their awesome help in spreading the word:


ThreatIngestor - Extract And Aggregate Threat Intelligence

An extendable tool to extract and aggregate IOCs from threat feeds.
Integrates out-of-the-box with ThreatKB and MISP, and can fit seamlessly into any existing workflow with SQS, Beanstalk, and custom plugins.


Overview
ThreatIngestor can be configured to watch Twitter, RSS feeds, or other sources, extract meaningful information such as malicious IPs/domains and YARA signatures, and send that information to another system for analysis.
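To make "extract meaningful information" concrete, here is a generic Python sketch of pulling defanged IPs and domains out of feed text with regular expressions; it only illustrates the idea and is not ThreatIngestor's own extraction code:
import re

feed_text = "C2 seen at hxxp://evil[.]example[.]com and 192.168.13.37"

# Undo common defanging so indicators can be matched uniformly.
text = feed_text.replace("hxxp", "http").replace("[.]", ".")

ips = re.findall(r"\b(?:\d{1,3}\.){3}\d{1,3}\b", text)
domains = [d for d in re.findall(r"\b[a-z0-9-]+(?:\.[a-z0-9-]+)+\b", text) if d not in ips]
print("IPs:", ips)          # ['192.168.13.37']
print("Domains:", domains)  # ['evil.example.com']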


Try it out now with this quick walkthrough, read more ThreatIngestor walkthroughs on the InQuest blog, and check out labs.inquest.net/iocdb, an IOC aggregation and querying tool powered by ThreatIngestor.


Installation
ThreatIngestor requires Python 3.6+, with development headers.
Install ThreatIngestor from PyPI:
pip install threatingestor
Install optional dependencies for using some plugins, as needed:
pip install threatingestor[all]
View the full installation instructions for more information.


Usage
Create a new config.yml file, and configure each source and operator module you want to use. (See config.example.yml for layout.) Then run the script:
threatingestor config.yml
By default, it will run forever, polling each configured source every 15 minutes.
View the full ThreatIngestor documentation for more information.


Plugins
ThreatIngestor uses a plugin architecture with "source" (input) and "operator" (output) plugins. The currently supported integrations are:


Sources

Operators
View the full ThreatIngestor documentation for more information on included plugins, and how to create your own.


Threat Intel Sources
Looking for some threat intel sources to get started? InQuest has a Twitter List with several accounts that post C2 domains and IPs: https://twitter.com/InQuest/lists/ioc-feed. Note that you will need to apply for a Twitter developer account to use the ThreatIngestor Twitter Source. Take a look at config.example.yml to see how to set this list up as a source.
For quicker setup, RSS feeds can be a great source of intelligence. Check out this example RSS config file for a few pre-configured security blogs.


UBoat - HTTP Botnet Project


A POC HTTP botnet designed to replicate a full weaponised commercial botnet.

Disclaimer
This project should be used for authorized testing or educational purposes only.
The main objective behind creating this offensive project was to aid security researchers and to enhance the understanding of commercial HTTP loader style botnets. I hope this project helps to contribute to the malware research community and that people can develop efficient countermeasures :)
Usage of UBoat without prior mutual consent may be considered an illegal activity. It is the end user's responsibility to obey all applicable local, state and federal laws. The authors assume no liability and are not responsible for any misuse or damage caused by this program.

What is a Botnet ?
https://securityaffairs.co/wordpress/13747/cyber-crime/http-botnets-the-dark-side-of-an-standard-

Features
  • Coded in C++ with no dependencies
  • Encrypted C&C Communications
  • Persistence to prevent your control being lost
  • Connection Redundancy (uses a fallback server address or domain)
  • DDoS methods (TCP & UDP Flood)
  • Task Creation System (altering system HWID, Country, IP, OS.System)
  • Remote Commands
  • Update and Uninstall other malware
  • Download and Execute other malware
  • Active as well as Passive Keylogger
  • Enable Windows RDP
  • Plugin system for easy feature updates

Getting started?

TODO :-
  • Fix minor panel bugs
  • Make the authentication system more efficient

Project maintained by
Screens :




PESTO - PE (files) Statistical Tool


PESTO is a Python script that searches a whole directory for PE binaries (.EXE and .DLL), extracts some of their security characteristics or flags, and saves the results in a database.
It checks the architecture flag in the header, and the following security flags: ASLR, NO_SEH, DEP and CFG. The code is clear enough to adapt the flags and formats to your own needs.
More details and flag explanation in here: https://www.slideshare.net/elevenpaths/anlisis-del-nivel-proteccin-antiexploit-en-windows-10

Functionality
The script just needs a path and a tag. The program will go through the path and its subdirectories searching for .DLL and .EXE files and extracting the flags from the PE header (thanks to the pefile Python library).
The program requires a tag that will be used as a suffix for the log and database filenames, so different analyses can be done in the same directory.
The information provided by the script is:
  • Percentage of .DLL and .EXE files with i386, AMD64, IA64 or another architecture.
  • Percentage of ASLR, NO_SEH, DEP and CFG flags enabled or disabled in the headers.
  • After finishing the analysis it will prompt to export the results in SQL or CSV format.
It will also create a .db file, which is a SQLite database with the information collected.
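The core of such a flag check is only a few lines with pefile. A minimal sketch (not PESTO itself; the file path is a placeholder, and the bit masks are the standard PE DllCharacteristics values):
import pefile

# Standard IMAGE_DLLCHARACTERISTICS_* bit masks and machine types from the PE format.
FLAGS = {"ASLR": 0x0040, "DEP": 0x0100, "NO_SEH": 0x0400, "CFG": 0x4000}
ARCH = {0x014C: "i386", 0x8664: "AMD64", 0x0200: "IA64"}

pe = pefile.PE("sample.exe", fast_load=True)
print("arch:", ARCH.get(pe.FILE_HEADER.Machine, "other"))
dllchars = pe.OPTIONAL_HEADER.DllCharacteristics
for name, mask in FLAGS.items():
    print(name, "enabled" if dllchars & mask else "disabled")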

Related tools


AtomShields Cli - Security Testing Framework For Repositories And Source Code


AtomShields Cli is a command-line interface for the AtomShields software.

Installation
pip install atomshieldscli

Basic usage
ascli <action> <context> --target <path> --name <project_name>
The allowed action values are:
  • install: To install a checker or a report, depending on the context set.
  • uninstall: To uninstall a checker or a report, depending on the context set.
  • run: To run the scan.
  • show: To show the checker list or the report list, depending on the context set.
  • help: To show the help.
The allowed context values are:
  • checkers: Operate with checkers
  • reports: Operate with reports
The target option sets the path to scan, or the plugin (checker/report) to install/uninstall.

Show all checkers
ascli show checkers


Show all reports
ascli show reports


Install checker
ascli install checkers --target path/to/file.py


Install report
ascli install reports --target path/to/file.py


Uninstall checker
ascli uninstall checkers --target path/to/file.py
or
ascli uninstall checkers --target checker_name


Uninstall report
ascli uninstall reports --target path/to/file.py
or
ascli uninstall reports --target report_name


Run the scan
ascli run --target path/to/file.py --name repo_name



Virtuailor - IDAPython Tool For Creating Automatic C++ Virtual Tables In IDA Pro

Virtuailor is an IDAPython tool that reconstructs vtables for C++ code targeting the Intel architecture, both 32-bit and 64-bit, and AArch64 (new!). The tool consists of two parts, static and dynamic.
The static part has the following capabilities:
  • Detects indirect calls.
  • Hooks the value assignment of the indirect calls using conditional breakpoints (the hook code).
The dynamic part has the following capabilities:
  • Creates vtable structures.
  • Renames functions and vtable addresses.
  • Adds structure offsets to the assembly indirect calls.
  • Adds xrefs from indirect calls to their virtual functions (multiple xrefs).
  • For AArch64, tries to fix undefined vtables and related virtual functions (support for firmware).

How to Use?
  1. By default Virtuailor will look for virtual calls at ALL addresses in the code. If you want to limit it to a specific address range, no problem, just edit the Main file and set the range you want to target in the variables start_addr_range and end_addr_range:
if __name__ == '__main__':
    start_addr_range = idc.MinEA()  # You can change the virtual call address range
    end_addr_range = idc.MaxEA()
    add_bp_to_virtual_calls(start_addr_range, end_addr_range)
  2. Optional (but strongly recommended): create a snapshot of your idb. Just press Ctrl+Shift+T and create a snapshot.
  3. Press File->Run script..., then go to the Virtuailor folder and choose to run Main.py. You can see the following gif for a clearer, more visual explanation.

Now the GUI will give you the option to choose a range to target; if you would like to target the whole binary, just press OK with the default values for the start and end addresses.
Afterwards the breakpoints will be placed in your code, and all you have to do is execute your code with the IDA debugger, do whatever actions you want, and watch how the vtables are built! For AArch64 you can set up a remote gdb server and debug using the IDA debugger.
If you don't want or need the breakpoints anymore, just go to the breakpoint list tab in IDA and delete the breakpoints as you like.
It is also really important for me to note that this is the second version of the tool, with 32-bit, 64-bit and AArch64 support; in some cases a small number of breakpoints may be missed. In those cases please open an issue and contact me so I can improve the code and help fix it. Thank you in advance for that :)

Output and General Functions

vtables structures
These are the structures Virtuailor creates from the vtables used in the virtual calls that were hit. The vtable functions are extracted from memory based on the relevant register that was used at the BP opcode.


Since I wanted to create a correlation between the structure in IDA and the vtables in the data section, the BP changes the vtable address name in the data section to the name of the structure. As you can see in the following picture:


The virtual function names are also changed, except in situations where the names are not the default IDA names (functions with symbols, or functions the user renamed); in those cases the function names stay the same and are added to the vtable structure with their current name.
The chosen names are constructed using the following pattern:
  • vtable_
  • vfunc_
The rest of the name is the offset from the beginning of the segment, mostly because most binaries nowadays are PIE/PIC and thus ASLR is enforced (and because full address names are quite long in 64-bit environments). The vtable structure also has a comment, "Was called from offset: XXXX"; this offset is again measured from the beginning of the segment.

Adding Structures to the Assembly
After creating the vtable Virtuailor also adds a connection between the structure created and the assembly as you can see in the following images:


P.S.: The structure offset used in the BP is only relevant for the last call that was made; to get a better understanding of all the virtual calls that were made, the xref feature was added, as explained in the next section.

Xref to virtual functions
When reversing C++ statically it is not trivial to see who called whom, because most calls are indirect; however, after running Virtuailor every function that was called indirectly has an xref to the locations it was called from.
The following gif shows the added Xrefs with their indirect function call:


Former talks and lectures
The tool was presented at REcon Brussels, Troopers and Warcon. The presentation can be found at the following link: https://www.youtube.com/watch?v=Xk75TM7NmtA

Thanks
REcon Brussels, Troopers, Warcon crews, Nana, @tmr232, @matalaz, @oryandp, @talkain, @shiftreduce


Gosec - Golang Security Checker


Inspects source code for security problems by scanning the Go AST.

Install

CI Installation
# binary will be $GOPATH/bin/gosec
curl -sfL https://raw.githubusercontent.com/securego/gosec/master/install.sh | sh -s -- -b $GOPATH/bin vX.Y.Z

# or install it into ./bin/
curl -sfL https://raw.githubusercontent.com/securego/gosec/master/install.sh | sh -s vX.Y.Z

# In alpine linux (as it does not come with curl by default)
wget -O - -q https://raw.githubusercontent.com/securego/gosec/master/install.sh | sh -s vX.Y.Z

# If you want to use the checksums provided on the "Releases" page
# then you will have to download a tar.gz file for your operating system instead of a binary file
wget https://github.com/securego/gosec/releases/download/vX.Y.Z/gosec_vX.Y.Z_OS.tar.gz

# The file will be in the current folder where you run the command
# and you can check the checksum like this
echo "<check sum from the check sum file> gosec_vX.Y.Z_OS.tar.gz" | sha256sum -c -

gosec --help

Local Installation
go get github.com/securego/gosec/cmd/gosec

Usage
Gosec can be configured to only run a subset of rules, to exclude certain file paths, and produce reports in different formats. By default all rules will be run against the supplied input files. To recursively scan from the current directory you can supply './...' as the input argument.
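For example, to scan the current module recursively with all rules enabled:
$ gosec ./...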

Available rules
  • G101: Look for hard coded credentials
  • G102: Bind to all interfaces
  • G103: Audit the use of unsafe block
  • G104: Audit errors not checked
  • G106: Audit the use of ssh.InsecureIgnoreHostKey
  • G107: Url provided to HTTP request as taint input
  • G108: Profiling endpoint automatically exposed on /debug/pprof
  • G201: SQL query construction using format string
  • G202: SQL query construction using string concatenation
  • G203: Use of unescaped data in HTML templates
  • G204: Audit use of command execution
  • G301: Poor file permissions used when creating a directory
  • G302: Poor file permissions used with chmod
  • G303: Creating tempfile using a predictable path
  • G304: File path provided as taint input
  • G305: File traversal when extracting zip archive
  • G401: Detect the usage of DES, RC4, MD5 or SHA1
  • G402: Look for bad TLS connection settings
  • G403: Ensure minimum RSA key length of 2048 bits
  • G404: Insecure random number source (rand)
  • G501: Import blacklist: crypto/md5
  • G502: Import blacklist: crypto/des
  • G503: Import blacklist: crypto/rc4
  • G504: Import blacklist: net/http/cgi
  • G505: Import blacklist: crypto/sha1

Retired rules

Selecting rules
By default gosec will run all rules against the supplied file paths. It is however possible to select a subset of rules to run via the '-include=' flag, or to specify a set of rules to explicitly exclude using the '-exclude=' flag.
# Run a specific set of rules
$ gosec -include=G101,G203,G401 ./...

# Run everything except for rule G303
$ gosec -exclude=G303 ./...

Configuration
A number of global settings can be provided in a configuration file as follows:
{
  "global": {
    "nosec": "enabled",
    "audit": "enabled"
  }
}
  • nosec: this setting will overwrite all #nosec directives defined throughout the code base
  • audit: runs in audit mode, which enables additional checks that might be too nosy for normal code analysis
# Run with a global configuration file
$ gosec -conf config.json .
Some rules also accept configuration. For instance, for rule G104 it is possible to define packages along with a list of functions which will be skipped when auditing unchecked errors:
{
  "G104": {
    "io/ioutil": ["WriteFile"]
  }
}

Dependencies
gosec will automatically fetch the dependencies of the code being analyzed when Go modules are turned on (e.g. GO111MODULE=on). If this is not the case, the dependencies need to be explicitly downloaded by running go get -d before the scan.

Excluding test files and folders
gosec will ignore test files across all packages and any dependencies in your vendor directory.
The scanning of test files can be enabled with the following flag:
gosec -tests ./...
Also additional folders can be excluded as follows:
 gosec -exclude-dir=rules -exclude-dir=cmd ./...

Annotating code
As with all automated detection tools there will be cases of false positives. In cases where gosec reports a failure that has been manually verified as being safe it is possible to annotate the code with a '#nosec' comment.
The annotation causes gosec to stop processing any further nodes within the AST so can apply to a whole block or more granularly to a single expression.
import "md5" // #nosec


func main(){

/* #nosec */
if x > y {
h := md5.New() // this will also be ignored
}

}
When a specific false positive has been identified and verified as safe, you may wish to suppress only that single rule (or a specific set of rules) within a section of code, while continuing to scan for other problems. To do this, you can list the rule(s) to be suppressed within the #nosec annotation, e.g: /* #nosec G401 */ or // #nosec G201 G202 G203
In some cases you may also want to revisit places where #nosec annotations have been used. To run the scanner and ignore any #nosec annotations you can do the following:
gosec -nosec=true ./...

Build tags
gosec is able to pass your Go build tags to the analyzer. They can be provided as a comma separated list as follows:
gosec -tag debug,ignore ./...

Output formats
gosec currently supports text, json, yaml, csv, sonarqube and JUnit XML output formats. By default results will be reported to stdout, but can also be written to an output file. The output format is controlled by the '-fmt' flag, and the output file is controlled by the '-out' flag as follows:
# Write output in json format to results.json
$ gosec -fmt=json -out=results.json *.go

Development

Build
make

Tests
make test

Release Build
Make sure you have installed the goreleaser tool and then you can release gosec as follows:
git tag v1.0.0
export GITHUB_TOKEN=<YOUR GITHUB TOKEN>
make release
The released version of the tool is available in the dist folder. The build information should be displayed in the usage text.
./dist/darwin_amd64/gosec -h
gosec - Golang security checker

gosec analyzes Go source code to look for common programming mistakes that
can lead to security problems.

VERSION: 1.0.0
GIT TAG: v1.0.0
BUILD DATE: 2018-04-27T12:41:38Z
Note that all released archives are also uploaded to GitHub.

Docker image
You can build the docker image as follows:
make image
You can run the gosec tool in a container against your local Go project. You just have to mount the project into a volume as follows:
docker run -it -v <YOUR PROJECT PATH>/<PROJECT>:/<PROJECT> securego/gosec /<PROJECT>/...

Generate TLS rule
The configuration of TLS rule can be generated from Mozilla's TLS ciphers recommendation.
First you need to install the generator tool:
go get github.com/securego/gosec/cmd/tlsconfig/...
Now you can invoke go generate in the root of the project:
go generate ./...
This will generate the rules/tls_config.go file, which will contain the current cipher recommendations from Mozilla.


Dr. Memory - Memory Debugger For Windows, Linux, Mac, And Android

Dr. Memory is a memory monitoring tool capable of identifying memory-related programming errors such as accesses of uninitialized memory, accesses to unaddressable memory (including outside of allocated heap units and heap underflow and overflow), accesses to freed memory, double frees, memory leaks, and (on Windows) handle leaks, GDI API usage errors, and accesses to un-reserved thread local storage slots.
Dr. Memory operates on unmodified application binaries running on Windows, Linux, Mac, or Android on commodity IA-32, AMD64, and ARM hardware.
Dr. Memory is released under an LGPL license and binary packages are available for download.
Dr. Memory is built on the DynamoRIO dynamic instrumentation tool platform.
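Typical usage is to prefix the target command with drmemory -- (the application name below is just a placeholder):
drmemory -- ./myapp app_args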

Dr. Memory Performance
Dr. Memory is faster than comparable tools, including Valgrind, as shown in our CGO 2011 paper Practical Memory Checking with Dr. Memory, where we compare the two tools on Linux on the SPEC CPU2006 benchmark suite:


(Valgrind is unable to run 434.zeusmp and 447.dealII).

Documentation
Documentation is included in the release package. We also maintain a copy for online browsing.

System call tracer for Windows
The Dr. Memory package includes an "strace for Windows" tool called drstrace.

Obtaining help
Dr. Memory has its own discussion list.
To report a bug, use the issue tracker.
See also the Dr. Memory home page: http://drmemory.org/


Fail2Ban - Daemon To Ban Hosts That Cause Multiple Authentication Errors


Fail2Ban scans log files like /var/log/auth.log and bans IP addresses conducting too many failed login attempts. It does this by updating system firewall rules to reject new connections from those IP addresses, for a configurable amount of time. Fail2Ban comes out-of-the-box ready to read many standard log files, such as those for sshd and Apache, and is easily configured to read any log file of your choosing, for any error you wish.
Though Fail2Ban is able to reduce the rate of incorrect authentication attempts, it cannot eliminate the risk presented by weak authentication. If you really want to protect services, set them up to use only two-factor or public/private key authentication mechanisms.

More documentation, FAQs, and HOWTOs can be found in the fail2ban(1) manpage, the Wiki, the developer documentation and on the website: https://www.fail2ban.org

Installation:
It is possible that Fail2Ban is already packaged for your distribution. In this case, you should use that instead.
Required:
Optional:
To install:
tar xvfj fail2ban-0.11.0.tar.bz2
cd fail2ban-0.11.0
sudo python setup.py install
Alternatively, you can clone the source from GitHub to a directory of your choice and do the install from there. Pick the correct branch, for example 0.11:
git clone https://github.com/fail2ban/fail2ban.git
cd fail2ban
sudo python setup.py install
This will install Fail2Ban into the python library directory. The executable scripts are placed into /usr/bin, and configuration in /etc/fail2ban.
Fail2Ban should be correctly installed now. Just type:
fail2ban-client -h
to see if everything is alright. You should always use fail2ban-client and never call fail2ban-server directly. You can verify that you have the correct version installed with
fail2ban-client version
Please note that the system init/service script is not automatically installed. To enable fail2ban as an automatic service, simply copy the script for your distro from the files directory to /etc/init.d. Example (on a Debian-based system):
cp files/debian-initd /etc/init.d/fail2ban
update-rc.d fail2ban defaults
service fail2ban start

Configuration:
You can configure Fail2Ban using the files in /etc/fail2ban. It is possible to configure the server using commands sent to it by fail2ban-client. The available commands are described in the fail2ban-client(1) manpage. Also see fail2ban(1) and jail.conf(5) manpages for further references.
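For example, a minimal /etc/fail2ban/jail.local override that tightens the sshd jail might look like this (the values are illustrative only):
[sshd]
enabled  = true
maxretry = 5
findtime = 600
bantime  = 3600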


Uptux - Linux Privilege Escalation Checks (Systemd, Dbus, Socket Fun, Etc)


Specialized privilege escalation checks for Linux systems.
Implemented so far:
  • Writable systemd paths, services, timers, and socket units
  • Disassembles systemd unit files looking for:
    • References to executables that are writable
    • References to broken symlinks pointing to writeable directories
    • Relative path statements
    • Unix socket files that are writeable (sneaky APIs)
  • Writable D-Bus paths
  • Overly permissive D-Bus service settings
  • HTTP APIs running as root and responding on file-bound unix domain sockets
These checks are based on things I encounter during my own research, and this tool is certainly not inclusive of everything you should be looking at. Don't skip the classics!
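As a rough illustration of the first class of check listed above (not uptux's actual code), finding systemd unit files the current user can write to can be as simple as:
import os

# Common systemd unit directories; a writable unit here usually means root code execution.
UNIT_DIRS = ["/etc/systemd/system", "/usr/lib/systemd/system", "/lib/systemd/system", "/run/systemd/system"]

for d in UNIT_DIRS:
    if not os.path.isdir(d):
        continue
    for root, _, files in os.walk(d):
        for name in files:
            path = os.path.join(root, name)
            if os.access(path, os.W_OK):   # writable by the current user
                print("[!] writable unit file:", path)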

Usage
All functionality is contained in a single file, because installing packages in restricted shells is a pain. Python2 compatibility will be maintained for those crap old boxes we get stuck with. However, as the checks are really aimed at more modern user-space stuff, it is unlikely to uncover anything interesting on an old box anyway.
There is nothing to install, just grab the script and run it.
usage: uptux.py [-h] [-n] [-d]

PrivEsc for modern Linux systems, by initstring (github.com/initstring)

optional arguments:
-h, --help show this help message and exit
-n, --nologging do not write the output to a logfile
-d, --debug print some extra debugging info to the console

Testing
For testing purposes, you can run the tests/r00tme.sh script, which will create many vulnerable configuration issues on your system that uptux can identify. Running tests/unr00tme.sh will undo these changes, but don't hold me to it. Needless to say, this is dangerous.
Use a VM for testing this way.


ezXSS - An Easy Way For Penetration Testers And Bug Bounty Hunters To Test (Blind) Cross Site Scripting


ezXSS is an easy way for penetration testers and bug bounty hunters to test (blind) Cross Site Scripting.

Current features
Some features ezXSS has
  • Easy to use dashboard with statistics, payloads, view/share/search reports and more
  • Payload generator
  • Instant email alert on payload
  • Custom javascript payload
  • Enable/Disable screenshots
  • Prevent double payloads from saving or alerting
  • Block domains
  • Share reports with a direct link or with other ezXSS users
  • Easily manage and view reports in the dashboard
  • Secure your login with extra protection (2FA)
  • The following information is collected on a vulnerable page:
    • The URL of the page
    • IP Address
    • Any page referer (or share referer)
    • The User-Agent
    • All Non-HTTP-Only Cookies
    • All Local Storage
    • All Session Storage
    • Full HTML DOM source of the page
    • Page origin
    • Time of execution
    • Screenshot of the page
  • it's just ez :-)

Required
  • A host with PHP 7.1 or up
  • A domain name (consider a short one)
  • An SSL certificate if you want to test on HTTPS websites (consider Cloudflare or Let's Encrypt for a free one)

Installation
ezXSS is ez to install
  • Clone the repository and put the files in the document root
  • Create an empty database and provide your database information in 'src/Database.php'
  • Visit /manage/install in your browser and setup a password and email
  • Done! That was ez right?

Demo
For a demo visit demo.ezxss.com/manage with password demo1234. Please note that some features might be disabled in the demo version.

Screenshots







Mallory - HTTP/HTTPS Proxy Over SSH


HTTP/HTTPS proxy over SSH.

Installation
  • Local machine: go get github.com/justmao945/mallory/cmd/mallory
  • Remote server: need our old friend sshd

Configuration

Config file
The default path is $HOME/.config/mallory.json; it can be set when starting the program:
mallory -config path/to/config.json
Content:
  • id_rsa is the path to our private key file, which can be generated by ssh-keygen
  • local_smart is the local address serving the HTTP proxy with smart detection of the destination host
  • local_normal is similar to local_smart but sends all traffic through the remote SSH server without destination host detection
  • remote is the remote address of the SSH server
  • blocked is a list of domains that need to use the proxy; any other domain is connected to directly
{
  "id_rsa": "$HOME/.ssh/id_rsa",
  "local_smart": ":1315",
  "local_normal": ":1316",
  "remote": "ssh://user@vm.me:22",
  "blocked": [
    "angularjs.org",
    "golang.org",
    "google.com",
    "google.co.jp",
    "googleapis.com",
    "googleusercontent.com",
    "google-analytics.com",
    "gstatic.com",
    "twitter.com",
    "youtube.com"
  ]
}
The blocked list in the config file is reloaded automatically when it is updated, and you can also trigger a reload manually:
# send signal to reload
kill -USR2 <pid of mallory>

# or use reload command by sending http request
mallory -reload

System config
  • Set both the HTTP and HTTPS proxy to localhost with port 1315 to use the block list
  • Set the env vars http_proxy and https_proxy to localhost:1316 for terminal usage
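For example, for a terminal session (a minimal illustration):
export http_proxy=http://localhost:1316
export https_proxy=http://localhost:1316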

Get the right suffix name for a domain
mallory -suffix www.google.com

A simple command to forward all traffic for the given port
# install it: go get github.com/justmao945/mallory/cmd/forward

# all traffic through port 20022 will be forwarded to destination.com:22
forward -network tcp -listen :20022 -forward destination.com:22

# you can ssh to destination:22 through localhost:20022
ssh root@localhost -p 20022

TODO
  • return http error when unable to dial
  • add host to list automatically when unable to dial
  • support multiple remote servers



Trivy - A Simple And Comprehensive Vulnerability Scanner For Containers, Suitable For CI

A Simple and Comprehensive Vulnerability Scanner for Containers, Suitable for CI.




Abstract
Trivy (tri pronounced like trigger, vy pronounced like envy) is a simple and comprehensive vulnerability scanner for containers. A software vulnerability is a glitch, flaw, or weakness present in the software or in an operating system. Trivy detects vulnerabilities of OS packages (Alpine, RHEL, CentOS, etc.) and application dependencies (Bundler, Composer, npm, yarn, etc.). Trivy is easy to use: just install the binary and you're ready to scan. All you need to do to scan is specify the name of a container image.
It is designed to be used in CI. Before pushing to a container registry, you can scan your local container image easily. See here for details.
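A typical CI gate scans the freshly built image and fails the job when severe issues are found; for example (the image name is a placeholder, and --exit-code/--severity are existing Trivy flags):
$ trivy --exit-code 1 --severity HIGH,CRITICAL my-registry/my-app:latest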

Features
  • Detect comprehensive vulnerabilities
    • OS packages (Alpine, Red Hat Universal Base Image, Red Hat Enterprise Linux, CentOS, Debian and Ubuntu)
    • Application dependencies (Bundler, Composer, Pipenv, Poetry, npm, yarn and Cargo)
  • Simple
  • Easy installation
    • apt-get install, yum install and brew install are possible (See Installation)
    • No prerequisites such as a DB, libraries, etc. are needed (the exception is that you need rpm installed to scan images based on RHEL/CentOS; this is automatically included if you use our installers or the Trivy container image. See Vulnerability Detection for background information.)
  • High accuracy
    • Especially for Alpine Linux and RHEL/CentOS
    • Accuracy for other OSes is also high
  • DevSecOps
    • Suitable for CI such as Travis CI, CircleCI, Jenkins, etc.
    • See CI Example

Installation

RHEL/CentOS
Add repository setting to /etc/yum.repos.d.
$ sudo vim /etc/yum.repos.d/trivy.repo
[trivy]
name=Trivy repository
baseurl=https://aquasecurity.github.io/trivy-repo/rpm/releases/$releasever/$basearch/
gpgcheck=0
enabled=1
$ sudo yum -y update
$ sudo yum -y install trivy
or
$ rpm -ivh https://github.com/aquasecurity/trivy/releases/download/v0.1.6/trivy_0.1.6_Linux-64bit.rpm

Debian/Ubuntu
Add repository to /etc/apt/sources.list.d.
 $ sudo apt-get install wget apt-transport-https gnupg lsb-release
$ wget -qO - https://aquasecurity.github.io/trivy-repo/deb/public.key | sudo apt-key add -
$ echo deb https://aquasecurity.github.io/trivy-repo/deb $(lsb_release -sc) main | sudo tee -a /etc/apt/sources.list.d/trivy.list
$ sudo apt-get update
$ sudo apt-get install trivy
or
$ sudo apt-get install rpm
$ wget https://github.com/aquasecurity/trivy/releases/download/v0.1.6/trivy_0.1.6_Linux-64bit.deb
$ sudo dpkg -i trivy_0.1.6_Linux-64bit.deb

Arch Linux
Package trivy-bin can be installed from the Arch User Repository. Examples:
pikaur -Sy trivy-bin
or
yay -Sy trivy-bin

Homebrew
You can use homebrew on macOS.
$ brew install aquasecurity/trivy/trivy

Binary (Including Windows)
Get the latest version from this page, and download the archive file for your operating system/architecture. Unpack the archive, and put the binary somewhere in your $PATH (on UNIX-y systems, /usr/local/bin or the like). Make sure it has execution bits turned on.
You also need to install rpm command for scanning images based on RHEL/CentOS.

From source
$ mkdir -p $GOPATH/src/github.com/aquasecurity
$ cd $GOPATH/src/github.com/aquasecurity
$ git clone https://github.com/aquasecurity/trivy
$ cd trivy/cmd/trivy/
$ export GO111MODULE=on
$ go install
You also need to install rpm command for scanning images based on RHEL/CentOS.

Quick Start
Simply specify an image name (and a tag). The latest tag should be avoided, as problems occur with the cache. See Clear image caches.

Basic
$ trivy [YOUR_IMAGE_NAME]
For example:
$ trivy python:3.4-alpine


Result
2019-05-16T01:20:43.180+0900    INFO    Updating vulnerability database...
2019-05-16T01:20:53.029+0900 INFO Detecting Alpine vulnerabilities...

python:3.4-alpine3.9 (alpine 3.9.2)
===================================
Total: 1 (UNKNOWN: 0, LOW: 0, MEDIUM: 1, HIGH: 0, CRITICAL: 0)

+---------+------------------+----------+-------------------+---------------+--------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+---------+------------------+----------+-------------------+---------------+--------------------------------+
| openssl | CVE-2019-1543 | MEDIUM | 1.1.1a-r1 | 1.1.1b-r1 | openssl: ChaCha20-Poly1305 |
| | | | | | with long nonces |
+---------+------------------+----------+-------------------+---------------+--------------------------------+

Docker
Replace [YOUR_CACHE_DIR] with the cache directory on your machine.
$ docker run --rm -v [YOUR_CACHE_DIR]:/root/.cache/ aquasec/trivy [YOUR_IMAGE_NAME]
Example for macOS:
$ docker run --rm -v $HOME/Library/Caches:/root/.cache/ aquasec/trivy python:3.4-alpine
If you would like to scan the image on your host machine, you need to mount docker.sock.
$ docker run --rm -v /var/run/docker.sock:/var/run/docker.sock \
-v $HOME/Library/Caches:/root/.cache/ aquasec/trivy python:3.4-alpine
Please re-pull the latest aquasec/trivy if an error occurs.


Result
2019-05-16T01:20:43.180+0900    INFO    Updating vulnerability database...
2019-05-16T01:20:53.029+0900 INFO Detecting Alpine vulnerabilities...

python:3.4-alpine3.9 (alpine 3.9.2)
===================================
Total: 1 (UNKNOWN: 0, LOW: 0, MEDIUM: 1, HIGH: 0, CRITICAL: 0)

+---------+------------------+----------+-------------------+---------------+--------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+---------+------------------+----------+-------------------+---------------+--------------------------------+
| openssl | CVE-2019-1543 | MEDIUM | 1.1.1a-r1 | 1.1.1b-r1 | openssl: ChaCha20-Poly1305 |
| | | | | | with long nonces |
+---------+------------------+----------+-------------------+---------------+--------------------------------+

Examples

Scan an image
Simply specify an image name (and a tag).
$ trivy knqyf263/vuln-image:1.2.3


Result
2019-05-16T12:58:55.967+0900    INFO    Updating vulnerability database...
2019-05-16T12:59:03.150+0900 INFO Detecting Alpine vulnerabilities...
2019-05-16T12:59:03.156+0900 INFO Updating bundler Security DB...
2019-05-16T12:59:04.941+0900 INFO Detecting bundler vulnerabilities...
2019-05-16T12:59:04.942+0900 INFO Updating cargo Security DB...
2019-05-16T12:59:05.967+0900 INFO Detecting cargo vulnerabilities...
2019-05-16T12:59:05.967+0900 INFO Updating composer Security DB...
2019-05-16T12:59:07.834+0900 INFO Detecting composer vulnerabilities...
2019-05-16T12:59:07.834+0900 INFO Updating npm Security DB...
2019-05-16T12:59:10.285+0900 INFO Detecting npm vulnerabilities...
2019-05-16T12:59:10.285+0900 INFO Updating pipenv Security DB...
2019-05-16T12:59:11.487+0900 INFO Detecting pipenv vulnerabilities...

knqyf263/vuln-image:1.2.3 (alpine 3.7.1)
========================================
Total: 26 (UNKNOWN: 0, LOW: 3, MEDIUM: 16, HIGH: 5, CRITICAL: 2)

+---------+------------------+----------+-------------------+---------------+----------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+---------+------------------+----------+-------------------+---------------+----------------------------------+
| curl | CVE-2018-14618 | CRITICAL | 7.61.0-r0 | 7.61.1-r0 | curl: NTLM password overflow |
| | | | | | via integer overflow |
+ +------------------+----------+ +---------------+----------------------------------+
| | CVE-2018-16839 | HIGH | | 7.61.1-r1 | curl: Integer overflow leading |
| | | | | | to heap-based buffer overflow in |
| | | | | | Curl_sasl_create_plain_message() |
+ +------------------+ + +---------------+----------------------------------+
| | CVE-2019-3822 | | | 7.61.1-r2 | curl: NTLMv2 type-3 header |
| | | | | | stack buffer overflow |
+ +------------------+ + +---------------+----------------------------------+
| | CVE-2018-16840 | | | 7.61.1-r1 | curl: Use-after-free when |
| | | | | | closing "easy" handle in |
| | | | | | Curl_close() |
+ +------------------+----------+ + +----------------------------------+
| | CVE-2018-16842 | MEDIUM | | | curl: Heap-based buffer |
| | | | | | over-read in the curl tool |
| | | | | | warning formatting |
+ +------------------+ + +---------------+----------------------------------+
| | CVE-2018-16890 | | | 7.61.1-r2 | curl: NTLM type-2 heap |
| | | | | | out-of-bounds buffer read |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-3823 | | | | curl: SMTP end-of-response |
| | | | | | out-of-bounds read |
+---------+------------------+----------+-------------------+---------------+----------------------------------+
| git | CVE-2018-17456 | HIGH | 2.15.2-r0 | 2.15.3-r0 | git: arbitrary code execution |
| | | | | | via .gitmodules |
+ +------------------+ + + +----------------------------------+
| | CVE-2018-19486 | | | | git: Improper handling of |
| | | | | | PATH allows for commands to be |
| | | | | | executed from... |
+---------+------------------+----------+-------------------+---------------+----------------------------------+
| libssh2 | CVE-2019-3855 | CRITICAL | 1.8.0-r2 | 1.8.1-r0 | libssh2: Integer overflow in |
| | | | | | transport read resulting in |
| | | | | | out of bounds write... |
+ +------------------+----------+ + +----------------------------------+
| | CVE-2019-3859 | MEDIUM | | | libssh2: Unchecked use of |
| | | | | | _libssh2_packet_require and |
| | | | | | _libssh2_packet_requirev |
| | | | | | resulting in out-of-bounds |
| | | | | | read |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-3858 | | | | libssh2: Zero-byte allocation |
| | | | | | with a specially crafted SFTP |
| | | | | | packed leading to an... |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-3863 | | | | libssh2: Integer overflow |
| | | | | | in user authenticate |
| | | | | | keyboard interactive allows |
| | | | | | out-of-bounds writes |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-3862 | | | | libssh2: Out-of-bounds memory |
| | | | | | comparison with specially |
| | | | | | crafted message channel |
| | | | | | request |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-3860 | | | | libssh2: Out-of-bounds reads |
| | | | | | with specially crafted SFTP |
| | | | | | packets |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-3857 | | | | libssh2: Integer overflow in |
| | | | | | SSH packet processing channel |
| | | | | | resulting in out of... |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-3861 | | | | libssh2: Out-of-bounds reads |
| | | | | | with specially crafted SSH |
| | | | | | packets |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-3856 | | | | libssh2: Integer overflow in |
| | | | | | keyboard interactive handling |
| | | | | | resulting in out of bounds... |
+---------+------------------+ +-------------------+---------------+----------------------------------+
| libxml2 | CVE-2018-14567 | | 2.9.7-r0 | 2.9.8-r1 | libxml2: Infinite loop when |
| | | | | | --with-lzma is used allows for |
| | | | | | denial of service... |
+ +------------------+ + + +----------------------------------+
| | CVE-2018-14404 | | | | libxml2: NULL pointer |
| | | | | | dereference in |
| | | | | | xpath.c:xmlXPathCompOpEval() |
| | | | | | can allow attackers to cause |
| | | | | | a... |
+ +------------------+----------+ + +----------------------------------+
| | CVE-2018-9251 | LOW | | | libxml2: infinite loop in |
| | | | | | xz_decomp function in xzlib.c |
+---------+------------------+----------+-------------------+---------------+----------------------------------+
| openssh | CVE-2019-6109 | MEDIUM | 7.5_p1-r9 | 7.5_p1-r10 | openssh: Missing character |
| | | | | | encoding in progress display |
| | | | | | allows for spoofing of scp... |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-6111 | | | | openssh: Improper validation |
| | | | | | of object names allows |
| | | | | | malicious server to overwrite |
| | | | | | files... |
+ +------------------+----------+ + +----------------------------------+
| | CVE-2018-20685 | LOW | | | openssh: scp client improper |
| | | | | | directory name validation |
+---------+------------------+----------+-------------------+---------------+----------------------------------+
| sqlite | CVE-2018-20346 | MEDIUM | 3.21.0-r1 | 3.25.3-r0 | sqlite: Multiple flaws in |
| | | | | | sqlite which can be triggered |
| | | | | | via corrupted internal... |
+---------+------------------+----------+-------------------+---------------+----------------------------------+
| tar | CVE-2018-20482 | LOW | 1.29-r1 | 1.31-r0 | tar: Infinite read loop in |
| | | | | | sparse_dump_region function in |
| | | | | | sparse.c |
+---------+------------------+----------+-------------------+---------------+----------------------------------+

ruby-app/Gemfile.lock
=====================
Total: 1 (UNKNOWN: 0, LOW: 0, MEDIUM: 1, HIGH: 0, CRITICAL: 0)

+----------------------+------------------+----------+-------------------+---------------+--------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+----------------------+------------------+----------+-------------------+---------------+--------------------------------+
| rails-html-sanitizer | CVE-2018-3741 | MEDIUM | 1.0.3 | >= 1.0.4 | rubygem-rails-html-sanitizer: |
| | | | | | non-whitelisted attributes |
| | | | | | are present in sanitized |
| | | | | | output when input with |
| | | | | | specially-crafted... |
+----------------------+------------------+----------+-------------------+---------------+--------------------------------+

rust-app/Cargo.lock
===================
Total: 3 (UNKNOWN: 3, LOW: 0, MEDIUM: 0, HIGH: 0, CRITICAL: 0)

+---------+-------------------+----------+-------------------+---------------+--------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+---------+-------------------+----------+-------------------+---------------+--------------------------------+
| ammonia | RUSTSEC-2019-0001 | UNKNOWN | 1.9.0 | >= 2.1.0 | Uncontrolled recursion leads |
| | | | | | to abort in HTML serialization |
+---------+-------------------+ +-------------------+---------------+--------------------------------+
| openssl | RUSTSEC-2016-0001 | | 0.8.3 | >= 0.9.0 | SSL/TLS MitM vulnerability due |
| | | | | | to insecure defaults |
+ +-------------------+ + +---------------+--------------------------------+
| | RUSTSEC-2018-0010 | | | >= 0.10.9 | Use after free in CMS Signing |
+---------+-------------------+----------+-------------------+---------------+--------------------------------+

php-app/composer.lock
=====================
Total: 1 (UNKNOWN: 0, LOW: 0, MEDIUM: 1, HIGH: 0, CRITICAL: 0)

+-------------------+------------------+----------+-------------------+---------------------+--------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+-------------------+------------------+----------+-------------------+---------------------+--------------------------------+
| guzzlehttp/guzzle | CVE-2016-5385 | MEDIUM | 6.2.0 | 6.2.1, 4.2.4, 5.3.1 | PHP: sets environmental |
| | | | | | variable based on user |
| | | | | | supplied Proxy request header |
+-------------------+------------------+----------+-------------------+---------------------+--------------------------------+

node-app/package-lock.json
==========================
Total: 4 (UNKNOWN: 0, LOW: 0, MEDIUM: 3, HIGH: 1, CRITICAL: 0)

+---------+------------------+----------+-------------------+---------------+--------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+---------+------------------+----------+-------------------+---------------+--------------------------------+
| jquery | CVE-2019-5428 | MEDIUM | 3.3.9 | >=3.4.0 | Modification of |
| | | | | | Assumed-Immutable Data (MAID) |
+ +------------------+ + + +--------------------------------+
| | CVE-2019-11358 | | | | js-jquery: prototype pollution |
| | | | | | in object's prototype leading |
| | | | | | to denial of service or... |
+---------+------------------+----------+-------------------+---------------+--------------------------------+
| lodash | CVE-2018-16487 | HIGH | 4.17.4 | >=4.17.11 | lodash: Prototype pollution in |
| | | | | | utilities function |
+ +------------------+----------+ +---------------+ +
| | CVE-2018-3721 | MEDIUM | | >=4.17.5 | |
| | | | | | |
+---------+------------------+----------+-------------------+---------------+--------------------------------+

python-app/Pipfile.lock
=======================
Total: 1 (UNKNOWN: 0, LOW: 0, MEDIUM: 1, HIGH: 0, CRITICAL: 0)

+---------+------------------+----------+-------------------+---------------+------------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+---------+------------------+----------+-------------------+---------------+------------------------------------+
| django | CVE-2019-6975 | MEDIUM | 2.0.9 | 2.0.11 | python-django: |
| | | | | | memory exhaustion in |
| | | | | | django.utils.numberformat.format() |
+---------+------------------+----------+-------------------+---------------+------------------------------------+


Scan an image file
$ docker save ruby:2.3.0-alpine3.9 -o ruby-2.3.0.tar
$ trivy --input ruby-2.3.0.tar


Result
2019-05-16T12:45:57.332+0900    INFO    Updating vulnerability database...
2019-05-16T12:45:59.119+0900 INFO Detecting Debian vulnerabilities...

ruby-2.3.0.tar (debian 8.4)
===========================
Total: 7447 (UNKNOWN: 5, LOW: 326, MEDIUM: 5695, HIGH: 1316, CRITICAL: 105)

+------------------------------+---------------------+----------+----------------------------+----------------------------------+-----------------------------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+------------------------------+---------------------+----------+----------------------------+----------------------------------+-----------------------------------------------------+
| apt | CVE-2019-3462 | CRITICAL | 1.0.9.8.3 | 1.0.9.8.5 | Incorrect sanitation of the |
| | | | | | 302 redirect field in HTTP |
| | | | | | transport method of... |
+ +---------------------+----------+ +----------------------------------+-----------------------------------------------------+
| | CVE-2016-1252 | MEDIUM | | 1.0.9.8.4 | The apt package in Debian |
| | | | | | jessie before 1.0.9.8.4, in |
| | | | | | Debian unstable before... |
+ +---------------------+----------+ +----------------------------------+-----------------------------------------------------+
| | CVE-2011-3374 | LOW | | | |
+------------------------------+---------------------+----------+----------------------------+----------------------------------+-----------------------------------------------------+
| bash | CVE-2016-7543 | HIGH | 4.3-11 | 4.3-11+deb8u1 | bash: Specially crafted |
| | | | | | SHELLOPTS+PS4 variables allows |
| | | | | | command substitution |
+ +---------------------+ + +----------------------------------+-----------------------------------------------------+
| | CVE-2019-9924 | | | 4.3-11+deb8u2 | bash: BASH_CMD is writable in |
| | | | | | restricted bash shells |
+ +---------------------+----------+ +----------------------------------+-----------------------------------------------------+
| | CVE-2016-0634 | MEDIUM | | 4.3-11+deb8u1 | bash: Arbitrary code execution |
| | | | | | via malicious hostname |
+ +---------------------+----------+ +----------------------------------+-----------------------------------------------------+
| | CVE-2016-9401 | LOW | | 4.3-11+deb8u2 | bash: popd controlled free |
+ +---------------------+ + +----------------------------------+-----------------------------------------------------+
| | TEMP-0841856-B18BAF | | | | |
+------------------------------+---------------------+----------+----------------------------+----------------------------------+-----------------------------------------------------+
...


Save the results as JSON
$ trivy -f json -o results.json golang:1.12-alpine


Result
2019-05-16T01:46:31.777+0900    INFO    Updating vulnerability database...
2019-05-16T01:47:03.007+0900 INFO Detecting Alpine vulnerabilities...

JSON
[
{
"Target": "php-app/composer.lock",
"Vulnerabilities": null
},
{
"Target": "node-app/package-lock.json",
"Vulnerabilities": [
{
"VulnerabilityID": "CVE-2018-16487",
"PkgName": "lodash",
"InstalledVersion": "4.17.4",
"FixedVersion": "\u003e=4.17.11",
"Title": "lodash: Prototype pollution in utilities function",
"Description": "A prototype pollution vulnerability was found in lodash \u003c4.17.11 where the functions merge, mergeWith, and defaultsDeep can be tricked into adding or modifying properties of Object.prototype.",
"Severity": "HIGH",
"References": [
"https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-16487",
]
}
]
},
{
"Target": "trivy-ci-test (alpine 3.7.1)",
"Vulnerabilities": [
{
"VulnerabilityID": "CVE-2018-1 6840",
"PkgName": "curl",
"InstalledVersion": "7.61.0-r0",
"FixedVersion": "7.61.1-r1",
"Title": "curl: Use-after-free when closing \"easy\" handle in Curl_close()",
"Description": "A heap use-after-free flaw was found in curl versions from 7.59.0 through 7.61.1 in the code related to closing an easy handle. ",
"Severity": "HIGH",
"References": [
"https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-16840",
]
},
{
"VulnerabilityID": "CVE-2019-3822",
"PkgName": "curl",
"InstalledVersion": "7.61.0-r0",
"FixedVersion": "7.61.1-r2",
"Title": "curl: NTLMv2 type-3 header stack buffer overflow",
"Description": "libcurl versions from 7.36.0 to before 7.64.0 are vulnerable to a stack-based buffer overflow. ",
"Severity": "HIGH",
"References": [
"https:/ /curl.haxx.se/docs/CVE-2019-3822.html",
"https://lists.apache.org/thread.html/8338a0f605bdbb3a6098bb76f666a95fc2b2f53f37fa1ecc89f1146f@%3Cdevnull.infra.apache.org%3E"
]
},
{
"VulnerabilityID": "CVE-2018-16839",
"PkgName": "curl",
"InstalledVersion": "7.61.0-r0",
"FixedVersion": "7.61.1-r1",
"Title": "curl: Integer overflow leading to heap-based buffer overflow in Curl_sasl_create_plain_message()",
"Description": "Curl versions 7.33.0 through 7.61.1 are vulnerable to a buffer overrun in the SASL authentication code that may lead to denial of service.",
"Severity": "HIGH",
"References": [
"https://github.com/curl/curl/commit/f3a24d7916b9173c69a3e0ee790102993833d6c5",
]
},
{
"VulnerabilityID": "CVE-2018-19486",
"PkgName": "git",
"InstalledVersion": "2.15.2-r0",
"FixedVersion": "2.15.3-r0",
"Title": "git: Improper handling of PATH allows for commands to be executed from the current directory",
"Description": "Git before 2.19.2 on Linux and UNIX executes commands from the current working directory (as if '.' were at the end of $PATH) in certain cases involving the run_command() API and run-command.c, because there was a dangerous change from execvp to execv during 2017.",
"Severity": "HIGH",
"References": [
"https://usn.ubuntu.com/3829-1/",
]
},
{
"VulnerabilityID": "CVE-2018-17456",
"PkgName": "git",
"InstalledVersion": "2.15.2-r0",
"FixedVersion": "2.15.3-r0",
"Title": "git: arbitrary code execution via .gitmodules",
"Description": "Git before 2.14.5, 2.15.x before 2.15.3, 2.16.x before 2.16.5, 2.17.x before 2.17.2, 2.18.x before 2.18.1, and 2.19.x before 2.19.1 allows remote code execution during processing of a recursive \"git clone\" of a superproject if a .gitmodules file has a URL field beginning with a '-' character.",
"Severity": "HIGH",
"References": [
"http://www.securitytracker.com/id/1041811",
]
}
]
},
{
"Target": "python-app/Pipfile.lock",
"Vulnerabilities": null
},
{
"Target": "ruby-app/Gemfile.lock",
"Vulnerabilities": null
},
{
"Target": "rust-app/Cargo.lock",
"Vulnerabilities": null
}
]
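The JSON output is convenient for post-processing. As a minimal sketch (assuming jq is installed and that results.json was produced with -f json -o results.json as shown above), the HIGH severity findings can be pulled out like this:
$ jq -r '.[] | .Target as $t | (.Vulnerabilities // [])[] | select(.Severity == "HIGH") | "\($t): \(.VulnerabilityID) \(.PkgName)"' results.json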


Filter the vulnerabilities by severities
$ trivy --severity HIGH,CRITICAL ruby:2.3.0


Result
2019-05-16T01:51:46.255+0900    INFO    Updating vulnerability database...
2019-05-16T01:51:49.213+0900 INFO Detecting Debian vulnerabilities...

ruby:2.3.0 (debian 8.4)
=======================
Total: 1785 (UNKNOWN: 0, LOW: 0, MEDIUM: 0, HIGH: 1680, CRITICAL: 105)

+-----------------------------+------------------+----------+---------------------------+----------------------------------+-------------------------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+-----------------------------+------------------+----------+---------------------------+----------------------------------+-------------------------------------------------+
| apt | CVE-2019-3462 | CRITICAL | 1.0.9.8.3 | 1.0.9.8.5 | Incorrect sanitation of the |
| | | | | | 302 redirect field in HTTP |
| | | | | | transport method of... |
+-----------------------------+------------------+----------+---------------------------+----------------------------------+-------------------------------------------------+
| bash | CVE-2019-9924 | HIGH | 4.3-11 | 4.3-11+deb8u2 | bash: BASH_CMD is writable in |
| | | | | | restricted bash shells |
+ +------------------+ + +----------------------------------+-------------------------------------------------+
| | CVE-2016-7543 | | | 4.3-11+deb8u1 | bash: Specially crafted |
| | | | | | SHELLOPTS+PS4 variables allows |
| | | | | | command substitution |
+-----------------------------+------------------+ +---------------------------+----------------------------------+-------------------------------------------------+
| binutils | CVE-2017-8421 | | 2.25-5 | | binutils: Memory exhaustion in |
| | | | | | objdump via a crafted PE file |
+ +------------------+ + +----------------------------------+-------------------------------------------------+
| | CVE-2017-14930 | | | | binutils: Memory leak in |
| | | | | | decode_line_info |
+ +------------------+ + +----------------------------------+-------------------------------------------------+
| | CVE-2017-7614 | | | | binutils: NULL |
| | | | | | pointer dereference in |
| | | | | | bfd_elf_final_link function |
+ +------------------+ + +----------------------------------+-------------------------------------------------+
| | CVE-2014-9939 | | | | binutils: buffer overflow in |
| | | | | | ihex.c |
+ +------------------+ + +----------------------------------+-------------------------------------------------+
| | CVE-2017-13716 | | | | binutils: Memory leak with the |
| | | | | | C++ symbol demangler routine |
| | | | | | in libiberty |
+ +------------------+ + +----------------------------------+-------------------------------------------------+
| | CVE-2018-12699 | | | | binutils: heap-based buffer |
| | | | | | overflow in finish_stab in |
| | | | | | stabs.c |
+-----------------------------+------------------+ +---------------------------+----------------------------------+-------------------------------------------------+
| bsdutils | CVE-2015-5224 | | 2.25.2-6 | | util-linux: File name |
| | | | | | collision due to incorrect |
| | | | | | mkstemp use |
+ +------------------+ + +----------------------------------+-------------------------------------------------+
| | CVE-2016-2779 | | | | util-linux: runuser tty hijack |
| | | | | | via TIOCSTI ioctl |
+-----------------------------+------------------+----------+---------------------------+----------------------------------+-------------------------------------------------+


Filter the vulnerabilities by type
$ trivy --vuln-type os ruby:2.3.0
Available values:
  • library
  • os

Result
2019-05-22T19:36:50.530+0200 INFO Updating vulnerability database...
2019-05-22T19:36:51.681+0200 INFO Detecting Alpine vulnerabilities...
2019-05-22T19:36:51.685+0200 INFO Updating npm Security DB...
2019-05-22T19:36:52.389+0200 INFO Detecting npm vulnerabilities...
2019-05-22T19:36:52.390+0200 INFO Updating pipenv Security DB...
2019-05-22T19:36:53.406+0200 INFO Detecting pipenv vulnerabilities...

ruby:2.3.0 (debian 8.4)
Total: 4751 (UNKNOWN: 1, LOW: 150, MEDIUM: 3504, HIGH: 1013, CRITICAL: 83)

+---------+------------------+----------+-------------------+---------------+----------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+---------+------------------+----------+-------------------+---------------+----------------------------------+
| curl | CVE-2018-14618 | CRITICAL | 7.61.0-r0 | 7.61.1-r0 | curl: NTLM password overflow |
| | | | | | via integer overflow |
+ +------------------+----------+ +---------------+----------------------------------+
| | CVE-2018-16839 | HIGH | | 7.61.1-r1 | curl: Integer overflow leading |
| | | | | | to heap-based buffer overflow in |
| | | | | | Curl_sasl_create_plain_message() |
+ +------------------+ + +---------------+----------------------------------+
| | CVE-2019-3822 | | | 7.61.1-r2 | curl: NTLMv2 type-3 header |
| | | | | | stack buffer overflow |
+ +------------------+ + +---------------+----------------------------------+
| | CVE-2018-16840 | | | 7.61.1-r1 | curl: Use-after-free when |
| | | | | | closing "easy" handle in |
| | | | | | Curl_close() |
+ +------------------+----------+ +---------------+----------------------------------+
| | CVE-2019-3823 | MEDIUM | | 7.61.1-r2 | curl: SMTP end-of-response |
| | | | | | out-of-bounds read |
+ +------------------+ + + +----------------------------------+
| | CVE-2018-16890 | | | | curl: NTLM type-2 heap |
| | | | | | out-of-bounds buffer read |
+ +------------------+ + +---------------+----------------------------------+
| | CVE-2018-16842 | | | 7.61.1-r1 | curl: Heap-based buffer |
| | | | | | over-read in the curl tool |
| | | | | | warning formatting |
+---------+------------------+----------+-------------------+---------------+----------------------------------+
| git | CVE-2018-17456 | HIGH | 2.15.2-r0 | 2.15.3-r0 | git: arbitrary code execution |
| | | | | | via .gitmodules |
+ +------------------+ + + +----------------------------------+
| | CVE-2018-19486 | | | | git: Improper handling of |
| | | | | | PATH allows for commands to be |
| | | | | | executed from... |
+---------+------------------+----------+-------------------+---------------+----------------------------------+
| libssh2 | CVE-2019-3855 | CRITICAL | 1.8.0-r2 | 1.8.1-r0 | libssh2: Integer overflow in |
| | | | | | transport read resulting in |
| | | | | | out of bounds write... |
+ +------------------+----------+ + +----------------------------------+
| | CVE-2019-3861 | MEDIUM | | | libssh2: Out-of-bounds reads |
| | | | | | with specially crafted SSH |
| | | | | | packets |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-3857 | | | | libssh2: Integer overflow in |
| | | | | | SSH packet processing channel |
| | | | | | resulting in out of... |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-3856 | | | | libssh2: Integer overflow in |
| | | | | | keyboard interactive handling |
| | | | | | resulting in out of bounds... |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-3863 | | | | libssh2: Integer overflow |
| | | | | | in user authenticate |
| | | | | | keyboard interactive allows |
| | | | | | out-of-bounds writes |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-3862 | | | | libssh2: Out-of-bounds memory |
| | | | | | comparison with specially |
| | | | | | crafted message channel |
| | | | | | request |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-3860 | | | | libssh2: Out-of-bounds reads |
| | | | | | with specially crafted SFTP |
| | | | | | packets |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-3858 | | | | libssh2: Zero-byte allocation |
| | | | | | with a specially crafted SFTP |
| | | | | | packed leading to an... |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-3859 | | | | libssh2: Unchecked use of |
| | | | | | _libssh2_packet_require and |
| | | | | | _libssh2_packet_requirev |
| | | | | | resulting in out-of-bounds |
| | | | | | read |
+---------+------------------+ +-------------------+---------------+----------------------------------+
| libxml2 | CVE-2018-14404 | | 2.9.7-r0 | 2.9.8-r1 | libxml2: NULL pointer |
| | | | | | dereference in |
| | | | | | xpath.c:xmlXPathCompOpEval() |
| | | | | | can allow attackers to cause |
| | | | | | a... |
+ +------------------+ + + +----------------------------------+
| | CVE-2018-14567 | | | | libxml2: Infinite loop when |
| | | | | | --with-lzma is used allows for |
| | | | | | denial of service... |
+ +------------------+----------+ + +----------------------------------+
| | CVE-2018-9251 | LOW | | | libxml2: infinite loop in |
| | | | | | xz_decomp function in xzlib.c |
+---------+------------------+----------+-------------------+---------------+----------------------------------+
| openssh | CVE-2019-6109 | MEDIUM | 7.5_p1-r9 | 7.5_p1-r10 | openssh: Missing character |
| | | | | | encoding in progress display |
| | | | | | allows for spoofing of scp... |
+ +------------------+ + + +----------------------------------+
| | CVE-2019-6111 | | | | openssh: Improper validation |
| | | | | | of object names allows |
| | | | | | malicious server to overwrite |
| | | | | | files... |
+ +------------------+----------+ + +----------------------------------+
| | CVE-2018-20685 | LOW | | | openssh: scp client improper |
| | | | | | directory name validation |
+---------+------------------+----------+-------------------+---------------+----------------------------------+
| sqlite | CVE-2018-20346 | MEDIUM | 3.21.0-r1 | 3.25.3-r0 | CVE-2018-20505 CVE-2018-20506 |
| | | | | | sqlite: Multiple flaws in |
| | | | | | sqlite which can be triggered |
| | | | | | via... |
+---------+------------------+----------+-------------------+---------------+----------------------------------+
| tar | CVE-2018-20482 | LOW | 1.29-r1 | 1.31-r0 | tar: Infinite read loop in |
| | | | | | sparse_dump_region function in |
| | | | | | sparse.c |
+---------+------------------+----------+-------------------+---------------+----------------------------------+

Skip update of vulnerability DB
Trivy always updates its vulnerability database when it starts operating. This is usually fast, as it only downloads the differences since the last update. But if you want to skip even that, use the --skip-update option.
$ trivy --skip-update python:3.4-alpine3.9


Result
2019-05-16T12:48:08.703+0900    INFO    Detecting Alpine vulnerabilities...

python:3.4-alpine3.9 (alpine 3.9.2)
===================================
Total: 1 (UNKNOWN: 0, LOW: 0, MEDIUM: 1, HIGH: 0, CRITICAL: 0)

+---------+------------------+----------+-------------------+---------------+--------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+---------+------------------+----------+-------------------+---------------+--------------------------------+
| openssl | CVE-2019-1543 | MEDIUM | 1.1.1a-r1 | 1.1.1b-r1 | openssl: ChaCha20-Poly1305 |
| | | | | | with long nonces |
+---------+------------------+----------+-------------------+---------------+--------------------------------+


Update only specified distributions
By default, Trivy always updates its vulnerability database for all distributions. Use the --only-update option if you want to update the database only for the specified distributions.
$ trivy --only-update alpine,debian python:3.4-alpine3.9
$ trivy --only-update alpine python:3.4-alpine3.9


Result
2019-05-21T19:37:06.301+0900    INFO    Updating vulnerability database...
2019-05-21T19:37:07.793+0900 INFO Updating alpine data...
2019-05-21T19:37:08.127+0900 INFO Detecting Alpine vulnerabilities...

python:3.4-alpine3.9 (alpine 3.9.2)
===================================
Total: 1 (UNKNOWN: 0, LOW: 0, MEDIUM: 1, HIGH: 0, CRITICAL: 0)

+---------+------------------+----------+-------------------+---------------+--------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+---------+------------------+----------+-------------------+---------------+--------------------------------+
| openssl | CVE-2019-1543 | MEDIUM | 1.1.1a-r1 | 1.1.1b-r1 | openssl: ChaCha20-Poly1305 |
| | | | | | with long nonces |
+---------+------------------+----------+-------------------+---------------+--------------------------------+


Ignore unfixed vulnerabilities
By default, Trivy also detects unpatched/unfixed vulnerabilities. This means you can't fix these vulnerabilities even if you update all packages. If you would like to ignore them, use the --ignore-unfixed option.
$ trivy --ignore-unfixed ruby:2.3.0


Result
2019-05-16T12:49:52.656+0900    INFO    Updating vulnerability database...
2019-05-16T12:50:14.786+0900 INFO Detecting Debian vulnerabilities...

ruby:2.3.0 (debian 8.4)
=======================
Total: 4730 (UNKNOWN: 1, LOW: 145, MEDIUM: 3487, HIGH: 1014, CRITICAL: 83)

+------------------------------+------------------+----------+----------------------------+----------------------------------+-----------------------------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+------------------------------+------------------+----------+----------------------------+----------------------------------+-----------------------------------------------------+
| apt | CVE-2019-3462 | CRITICAL | 1.0.9.8.3 | 1.0.9.8.5 | Incorrect sanitation of the |
| | | | | | 302 redirect field in HTTP |
| | | | | | transport method of... |
+ +------------------+----------+ +----------------------------------+-----------------------------------------------------+
| | CVE-2016-1252 | MEDIUM | | 1.0.9.8.4 | The apt package in Debian |
| | | | | | jessie before 1.0.9.8.4, in |
| | | | | | Debian unstable before... |
+------------------------------+------------------+----------+----------------------------+----------------------------------+-----------------------------------------------------+
| bash | CVE-2019-9924 | HIGH | 4.3-11 | 4.3-11+deb8u2 | bash: BASH_CMD is writable in |
| | | | | | restricted bash shells |
+ +------------------+ + +----------------------------------+-----------------------------------------------------+
| | CVE-2016-7543 | | | 4.3-11+deb8u1 | bash: Specially crafted |
| | | | | | SHELLOPTS+PS4 variables allows |
| | | | | | command substitution |
+ +------------------+----------+ + +-----------------------------------------------------+
| | CVE-2016-0634 | MEDIUM | | | bash: Arbitrary code execution |
| | | | | | via malicious hostname |
+ +------------------+----------+ +----------------------------------+-----------------------------------------------------+
| | CVE-2016-9401 | LOW | | 4.3-11+deb8u2 | bash: popd controlled free |
+------------------------------+------------------+----------+----------------------------+----------------------------------+-----------------------------------------------------+
...


Specify exit code
By default, Trivy exits with code 0 even when vulnerabilities are detected. Use the --exit-code option if you want to exit with a non-zero exit code.
$ trivy --exit-code 1 python:3.4-alpine3.9


Result
2019-05-16T12:51:43.500+0900    INFO    Updating vulnerability database...
2019-05-16T12:52:00.387+0900 INFO Detecting Alpine vulnerabilities...

python:3.4-alpine3.9 (alpine 3.9.2)
===================================
Total: 1 (UNKNOWN: 0, LOW: 0, MEDIUM: 1, HIGH: 0, CRITICAL: 0)

+---------+------------------+----------+-------------------+---------------+--------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+---------+------------------+----------+-------------------+---------------+--------------------------------+
| openssl | CVE-2019-1543 | MEDIUM | 1.1.1a-r1 | 1.1.1b-r1 | openssl: ChaCha20-Poly1305 |
| | | | | | with long nonces |
+---------+------------------+----------+-------------------+---------------+--------------------------------+


This option is useful for CI/CD. In the following example, the test will fail only when a critical vulnerability is found.
$ trivy --exit-code 0 --severity MEDIUM,HIGH ruby:2.3.0
$ trivy --exit-code 1 --severity CRITICAL ruby:2.3.0
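As a minimal sketch of wiring this into a CI step (app:latest is a placeholder for whatever image the pipeline just built; --no-progress is optional and only keeps the log clean):
#!/bin/sh
# report MEDIUM/HIGH findings without failing the job
trivy --exit-code 0 --severity MEDIUM,HIGH --no-progress app:latest
# fail the job only if a CRITICAL vulnerability is found
trivy --exit-code 1 --severity CRITICAL --no-progress app:latest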

Ignore the specified vulnerabilities
Use .trivyignore.
$ cat .trivyignore
# Accept the risk
CVE-2018-14618

# No impact in our settings
CVE-2019-1543

$ trivy python:3.4-alpine3.9


Result
2019-05-16T12:53:10.076+0900    INFO    Updating vulnerability database...
2019-05-16T12:53:28.134+0900 INFO Detecting Alpine vulnerabilities...

python:3.4-alpine3.9 (alpine 3.9.2)
===================================
Total: 0 (UNKNOWN: 0, LOW: 0, MEDIUM: 0, HIGH: 0, CRITICAL: 0)


Specify cache directory
$ trivy --cache-dir /tmp/trivy/ python:3.4-alpine3.9

Clear image caches
The --clear-cache option removes image caches. This option is useful if the image which has the same tag is updated (such as when using latest tag).
$ trivy --clear-cache python:3.7


Result
2019-05-16T12:55:24.749+0900    INFO    Removing image caches...
2019-05-16T12:55:24.769+0900 INFO Updating vulnerability database...
2019-05-16T12:56:14.055+0900 INFO Detecting Debian vulnerabilities...

python:3.7 (debian 9.9)
=======================
Total: 3076 (UNKNOWN: 0, LOW: 127, MEDIUM: 2358, HIGH: 578, CRITICAL: 13)

+------------------------------+---------------------+----------+--------------------------+------------------+-------------------------------------------------------+
| LIBRARY | VULNERABILITY ID | SEVERITY | INSTALLED VERSION | FIXED VERSION | TITLE |
+------------------------------+---------------------+----------+--------------------------+------------------+-------------------------------------------------------+
| apt | CVE-2011-3374 | LOW | 1.4.9 | | |
+------------------------------+---------------------+ +--------------------------+------------------+-------------------------------------------------------+
| bash | TEMP-0841856-B18BAF | | 4.4-5 | | |
+------------------------------+---------------------+----------+--------------------------+------------------+-------------------------------------------------------+
...


Reset
The --reset option removes all caches and database. After this, it takes a long time as the vulnerability database needs to be rebuilt locally.
$ trivy --reset


Result
2019-05-16T13:05:31.935+0900    INFO    Resetting...


Continuous Integration (CI)
Scan your image built in Travis CI/CircleCI. The test will fail if a vulnerability is found. If you don't want the test to fail, specify --exit-code 0.
Note: The first run will take a while; subsequent runs are faster thanks to the cache.

Travis CI
$ cat .travis.yml
services:
- docker

env:
global:
- COMMIT=${TRAVIS_COMMIT::8}

before_install:
- docker build -t trivy-ci-test:${COMMIT} .
- export VERSION=$(curl --silent "https://api.github.com/repos/aquasecurity/trivy/releases/latest" | grep '"tag_name":' | sed -E 's/.*"v([^"]+)".*/\1/')
- wget https://github.com/aquasecurity/trivy/releases/download/v${VERSION}/trivy_${VERSION}_Linux-64bit.tar.gz
- tar zxvf trivy_${VERSION}_Linux-64bit.tar.gz
script:
- ./trivy --exit-code 0 --severity HIGH --no-progress --auto-refresh trivy-ci-test:${COMMIT}
- ./trivy --exit-code 1 --severity CRITICAL --no-progress --auto-refresh trivy-ci-test:${COMMIT}
cache:
directories:
- $HOME/.cache/trivy
Example: https://travis-ci.org/aquasecurity/trivy-ci-test
Repository: https://github.com/aquasecurity/trivy-ci-test

CircleCI
$ cat .circleci/config.yml
jobs:
build:
docker:
- image: docker:18.09-git
steps:
- checkout
- setup_remote_docker
- restore_cache:
key: vulnerability-db
- run:
name: Build image
command: docker build -t trivy-ci-test:${CIRCLE_SHA1} .
- run:
name: Install trivy
command: |
apk add --update curl
VERSION=$(
curl --silent "https://api.github.com/repos/aquasecurity/trivy/releases/latest" | \
grep '"tag_name":' | \
sed -E 's/.*"v([^"]+)".*/\1/'
)

wget https://github.com/aquasecurity/trivy/releases/download/v${VERSION}/trivy_${VERSION}_Linux-64bit.tar.gz
tar zxvf trivy_${VERSION}_Linux-64bit.tar.gz
mv trivy /usr/local/bin
- run:
name: Scan the local image with trivy
command: trivy --exit-code 0 --no-progress --auto-refresh trivy-ci-test:${CIRCLE_SHA1}
- save_cache:
key: vulnerability-db
paths:
- $HOME/.cache/trivy
workflows:
version: 2
release:
jobs:
- build
Example: https://circleci.com/gh/aquasecurity/trivy-ci-test
Repository: https://github.com/aquasecurity/trivy-ci-test

Authorization for Private Docker Registry
Trivy can download images from a private registry without Docker or any other 3rd-party tools installed, which makes it easy to run in a CI process.
All you have to do is install Trivy and set the ENV vars. Note, however, that putting credentials in ENV vars on your local machine is not recommended.

Docker Hub
Docker Hub needs TRIVY_AUTH_URL, TRIVY_USERNAME and TRIVY_PASSWORD. You don't need to set these ENV vars when downloading from a public repository.
export TRIVY_AUTH_URL=https://registry.hub.docker.com
export TRIVY_USERNAME={DOCKERHUB_USERNAME}
export TRIVY_PASSWORD={DOCKERHUB_PASSWORD}
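With the variables above exported, a private image is scanned exactly like a public one; a minimal sketch with a hypothetical repository name:
# yourorg/private-app is a placeholder for your own private repository
trivy yourorg/private-app:latest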

Amazon ECR (Elastic Container Registry)
Trivy uses the AWS SDK, so you don't need to install the aws CLI tool. You can use the AWS CLI's standard ENV vars.
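A minimal sketch, using the standard AWS SDK environment variables and a placeholder ECR image reference (the account ID, region and repository name below are hypothetical):
export AWS_ACCESS_KEY_ID={YOUR_ACCESS_KEY_ID}
export AWS_SECRET_ACCESS_KEY={YOUR_SECRET_ACCESS_KEY}
export AWS_DEFAULT_REGION=us-east-1

# 123456789012 and my-repo are placeholders for your own account and repository
trivy 123456789012.dkr.ecr.us-east-1.amazonaws.com/my-repo:latest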

GCR (Google Container Registry)
Trivy uses the Google Cloud SDK, so you don't need to install the gcloud command.
If you want to scan images in a target project's repository, authenticate via GOOGLE_APPLICATION_CREDENTIALS.
# TRIVY_USERNAME must be set to an empty string
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/credential.json
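A minimal sketch, scanning a hypothetical GCR image (your-project/your-app is a placeholder):
export TRIVY_USERNAME=""
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/credential.json

trivy gcr.io/your-project/your-app:latest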

Self Hosted Registry (BasicAuth)
BasicAuth server needs TRIVY_USERNAME and TRIVY_PASSWORD.
export TRIVY_USERNAME={USERNAME}
export TRIVY_PASSWORD={PASSWORD}

# if you want to use port 80, use non-SSL
export TRIVY_NON_SSL=true

Vulnerability Detection

OS Packages
Unfixed/unfixable vulnerabilities are those for which the distribution has not yet provided a patch.
| OS | Supported Versions | Target Packages | Detection of unfixed vulnerabilities |
| Alpine Linux | 2.2 - 2.7, 3.0 - 3.10 | Installed by apk | NO |
| Red Hat Universal Base Image | 7, 8 | Installed by yum/rpm | YES |
| Red Hat Enterprise Linux | 6, 7, 8 | Installed by yum/rpm | YES |
| CentOS | 6, 7 | Installed by yum/rpm | YES |
| Debian GNU/Linux | wheezy, jessie, stretch, buster | Installed by apt/apt-get/dpkg | YES |
| Ubuntu | 12.04, 14.04, 16.04, 18.04, 18.10, 19.04 | Installed by apt/apt-get/dpkg | YES |
RHEL and CentOS package information is stored in a binary format, and Trivy uses the rpm executable to parse this information when scanning an image based on RHEL or CentOS. The Trivy container image includes rpm, and the installers include it as a dependency. If you installed the trivy binary using wget or curl, or if you built it from source, you will also need to ensure that rpm is available.
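A quick sanity-check sketch before scanning a RHEL/CentOS based image with a manually installed binary (centos:7 is just an example image):
# verify that the rpm executable is on the PATH
command -v rpm >/dev/null || echo "rpm not found - install it before scanning RHEL/CentOS images"

trivy centos:7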

Application Dependencies
Trivy automatically detects the following files in the container and scans vulnerabilities in the application dependencies.
  • Gemfile.lock
  • Pipfile.lock
  • poetry.lock
  • composer.lock
  • package-lock.json
  • yarn.lock
  • Cargo.lock
The path of these files does not matter.
Example: https://github.com/aquasecurity/trivy-ci-test/blob/master/Dockerfile
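As a minimal sketch (my-app and the project layout are hypothetical): if the lock files are copied into the image during the build, Trivy will find them wherever they end up:
# build an image from a project that contains e.g. package-lock.json and Gemfile.lock
docker build -t my-app:latest .

# Trivy detects the lock files inside the image regardless of their path
trivy my-app:latest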

Data source

Usage
NAME:
trivy - A simple and comprehensive vulnerability scanner for containers
USAGE:
trivy [options] image_name
VERSION:
0.1.6
OPTIONS:
--format value, -f value format (table, json) (default: "table")
--input value, -i value input file path instead of image name
--severity value, -s value severities of vulnerabilities to be displayed (comma separated) (default: "UNKNOWN,LOW,MEDIUM,HIGH,CRITICAL")
--output value, -o value output file name
--exit-code value Exit code when vulnerabilities were found (default: 0)
--skip-update skip db update
--only-update value update db only specified distribution (comma separated)
--reset remove all caches and database
--clear-cache, -c clear image caches
--quiet, -q suppress progress bar and log output
--no-progress suppress progress bar
--ignore-unfixed display only fixed vulnerabilities
--refresh refresh DB (usually used after version update of trivy)
--auto-refresh refresh DB automatically when updating version of trivy
--debug, -d debug mode
--vuln-type value comma-separated list of vulnerability types (os,library) (default: "os,library")
--cache-dir value cache directory (default: "/path/to/cache")
--help, -h show help
--version, -v print the version


Migration
On 19 August 2019, Trivy's repositories moved from knqyf263/trivy to aquasecurity/trivy. If you previously installed Trivy you should update any scripts or package manager records as described in this section.

Overview
If you have a script that installs Trivy (for example into your CI pipelines) you should update it to obtain it from the new location by replacing knqyf263/trivy with aquasecurity/trivy.
For example:
# Before
$ wget https://github.com/knqyf263/trivy/releases/download/v${VERSION}/trivy_${VERSION}_Linux-64bit.tar.gz

# After
$ wget https://github.com/aquasecurity/trivy/releases/download/v${VERSION}/trivy_${VERSION}_Linux-64bit.tar.gz

CentOS/RedHat
Use https://aquasecurity.github.io instead of https://knqyf263.github.io.
$ yum remove trivy
$ sed -i s/knqyf263/aquasecurity/g /etc/yum.repos.d/trivy.repo
$ yum update
$ yum install trivy

Debian/Ubuntu
Use https://aquasecurity.github.io instead of https://knqyf263.github.io.
$ apt-get remove --purge trivy
$ sed -i s/knqyf263/aquasecurity/g /etc/apt/sources.list.d/trivy.list
$ apt-get update
$ apt-get install trivy

Homebrew
Tap aquasecurity/trivy
$ brew uninstall --force trivy
$ brew untap knqyf263/trivy
$ brew install aquasecurity/trivy/trivy

Binary (Including Windows)
No need to fix.

Others

Detected version update of trivy. Please try again with --refresh option
Try again with --refresh option:
$ trivy --refresh alpine:3.9

Unknown error
Try again with --reset option:
$ trivy --reset

Credits

Author
Teppei Fukuda (knqyf263)


Xray - A Tool For Recon, Mapping And OSINT Gathering From Public Networks


XRay is a tool for network OSINT gathering, its goal is to make some of the initial tasks of information gathering and network mapping automatic.

How Does it Work?
XRay is a very simple tool, it works this way:
  1. It'll bruteforce subdomains using a wordlist and DNS requests.
  2. For every subdomain/ip found, it'll use Shodan to gather open ports and other intel.
  3. If a ViewDNS API key is provided, for every subdomain historical data will be collected.
  4. For every unique ip address, and for every open port, it'll launch specific banner grabbers and info collectors.
  5. Eventually the data is presented to the user on the web ui.
Grabbers and Collectors
  • HTTP Server, X-Powered-By and Location headers.
  • HTTP and HTTPS robots.txt disallowed entries.
  • HTTPS certificates chain ( with recursive subdomain grabbing from CN and Alt Names ).
  • HTML title tag.
  • DNS version.bind. and hostname.bind. records.
  • MySQL, SMTP, FTP, SSH, POP and IRC banners.

Notes
Shodan API Key
The shodan.io API key parameter ( -shodan-key KEY ) is optional; however, if it is not specified, no service fingerprinting will be performed and a lot less information will be shown (it will basically just be DNS subdomain enumeration).
ViewDNS API Key
If a ViewDNS API key parameter ( -viewdns-key KEY ) is passed, domain historical data will also be retrieved.
Anonymity and Legal Issues
The software will rely on your main DNS resolver in order to enumerate subdomains, also, several connections might be directly established from your host to the computers of the network you're scanning in order to grab banners from open ports. Technically, you're just connecting to public addresses with open ports (and there's no port scanning involved, as such information is grabbed indirectly using Shodan API), but you know, someone might not like such behaviour.
If I were you, I'd find a way to proxify the whole process ... #justsaying

Building a Docker image
To build a Docker image with the latest version of XRay:
git clone https://github.com/evilsocket/xray.git
cd xray
docker build -t xraydocker .
Once built, XRay can be started within a Docker container using the following:
docker run --rm -it -p 8080:8080 xraydocker xray -address 0.0.0.0 -shodan-key shodan_key_here -domain example.com 

Manual Compilation
Make sure you are using Go >= 1.7, that your installation is working properly, that you have set the $GOPATH variable and you have appended $GOPATH/bin to your $PATH.
Then:
go get github.com/evilsocket/xray
cd $GOPATH/src/github.com/evilsocket/xray/
make
You'll find the executable in the build folder.

Usage
Usage: xray -shodan-key YOUR_SHODAN_API_KEY -domain TARGET_DOMAIN
Options:
-address string
IP address to bind the web ui server to. (default "127.0.0.1")
-consumers int
Number of concurrent consumers to use for subdomain enumeration. (default 16)
-domain string
Base domain to start enumeration from.
-port int
TCP port to bind the web ui server to. (default 8080)
-preserve-domain
Do not remove subdomain from the provided domain name.
-session string
Session file name. (default "<domain-name>-xray-session.json")
-shodan-key string
Shodan API key.
-viewdns-key string
ViewDNS API key.
-wordlist string
Wordlist file to use for enumeration. (default "wordlists/default.lst")
Example:
# xray -shodan-key yadayadayadapicaboo... -viewdns-key foobarsomethingsomething... -domain fbi.gov

____ ___
\ \/ /
\ RAY v 1.0.0b
/ by Simone 'evilsocket' Margaritelli
/___/\ \
\_/

@ Saving session to fbi.gov-xray-session.json
@ Web UI running on http://127.0.0.1:8080/

License
XRay was made with ♥ by Simone Margaritelli and it's released under the GPL 3 license.
The files in the wordlists folder have been taken from various open source tools across several weeks and I don't remember all of them. If you find the wordlist of your project here and want to be mentioned, feel free to open an issue or send a pull request.


Sparrow-Wifi - Next-Gen GUI-based WiFi And Bluetooth Analyzer For Linux


Sparrow-wifi has been built from the ground up to be the next generation 2.4 GHz and 5 GHz Wifi spectral awareness tool. At its most basic it provides a more comprehensive GUI-based replacement for tools like inSSIDer and linssid that runs specifically on linux. In its most comprehensive use cases, sparrow-wifi integrates wifi, software-defined radio (hackrf), advanced bluetooth tools (traditional and Ubertooth), traditional GPS (via gpsd), and drone/rover GPS via mavlink in one solution.

[NOTE: Check the Raspberry Pi section for updates. A setup script is now included to get the project running on Raspbian Stretch.]
Written entirely in Python3, Sparrow-wifi has been designed for the following scenarios:
  • Basic wifi SSID identification
  • Wifi source hunt - Switch from normal to hunt mode to get multiple samples per second and use the telemetry windows to track a wifi source
  • 2.4 GHz and 5 GHz spectrum view - Overlay spectrums from Ubertooth (2.4 GHz) or HackRF (2.4 GHz and 5 GHz) in real time on top of the wifi spectrum (invaluable in poor connectivity troubleshooting when overlapping wifi doesn't seem to be the cause)
  • Bluetooth identification - LE advertisement listening with standard bluetooth, full promiscuous mode in LE and classic bluetooth with Ubertooth
  • Bluetooth source hunt - Track LE advertisement sources or iBeacons with the telemetry window
  • iBeacon advertisement - Advertise your own iBeacons
  • Remote operations - An agent is included that exposes all of the GUI functionality remotely, so the GUI can talk to it over the network.
  • Drone/Rover operations - The agent can be run on systems such as a Raspberry Pi and flown on a drone (it's made several flights on a Solo 3DR), or attached to a rover in either GUI-controlled or autonomous scan/record modes.
  • The remote agent is JSON-based so it can be integrated with other applications
  • Import/Export - Ability to import and export to/from CSV and JSON for easy integration and re-visualization. You can also just run 'iw dev scan' and save it to a file and import that as well.
  • Produce Google maps when GPS coordinates are available for both discovered SSID's / bluetooth devices or to plot the wifi telemetry over time.
A few sample screenshots. The first is the main window showing a basic wifi scan, the second shows the telemetry/tracking window used for both Wifi and bluetooth tracking.

Installation
sparrow-wifi uses python3, qt5, and qtchart for the UI. On a standard debian variant you may already have python3 and qt5 installed; the only addition required to run it is qtchart. The following commands should get you up and running with wifi on both Ubuntu and Kali linux:
sudo apt-get install python3-pip gpsd gpsd-clients python3-tk python3-setuptools
sudo pip3 install QScintilla PyQtChart gps3 dronekit manuf python-dateutil numpy matplotlib
Some folks have been running sparrow with a python virtualenv; if you'd like to run it in an isolated python environment, the following sequence should get you up and running:
git clone https://github.com/ghostop14/sparrow-wifi
cd sparrow-wifi
virtualenv --python=python3 $HOME/sparrow
source $HOME/sparrow/bin/activate
pip3 install gps3 python-dateutil requests pyqt5 pyqtchart numpy matplotlib
sudo python3 sparrow-wifi.py
NOTE: If you're trying to run on a Raspberry Pi, see the Raspberry Pi section below. Only the remote agent has been run on a Pi, some of the GUI components wouldn't install / set up on the ARM platform.

Running sparrow-wifi
Because it needs to use the standard command-line tool 'iw' for wifi scans, you will need to run sparrow-wifi as root. Simply run this from the cloned directory:
sudo ./sparrow-wifi.py

WiFi Notes
One item of note on wifi scanning, especially in the 5 GHz range, is finding a card that works. This is not so much an issue with the 'iw' tool itself, but in more advanced configurations where monitoring mode is required it can become one.

Bluetooth
For folks familiar with WiFi but 'new' to Bluetooth scanning, bluetooth is different enough that some of what you may want to see based on wifi won't be available (and may seem a bit frustrating at first). It all fundamentally comes down to how bluetooth operates. Bluetooth uses frequency hopping across the entire 2.4 GHz range, so it doesn't present in nice clean single channel buckets like wifi does. To complicate things there is a low energy (BTLE) and Classic mode that are incompatible from an RF perspective, so generally a bluetooth adapter can only scan for one type or the other at any given time.
Bluetooth devices are also generally only discoverable when advertising (think broadcasting). The only other way to find bluetooth devices is with a device that can sniff all bluetooth packets out of the air, which standard bluetooth adapters don't do. That is where hardware like an Ubertooth comes in to get a better view of the bluetooth environment. And of course, if the devices aren't transmitting you wouldn't have anything to go off of. And if you have to catch packets being transmitted, you may need to scan/linger longer to see them, increasing scan frame times to as long as 30 seconds to a minute.
So with all that said, with a standard / built-in bluetooth adapter, Sparrow-wifi can do advertisement scanning for bluetooth low energy (BTLE) devices. If they're advertising their transmit power, it'll attempt a range calculation. This is what the latest iBeacon solutions and products do to be physically locatable. However, with multi-pathing, internal walls, etc., don't expect an extreme level of accuracy. As an added bonus, sparrow-wifi can also advertise its own iBeacons for tracking (this could be useful from a remote agent to turn on location discovery). However, not all bluetooth cards will advertise transmit power, so you may not always get range. If you do have an Ubertooth, sparrow-wifi can use it for promiscuous discovery of both BTLE and classic bluetooth devices. Of course there's a tradeoff: traditional LE scans update faster for tracking, which is easier for bluetooth 'hunt', while promiscuous mode can identify more devices at the expense of needing to linger longer to listen.
If you would like to scan for bluetooth, you'll need a few things:
  1. A bluetooth adapter (test with 'hcitool dev' to make sure it shows up). With an adapter you can do basic BTLE advertisement and iBeacon scans.
  2. [Optional] An Ubertooth for promiscuous discovery scans (BTLE and Classic Bluetooth)
    • Ubertooth tools installed and functioning (you can test it with ubertooth-specan-ui)
    • Blue Hydra installed into /opt/bluetooth/blue_hydra (mkdir /opt/bluetooth && cd /opt/bluetooth && git clone https://github.com/pwnieexpress/blue_hydra.git). Then make sure you've followed the blue_hydra installation instructions. You can test it with bin/blue_hydra. This must be in /opt/bluetooth/blue_hydra or the app won't find it.
I strongly recommend running 'hcitool lescan' from the command-line first to make sure everything is working okay. If you have an Ubertooth, run ubertooth-specan-ui and run blue_hydra to make sure those tools work properly before attempting in sparrow-wifi.
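A minimal pre-flight sketch of those checks (hci0 is assumed to be your adapter; skip the last two commands if you don't have an Ubertooth):
# confirm the adapter is present
hcitool dev

# quick BTLE advertisement scan; reset the interface if it errors out
sudo hcitool lescan || (sudo hciconfig hci0 down && sudo hciconfig hci0 up)

# Ubertooth only: spectrum viewer and blue_hydra check
ubertooth-specan-ui
cd /opt/bluetooth/blue_hydra && sudo bin/blue_hydra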
Some troubleshooting tips:
  • If you don't see any devices with a basic LE advertisement scan, try "hcitool lescan" from the command-line and see if you get any errors. If so address them there. Sometimes a quick "hciconfig hci0 down && hciconfig hci0 up" can fix it.
  • If you have an Ubertooth and don't see any spectrum try running ubertooth-specan or ubertooth-specan-ui from the command line. If you get any errors address them there.

Spectrum
Near real-time spectral overlays in both spectrums is one feature that differentiates sparrow-wifi from other wifi tools. To get spectral overlays, two options are available. The less expensive approach is to use an Ubertooth One; spectrum scanning is one of the features of the hardware, with 1 MHz channel resolution. The downside is that the Ubertooth is focused only on the 2.4 GHz spectrum (after all, that's where bluetooth functions), so you won't be able to scan the 5 GHz range. If you have more experience and/or hardware focused on software-defined radio (SDR) and have a HackRF One available, while a little more expensive an option, you can scan both the 2.4 GHz (with 0.5 MHz resolution) and 5 GHz (with 2 MHz resolution) spectrum ranges. The next 2 sections provide some details unique to each hardware device. In general, the goal for sparrow-wifi was a frame rate of about 10 fps locally and 5 fps via the remote agent (depending on remote hardware and network connectivity).
The following screenshot shows a 2.4 GHz perspective with an Ubertooth spectrum (with 1 MHz bins) overlay. It's quite interesting to watch the spectrum when bluetooth devices are also active. You can observe the bluetooth channel hopping in the spectrum. There are other protocols such as zigbee and other IoT protocols, even cordless phones that may also show up in the 2.4 GHz spectrum that would not otherwise show up on a wifi-only view. Having the spectral overlay provides an invaluable perspective on other interference in the spectrum for troubleshooting say for instance if no overlapping wireless channels seem to be the source of poor connectivity.

Ubertooth One
Once you get an Ubertooth One, the first thing you should do is download and build the latest tools and flash it with the latest firmware version. With that in place, try running ubertooth-specan-ui for a nice quick graphical spectrum display. If this is working, the Ubertooth should work fine in sparrow-wifi (just close any running Ubertooth tools before attempting to display the spectrum). Sparrow-wifi will automatically detect that the Ubertooth is present and the tools are available on startup and enable the appropriate menu choices. Note that if you start sparrow-wifi without the Ubertooth connected, just close sparrow-wifi and reopen it and it should see it. You can manually test it with lsusb to see that the Ubertooth is present.

HackRF One
HackRF support has been added to take advantage of the hackrf_sweep capabilities added to the HackRF firmware. With a HackRF you can sweep the entire range for a view of the spectrum. While hackrf_sweep can sweep from 2.4 GHz through 5 GHz, the frame rate is too slow (like 1 frame every 2 seconds), so you can use it for only one band at a time. With that said, if you have both an Ubertooth and a HackRF, you could use the Ubertooth to display the 2.4 GHz band and the HackRF to display the 5 GHz band simultaneously.
IMPORTANT: Standard RF and antenna rules apply. If you want to monitor either band, make sure you have an antenna capable of receiving in that band (the standard telescoping HackRF antenna probably won't work as it's only rated up to 1 GHz). And if you do want to grab an external dual-band antenna used on wireless cards, just note that the connector polarity is typically reversed (rp-sma rather than the sma connector on the HackRF) so you'll need to grab an adapter to connect it to the HackRF (they're only a couple dollars on Amazon). An RP-SMA antenna will screw on to the SMA connector but the center pin isn't there so you won't actually receive anything. Just a word of caution.
Notes: The 5 GHz spectrum, even with a dual-band antenna can be difficult to see signals in the same way as in 2.4 GHz. The SNR for 5 GHz seems much lower than 2.4 GHz. Some of this could be attributed to the HackRF as 5 GHz is getting towards the edge of its useable frequency range, while part of it can also be attributed to 5 GHz not penetrating walls, ceilings, etc. as well as 2.4 GHz. Sometimes the 5 GHz band shows better in a waterfall plot to distinguish an active signal, but if that's what you need try the tool qspectrumanalyzer.
Troubleshooting tips:
  • If you don't see any spectrum at all try running hackrf_sweep from the command-line. If you get any errors, address them there.

GPS
Sparrow-wifi relies on gpsd to provide standard GPS communications. During testing there were a number of GPS-related issues worth being aware of. First in terms of GPS receivers, make sure you get one that works with gpsd. I've tested it with a GlobalSAT ND-105C Micro USB receiver. I've also used a GPS app on an android device to provide GPS over bluetooth (although this takes some tinkering, and would preclude using the bluetooth adapter for scanning while using it for GPS).
So the first important note is on the GPS receiver side. If you are planning on using the GPS receiver indoors, you may need to make sure the GPS you get specifically states it will work indoors. Anyone with a Garmin or other outdoor sports GPS system may be aware that they tend to not synchronize with satellites well while indoors. The stock GPS on the Solo 3DR drone is the same way as is the GlobalSAT receiver. When they're close to windows, etc. they may finally sync up after some time, but reception indoors isn't great and if you're in an office building or other metal/concrete structure, the receiver may have a tough time receiving the satellite signals. So keep this in mind when picking a GPS receiver.
In terms of getting the receiver to work with gpsd, there were some challenges that were encountered getting it to work. First, the easiest way to test the gps is to stop the gpsd service (service gpsd stop), and run gpsd from the command-line with debugging enabled. If you have a USB-based GPS you should see a device that looks like /dev/ttyUSB0 show up when it is connected. If that's the case, a command similar to this would start gpsd in the foreground for a quick test:
gpsd -D 2 -N /dev/ttyUSB0
If you see good data, you can daemonize it by just removing the -N parameter. On Ubuntu, editing /etc/default/gpsd and specifically putting /dev/ttyUSB0 in the device parameter and restarting the service worked fine. However on Kali linux and the Raspberry Pi, the same process didn't work as if the gpsd service was ignoring the parameter. In those cases, the GPS service was set to not auto-start and the gpsd daemon was started manually from the command-line with the command 'gpsd /dev/ttyUSB0'.
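A condensed sketch of that sequence (assuming a USB receiver that shows up as /dev/ttyUSB0):
# stop the system service and run gpsd in the foreground with debugging
sudo service gpsd stop
sudo gpsd -D 2 -N /dev/ttyUSB0

# once the data looks good, daemonize it by dropping -N
sudo gpsd /dev/ttyUSB0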
Once the daemon is up and working, xgps is a tool that's part of the gpsd-clients package that provides a really nice GUI to check GPS and satellite status. If you run xgps it will tell you when the receiver is synchronized and give you a number of other parameters to make sure it is working correctly. If everything looks like it's working with xgps, then sparrow-wifi should be able to pull the data as any other gpsd client would.

Running sparrow-wifi remote agent
Because the agent has the same requirements as the GUI in terms of system access, you will need to run the agent as root as well. Simply run:
sudo ./sparrowwifiagent.py
By default it will listen on port 8020. There are a number of options that can be seen with --help, and a local configuration file can also be used.
An alternate port can also be specified with:
sudo ./sparrowwifiagent.py --port=&lt;myport&gt;
There are a number of options including IP connection restrictions and record-local-on-start. Here's the --help parameter list at this time:
usage: sparrowwifiagent.py [-h] [--port PORT] [--allowedips ALLOWEDIPS]
[--mavlinkgps MAVLINKGPS] [--sendannounce]
[--userpileds] [--recordinterface RECORDINTERFACE]
[--ignorecfg] [--cfgfile CFGFILE]
[--delaystart DELAYSTART]

Sparrow-wifi agent

optional arguments:
-h, --help show this help message and exit
--port PORT Port for HTTP server to listen on
--allowedips ALLOWEDIPS
IP addresses allowed to connect to this agent. Default
is any. This can be a comma-separated list for
multiple IP addresses
--mavlinkgps MAVLINKGPS
Use Mavlink (drone) for GPS. Options are: '3dr' for a
Solo, 'sitl' for local simulator, or full connection
string ('udp/tcp:<ip>:<port>' such as:
'udp:10.1.1.10:14550')
--sendannounce Send a UDP broadcast packet on the specified port to
announce presence
--userpileds Use RPi LEDs to signal state. Red=GPS
[off=None,blinking=Unsynchronized,solid=synchronized],
Green=Agent Running [On=Running, blinking=servicing
HTTP request]
--recordinterface RECORDINTERFACE
Automatically start recording locally with the given
wireless interface (headless mode) in a recordings
directory
--ignorecfg Don't load any config files (useful for overriding
and/or testing)
--cfgfile CFGFILE Use the specified config file rather than the default
sparrowwifiagent.cfg file
--delaystart DELAYSTART
Wait <delaystart> seconds before initializing

Drone / Rover Operations
Being able to "war fly" (the drone equivilent of "wardriving" popular in the wifi world) was another goal of the project. As a result, being able to have a lightweight agent that could be run on a small platform such as a Raspberry Pi that could be mounted on a drone was incorporated into the design requirements. The agent has been flown successfully on a Solo 3DR drone (keeping the overall weight under the 350 g payload weight).
The Solo was a perfect choice for the project because the controller acts as a wifi access point and communicates with the drone over a traditional IP network using the mavlink protocol. This allows other devices such as laptops, tablets, and the Raspberry Pi to simply join the controller wifi network and have IP connectivity. This was important for field operations as it kept the operational complexity down.
Because these drones have onboard GPS as part of their basic functionality, it's possible over mavlink (with the help of dronekit) to pull GPS coordinates directly from the drone's GPS. This helps keep the overall payload weight down as an additional GPS receiver does not need to be flown as part of the payload. Also, in order to keep the number of tasks required by the drone operator to a minimum during flight, the agent can be started, wait for the drone GPS to be synchronized, use the Raspberry Pi lights to signal operational readiness, and automatically start recording wifi networks to a local file. The GUI then provides an interface to retrieve those remotely saved files and pull back for visualization.
This scenario has been tested with a Cisco AE1000 dual-band adapter connected to the Pi. Note though that I ran into an issue scanning 5 GHz from the Pi that I finally found the solution for. With a dual-band adapter, if you don't disable the internal Pi wireless adapter you won't get any 5 GHz results (this is a known issue). What you'll need to do is disable the onboard wifi by editing /boot/config.txt and adding the following line then reboot 'dtoverlay=pi3-disable-wifi'. Now you'll be able to scan both bands from the Pi.
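A short sketch of that change (run on the Pi itself; a reboot is required for it to take effect):
# disable the onboard wifi so a dual-band USB adapter can see the 5 GHz band
echo 'dtoverlay=pi3-disable-wifi' | sudo tee -a /boot/config.txt
sudo reboot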
The quickest way to start the agent on a Raspberry Pi (IMPORTANT: see the Raspberry Pi section first; if you're running Raspbian Squeeze, you'll need to build Python 3.5 first (Stretch already has 3.5) to run the agent since the subprocess commands used were initially removed from python3 then put back in 3.5) and pull GPS from a Solo drone is to start it with the following command on the Pi:
sudo python3.5 ./sparrowwifiagent.py --userpileds --sendannounce --mavlinkgps 3dr
The Raspberry Pi red and green LED's will then be used as visual indicators transitioning through the following states:
  1. Both lights off - Initializing
  2. Red LED Heartbeat - Connected to the drone (dronekit vehicle connect was successful)
  3. Red LED Solid - Connected and GPS synchronized and operational (the drone can take a couple of minutes for the GPS to settle as part of its basic flight initialization)
  4. Green LED Solid - Agent HTTP server is up and the agent is operational and ready to serve requests
Note: Without the mavlink setting, if using a local GPS module, the red LED will transition through the same heartbeat=GPS present but unsynchronized, solid = GPS synchronized states.
If you don't have a second set of hands while flying your drone and want to fly the Pi without having to worry about the agent, you can start the agent in auto-record mode. There are a few scripts in the scripts directory that start with 'rpi' that can be scheduled for monitoring the agent and starting it as appropriate. The overall intention is a headless configuration where the Pi starts up (you'll need to configure the wifi on the Pi ahead of time to automatically connect to the controller wifi network), the agent will be started and automatically go into wifi record mode using the drone's gps for recording. Once you're done the sparrow-wifi agent menu gives you a screen to manage the files in the recordings directory on the agent and download or delete the files there. These scripts in the scripts directory are just samples. It is highly recommended that you customize them and the Pi integration to meet your specific needs, and by all means keep safety (and federal regulations) in mind when doing anything with a drone as you're responsible for both.
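As a rough sketch of that idea (the install path and the wlan1 interface are assumptions; the bundled rpi* scripts are the reference and should be preferred), a cron-style watchdog could look like:
#!/bin/sh
# restart the agent in headless auto-record mode if it is not already running
if ! pgrep -f sparrowwifiagent.py >/dev/null; then
    sudo python3.5 /home/pi/sparrow-wifi/sparrowwifiagent.py \
        --userpileds --sendannounce --mavlinkgps 3dr --recordinterface wlan1 &
fi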

Raspberry Pi Notes

Raspbian Stretch
Raspbian Stretch now includes the correct version of Python, so no more custom python builds. The only thing that has to be custom handled is that PyQTChart is not in the apt repository or available via pip to build on raspbian. However, thanks to folks over at this thread: https://github.com/mu-editor/mu/issues/441, I've been able to reproduce their pyqtchart build process on Raspbian Stretch. So to make everyone's life easier, there's now a script included with the project called rpi.setup_prerequisites.sh. Sudo that script first, then Sparrow "should" work for you. I tested it on a Pi 3B board with the 7" touchscreen and it works great.

Raspbian Jessie
You can run the remote agent on a Raspberry Pi, however the installation requirements are a bit different. First, Python3 did not include some of the subprocess module capabilities in the initial 3.x versions prior to 3.5. However they did put them back in from 3.5 forward. In terms of Raspbian builds, Raspbian Squeeze only has Python 3.4.x in the repository. So the first step will be to download and build Python 3.5. However if you're running on Debian Stretch (the latest as of now), you can skip the 3.5 build. The repositories do have Python 3.5.
You can use the following sequence to build python if you need to (you will need to apt-get install libsqlite3-dev prior to building Python since it's built in at compile time now):
sudo apt-get install libsqlite3-dev

cd /tmp
wget https://www.python.org/ftp/python/3.5.5/Python-3.5.5.tgz
tar -zxvf Python-3.5.5.tgz
cd Python-3.5.5
./configure && make -j3 && sudo make install
Once that is done, install the necessary modules into the 3.5 build: sudo pip3.5 install gps3 dronekit manuf python-dateutil
Then you can run the agent directly with commands like this:
/usr/local/bin/python3.5 ./sparrowwifiagent.py

/usr/local/bin/python3.5 ./sparrowwifiagent.py --mavlinkgps=3dr --recordinterface=wlan0
Note that if you forget to specifically start them with 3.5 you will get an exception thrown since a subprocess function will be missing.
Another important note about using dual band USB wireless adapters on the Raspberry Pi (tested on a Pi 3), is that as long as the internal wireless is enabled, Raspbian won't see the 5 GHz band.
Add this line in your /boot/config.txt to disable the internal wireless, then your dual-band USB wireless will be able to see the 5 GHz band:
dtoverlay=pi3-disable-wifi
The red and green LED's are also used on the Raspberry Pi to provide some visual feedback:
  1. Both lights off - Initializing
  2. Red LED Heartbeat - gpsd found but unsynchronized (red light will stay off if gpsd is not installed or not running)
  3. Red LED Solid - gpsd receiver synchronized
  4. Green LED Solid - Agent HTTP server is up and the agent is operational and ready to serve requests


EyeWitness - Tool To Take Screenshots Of Websites, Provide Some Server Header Info, And Identify Default Credentials If Possible


EyeWitness is designed to take screenshots of websites, provide some server header info, and identify default credentials if known.
EyeWitness is designed to run on Kali Linux. It will auto detect the file you give it with the -f flag as either being a text file with URLs on each new line, nmap xml output, or nessus xml output. The --timeout flag is completely optional, and lets you provide the max time to wait when trying to render and screenshot a web page.
A complete usage guide which documents EyeWitness features and its typical use cases is available here - https://www.christophertruncer.com/eyewitness-usage-guide/

Supported Linux Distros:
  • Kali Linux
  • Debian 7+ (at least stable, looking into testing) (Thanks to @themightyshiv)
E-Mail: EyeWitness [@] christophertruncer [dot] com

Setup:
  1. Navigate into the setup directory
  2. Run the setup.sh script

Usage:
./EyeWitness.py -f filename --timeout optionaltimeout --open (Optional)

Examples:
./EyeWitness -f urls.txt --web

./EyeWitness -x urls.xml --timeout 8 --headless

Docker
Now you can execute EyeWitness in a Docker container and avoid installing unnecessary dependencies on your host machine.
Note: execute docker run with the folder path on the host which holds your results (/path/to/results)
Note2: in case you want to scan urls from a file, make sure you put it in the volume folder (if you put urls.txt in /path/to/results, then the argument should be -f /tmp/EyeWitness/urls.txt)

Usage
docker build --build-arg user=$USER --tag eyewitness .
docker run \
--rm \
-it \
-e DISPLAY=$DISPLAY \ # optional flag in order to use vnc protocol
-v /tmp/.X11-unix:/tmp/.X11-unix \ # optional flag in order to use vnc protocol
-v /path/to/results:/tmp/EyeWitness \
eyewitness \
EyeWitness_flags_and_input

Example #1 - headless capturing
docker run \
--rm \
-it \
-v ~/EyeWitness:/tmp/EyeWitness \
eyewitness \
--web \
--single http://www.google.com


Github-Dorks - Collection Of Github Dorks And Helper Tool To Automate The Process Of Checking Dorks


GitHub search is a quite powerful and useful feature and can be used to search for sensitive data in repositories. This is a collection of GitHub dorks that can reveal sensitive personal and/or organizational information such as private keys, credentials, authentication tokens, etc. This list is supposed to be useful for assessing security and performing pen-testing of systems.

GitHub Dork Search Tool
github-dork.py is a simple python tool that can search through your repository or your organization/user repositories. It's not a perfect tool at the moment, but it provides basic functionality to automate the search of your repositories against the dorks specified in a text file.

Installation
This tool uses github3.py to talk with GitHub Search API.
Clone this repository and run:
pip install -r requirements.txt
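For example, an isolated install could look like the sketch below (the repository URL is inferred from the examples further down, and the virtualenv step is purely optional, not a requirement of the tool):
git clone https://github.com/techgaun/github-dorks.git
cd github-dorks
python3 -m venv venv && . venv/bin/activate   # optional: keep dependencies isolated
pip install -r requirements.txt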

Usage
  • GH_USER - Environment variable to specify GitHub user
  • GH_PWD - Environment variable to specify password
  • GH_TOKEN - Environment variable to specify GitHub token
  • GH_URL - Environment variable to specify GitHub Enterprise base URL
Some example usages are listed below:
python github-dork.py -r techgaun/github-dorks                          # search single repo

python github-dork.py -u techgaun # search all repos of user

python github-dork.py -u dev-nepal # search all repos of an organization

GH_USER=techgaun GH_PWD=<mypass> python github-dork.py -u dev-nepal # search as authenticated user

GH_TOKEN=<github_token> python github-dork.py -u dev-nepal # search using auth token

GH_URL=https://github.example.com python github-dork.py -u dev-nepal # search a GitHub Enterprise instance
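If you need to sweep several organizations in one go, a simple wrapper loop works. A minimal sketch, assuming a GH_TOKEN is already exported and an orgs.txt file (a hypothetical name) contains one organization per line:
# Run the dork search against every organization listed in orgs.txt
while read -r org; do
    echo "[*] Searching $org"
    python github-dork.py -u "$org"
done < orgs.txt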

Limitations
  • Authenticated requests get a higher rate limit. But since this tool waits for the API rate limit to be reset (which is usually less than a minute), it can be slightly slow.
  • Output formatting is not great. PRs welcome
  • Handle rate limit and retry. PRs welcome

Contribution
Please consider contributing dorks that can reveal potentially sensitive information on GitHub.

List of Dorks
I am not categorizing the dorks at the moment. Instead, I am just going to list the dorks with a description. Many of the dorks can be modified to make the search more specific or generic. You can see more options here.
Dork | Description
filename:.npmrc _auth | npm registry authentication data
filename:.dockercfg auth | docker registry authentication data
extension:pem private | private keys
extension:ppk private | puttygen private keys
filename:id_rsa or filename:id_dsa | private ssh keys
extension:sql mysql dump | mysql dump
extension:sql mysql dump password | mysql dump look for password; you can try varieties
filename:credentials aws_access_key_id | might return false negatives with dummy values
filename:.s3cfg | might return false negatives with dummy values
filename:wp-config.php | wordpress config files
filename:.htpasswd | htpasswd files
filename:.env DB_USERNAME NOT homestead | laravel .env (CI, various ruby based frameworks too)
filename:.env MAIL_HOST=smtp.gmail.com | gmail smtp configuration (try different smtp services too)
filename:.git-credentials | git credentials store, add NOT username for more valid results
PT_TOKEN language:bash | pivotaltracker tokens
filename:.bashrc password | search for passwords, etc. in .bashrc (try with .bash_profile too)
filename:.bashrc mailchimp | variation of above (try more variations)
filename:.bash_profile aws | aws access and secret keys
rds.amazonaws.com password | Amazon RDS possible credentials
extension:json api.forecast.io | try variations, find api keys/secrets
extension:json mongolab.com | mongolab credentials in json configs
extension:yaml mongolab.com | mongolab credentials in yaml configs (try with yml)
jsforce extension:js conn.login | possible salesforce credentials in nodejs projects
SF_USERNAME salesforce | possible salesforce credentials
filename:.tugboat NOT _tugboat | Digital Ocean tugboat config
HEROKU_API_KEY language:shell | Heroku api keys
HEROKU_API_KEY language:json | Heroku api keys in json files
filename:.netrc password | netrc that possibly holds sensitive credentials
filename:_netrc password | netrc that possibly holds sensitive credentials
filename:hub oauth_token | hub config that stores github tokens
filename:robomongo.json | mongodb credentials file used by robomongo
filename:filezilla.xml Pass | filezilla config file with possible user/pass to ftp
filename:recentservers.xml Pass | filezilla config file with possible user/pass to ftp
filename:config.json auths | docker registry authentication data
filename:idea14.key | IntelliJ Idea 14 key, try variations for other versions
filename:config irc_pass | possible IRC config
filename:connections.xml | possible db connections configuration, try variations to be specific
filename:express.conf path:.openshift | openshift config, only email and server though
filename:.pgpass | PostgreSQL file which can contain passwords
filename:proftpdpasswd | Usernames and passwords of proftpd created by cpanel
filename:ventrilo_srv.ini | Ventrilo configuration
[WFClient] Password= extension:ica | WinFrame-Client info needed by users to connect to Citrix Application Servers
filename:server.cfg rcon password | Counter Strike RCON passwords
JEKYLL_GITHUB_TOKEN | Github tokens used for jekyll
filename:.bash_history | Bash history file
filename:.cshrc | RC file for csh shell
filename:.history | history file (often used by many tools)
filename:.sh_history | korn shell history
filename:sshd_config | OpenSSH server config
filename:dhcpd.conf | DHCP service config
filename:prod.exs NOT prod.secret.exs | Phoenix prod configuration file
filename:prod.secret.exs | Phoenix prod secret
filename:configuration.php JConfig password | Joomla configuration file
filename:config.php dbpasswd | PHP application database password (e.g., phpBB forum software)
path:sites databases password | Drupal website database credentials
shodan_api_key language:python | Shodan API keys (try other languages too)
filename:shadow path:etc | Contains encrypted passwords and account information of new unix systems
filename:passwd path:etc | Contains user account information including encrypted passwords of traditional unix systems
extension:avastlic "support.avast.com" | Contains license keys for Avast! Antivirus
filename:dbeaver-data-sources.xml | DBeaver config containing MySQL credentials
filename:.esmtprc password | esmtp configuration
extension:json googleusercontent client_secret | OAuth credentials for accessing Google APIs
HOMEBREW_GITHUB_API_TOKEN language:shell | Github token usually set by homebrew users
xoxp OR xoxb | Slack bot and private tokens
.mlab.com password | MLAB hosted MongoDB credentials
filename:logins.json | Firefox saved password collection (key3.db usually in same repo)
filename:CCCam.cfg | CCCam Server config file
msg nickserv identify filename:config | Possible IRC login passwords
filename:settings.py SECRET_KEY | Django secret keys (usually allows for session hijacking, RCE, etc)
filename:secrets.yml password | Usernames/passwords, Rails applications
filename:master.key path:config | Rails master key (used for decrypting credentials.yml.enc for Rails 5.2+)
filename:deployment-config.json | Created by sftp-deployment for Atom, contains server details and credentials
filename:.ftpconfig | Created by remote-ssh for Atom, contains SFTP/SSH server details and credentials
filename:.remote-sync.json | Created by remote-sync for Atom, contains FTP and/or SCP/SFTP/SSH server details and credentials
filename:sftp.json path:.vscode | Created by vscode-sftp for VSCode, contains SFTP/SSH server details and credentials
filename:sftp-config.json | Created by SFTP for Sublime Text, contains FTP/FTPS or SFTP/SSH server details and credentials
filename:WebServers.xml | Created by Jetbrains IDEs, contains webserver credentials with encoded passwords (not encrypted!)
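Under the hood these dorks are just GitHub code-search queries, so any of them can also be tried directly against the search API. A minimal sketch with curl using the first dork from the table (the organization name is a placeholder, and an authenticated token is required for code search):
# Search an organization's code for npm registry authentication data
curl -s -H "Authorization: token $GH_TOKEN" \
    "https://api.github.com/search/code?q=filename:.npmrc+_auth+org:example-org"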

