This repository has been archived by the owner on Jun 26, 2020. It is now read-only.

Merge "Use sphinx autodoc to generate docs from docstring"
Jenkins authored and openstack-gerrit committed Dec 16, 2015
2 parents 092e871 + 222c080 commit 353634f
Showing 93 changed files with 1,774 additions and 1,699 deletions.
4 changes: 2 additions & 2 deletions .gitignore
Original file line number Diff line number Diff line change
@@ -10,8 +10,8 @@ venv*
 build/*
 cover/*
 .coverage
-docs/build/*
+doc/build/*
 ChangeLog
-docs/source/api
+doc/source/api
 .*.sw?
 AUTHORS
37 changes: 37 additions & 0 deletions bandit/plugins/app_debug.py
@@ -14,6 +14,43 @@
# License for the specific language governing permissions and limitations
# under the License.

r"""
Description
-----------
Running Flask applications in debug mode results in the Werkzeug debugger
being enabled. This includes a feature that allows arbitrary code execution.
Documentation for both Flask [1]_ and Werkzeug [2]_ strongly suggests that
debug mode should never be enabled on production systems.
Operating a production server with debug mode enabled was the probable cause
of the Patreon breach in 2015 [3]_.

Config Options
--------------
None

Sample Output
-------------
.. code-block:: none

    >> Issue: A Flask app appears to be run with debug=True, which exposes
       the Werkzeug debugger and allows the execution of arbitrary code.
       Severity: High   Confidence: High
       Location: examples/flask_debug.py:10
    9  #bad
    10 app.run(debug=True)
    11

References
----------
.. [1] http://flask.pocoo.org/docs/0.10/quickstart/#debug-mode
.. [2] http://werkzeug.pocoo.org/docs/0.10/debug/
.. [3] http://labs.detectify.com/post/130332638391/how-patreon-got-hacked-publicly-exposed-werkzeug  # noqa

.. versionadded:: 0.15.0
"""

import bandit
from bandit.core.test_properties import checks

38 changes: 38 additions & 0 deletions bandit/plugins/asserts.py
@@ -14,6 +14,44 @@
# License for the specific language governing permissions and limitations
# under the License.

r"""
Description
-----------
This plugin test checks for the use of the Python ``assert`` keyword. It was
discovered that some projects used assert to enforce interface constraints.
However, assert is removed when compiling to optimised byte code (python -O
producing \*.pyo files). This caused various protections to be removed. The
use of assert is also considered bad practice in OpenStack codebases.
Please see
https://docs.python.org/2/reference/simple_stmts.html#the-assert-statement for
more information on ``assert``.

Config Options
--------------
None

Sample Output
-------------
.. code-block:: none

    >> Issue: Use of assert detected. The enclosed code will be removed when
       compiling to optimised byte code.
       Severity: Low   Confidence: High
       Location: ./examples/assert.py:1
    1 assert logged_in
    2 display_assets()

References
----------
- https://bugs.launchpad.net/juniperopenstack/+bug/1456193
- https://bugs.launchpad.net/heat/+bug/1397883
- https://docs.python.org/2/reference/simple_stmts.html#the-assert-statement

.. versionadded:: 0.11.0
"""

import bandit
from bandit.core.test_properties import *

70 changes: 70 additions & 0 deletions bandit/plugins/blacklist_calls.py
@@ -14,6 +14,76 @@
# License for the specific language governing permissions and limitations
# under the License.

r"""
Description
-----------
A number of Python methods and functions are known to have potential security
implications. The blacklist calls plugin test is designed to detect the use of
these methods by scanning code for method calls and checking for their presence
in a configurable blacklist. The scanned calls are fully qualified and
de-aliased prior to checking. To illustrate this, imagine a check for
"evil.thing()" running on the following example code:
.. code-block:: python
import evil as good
good.thing()
thing()
This would generate a warning about calling `evil.thing()` despite the module
being aliased as `good`. It would also not generate a warning on the call to
`thing()` in the local module, as it's fully qualified name will not match.
Each of the provided blacklisted calls can be grouped such that they generate
appropriate warnings (message, severity) and a token `{func}` may be used
in the provided output message, to be replaced with the actual method name.
Due to the nature of the test, confidence is always reported as HIGH
Config Options
--------------
.. code-block:: yaml
blacklist_calls:
bad_name_sets:
- pickle:
qualnames:
- pickle.loads
- pickle.load
- pickle.Unpickler
- cPickle.loads
- cPickle.load
- cPickle.Unpickler
message: >
Pickle library appears to be in use, possible security
issue.
- marshal:
qualnames: [marshal.load, marshal.loads]
message: >
Deserialization with the {func} is possibly dangerous.
level: LOW
Sample Output
-------------
.. code-block:: none
>> Issue: Pickle library appears to be in use, possible security issue.
Severity: Medium Confidence: High
Location: ./examples/pickle_deserialize.py:20
19 serialized = cPickle.dumps({(): []})
20 print(cPickle.loads(serialized))
21
References
----------
- https://security.openstack.org
.. versionadded:: 0.9.0
"""

import fnmatch

import bandit
130 changes: 129 additions & 1 deletion bandit/plugins/blacklist_imports.py
@@ -14,14 +14,96 @@
# License for the specific language governing permissions and limitations
# under the License.


import bandit
from bandit.core.test_properties import *


@takes_config
@checks('Import', 'ImportFrom')
def blacklist_imports(context, config):
"""blacklist_imports
A number of Python modules are known to provide collections of
functionality with potential security implications. The blacklist imports
plugin test is designed to detect the use of these modules by scanning code
for `import` statements and checking for the imported modules presence in a
configurable blacklist. The imported modules are fully qualified and
de-aliased prior to checking. To illustrate this, imagine a check for
"module.evil" running on the following example code:
.. code-block:: python
import module # no warning
import module.evil # warning
from module import evil # warning
from module import evil as good # warning
This would generate a warning about importing `module.evil` in each of the
last three cases, despite the module being aliased as `good` in one of
them. It would also not generate a warning on the first import
(of `module`) as it's fully qualified name will not match.
Each of the provided blacklisted modules can be grouped such that they
generate appropriate warnings (message, severity) and a token `{module}`
may be used in the provided output message, to be replaced with the actual
module name.
Due to the nature of the test, confidence is always reported as HIGH
Config Options:
.. code-block:: yaml
blacklist_imports:
bad_import_sets:
- xml_libs:
imports:
- xml.etree.cElementTree
- xml.etree.ElementTree
- xml.sax.expatreader
- xml.sax
- xml.dom.expatbuilder
- xml.dom.minidom
- xml.dom.pulldom
- lxml.etree
- lxml
message: >
Using {module} to parse untrusted XML data is known to
be vulnerable to XML attacks. Replace {module} with the
equivalent defusedxml package.
level: LOW
Sample Output:
.. code-block:: none
>> Issue: Using xml.sax to parse untrusted XML data is known to be
vulnerable to XML attacks. Replace xml.sax with the equivalent
defusedxml package.
Severity: Low Confidence: High
Location: ./examples/xml_sax.py:1
1 import xml.sax
2 from xml import sax
>> Issue: Using xml.sax.parseString to parse untrusted XML data is
known to be vulnerable to XML attacks. Replace xml.sax.parseString with
its defusedxml equivalent function.
Severity: Medium Confidence: High
Location: ./examples/xml_sax.py:21
20 # bad
21 xml.sax.parseString(xmlString, ExampleContentHandler())
22 xml.sax.parse('notaxmlfilethatexists.xml', ExampleContentHandler())
References:
- https://security.openstack.org
.. versionadded:: 0.9.0
"""

checks = _load_checks(config)

# for each check, go through and see if it matches all qualifications
@@ -36,6 +118,52 @@ def blacklist_imports(context, config):
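The import normalisation the docstring describes can be approximated in a few lines of stdlib `ast`. This is a sketch under stated assumptions, not the plugin's code — Bandit receives already-parsed `Import`/`ImportFrom` nodes via its context, and the helper name is hypothetical:

```python
import ast

def blacklisted_imports(source, blacklist):
    """Report imported module names that match a blacklist, resolving
    `from module import evil as good` to `module.evil` (aliases ignored)."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            names = [alias.name for alias in node.names]
        elif isinstance(node, ast.ImportFrom) and node.module:
            # Fully qualify: `from module import evil` -> `module.evil`
            names = [node.module + '.' + alias.name for alias in node.names]
        else:
            continue
        hits.extend(name for name in names if name in blacklist)
    return hits
```

Note that plain `import module` never matches a blacklist entry of `module.evil`, mirroring the first case in the docstring's example.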
@takes_config('blacklist_imports')
@checks('Call')
def blacklist_import_func(context, config):
"""blacklist_import_func
This test is in all ways identical to blacklist_imports. However, it
is designed to catch modules that have been imported using Python's special
builtin import function, `__import__()`. For example, running a test on the
following code for `module.evil` would warn as shown:

.. code-block:: python

    __import__('module')       # no warning
    __import__('module.evil')  # warning

This test shares the configuration provided for the standard
blacklist_imports test.

Sample Output:

.. code-block:: none

    >> Issue: Using xml.sax to parse untrusted XML data is known to be
       vulnerable to XML attacks. Replace xml.sax with the equivalent
       defusedxml package.
       Severity: Low   Confidence: High
       Location: ./examples/xml_sax.py:1
    1 import xml.sax
    2 from xml import sax

    >> Issue: Using xml.sax.parseString to parse untrusted XML data is
       known to be vulnerable to XML attacks. Replace xml.sax.parseString
       with its defusedxml equivalent function.
       Severity: Medium   Confidence: High
       Location: ./examples/xml_sax.py:21
    20 # bad
    21 xml.sax.parseString(xmlString, ExampleContentHandler())
    22 xml.sax.parse('notaxmlfilethatexists.xml', ExampleContentHandler())

References:

- https://security.openstack.org

.. versionadded:: 0.9.0
"""
checks = _load_checks(config)
if context.call_function_name_qual == '__import__':
for check in checks:
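The `__import__()` case can be sketched the same way (hypothetical helper name; only literal string arguments are visible to a static scan, which is all such a check can match):

```python
import ast

def dunder_import_targets(source):
    """Collect literal string arguments passed to the builtin __import__(),
    the import form this plugin inspects."""
    targets = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id == '__import__'
                and node.args
                and isinstance(node.args[0], ast.Constant)):
            targets.append(node.args[0].value)
    return targets
```

Each collected name would then be checked against the same blacklist configuration as blacklist_imports.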
38 changes: 38 additions & 0 deletions bandit/plugins/crypto_request_no_cert_validation.py
@@ -14,6 +14,44 @@
# License for the specific language governing permissions and limitations
# under the License.

r"""
Description
-----------
Encryption in general is typically critical to the security of many
applications. Using TLS can greatly increase security by guaranteeing the
identity of the party you are communicating with. This is accomplished by one
or both parties presenting trusted certificates during the connection
initialization phase of TLS.
When requests methods are used, certificates are validated automatically,
which is the desired behavior. If certificate validation is explicitly turned
off, Bandit will return a HIGH severity error.

Config Options
--------------
None

Sample Output
-------------
.. code-block:: none

    >> Issue: [request_with_no_cert_validation] Requests call with
       verify=False disabling SSL certificate checks, security issue.
       Severity: High   Confidence: High
       Location: examples/requests-ssl-verify-disabled.py:4
    3 requests.get('https://gmail.com', verify=True)
    4 requests.get('https://gmail.com', verify=False)
    5 requests.post('https://gmail.com', verify=True)

References
----------
- https://security.openstack.org/guidelines/dg_move-data-securely.html
- https://security.openstack.org/guidelines/dg_validate-certificates.html

.. versionadded:: 0.9.0
"""

import bandit
from bandit.core.test_properties import *

Expand Down

